Created 04-29-2016 03:55 AM
Currently we are using HDP2.3.4 with Spark1.5.2 and Scala 2.10.
Can we upgrade to Scala 2.11 without upgrading Spark and HDP?
Thanks for your help!
Created 04-29-2016 04:13 AM
No. You will at the very least need to build Spark for Scala 2.11. From the Apache Spark documentation:
To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:
./dev/change-scala-version.sh 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
Spark does not yet support its JDBC component for Scala 2.11.
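After rebuilding, one quick way to confirm which Scala version the resulting Spark binaries actually run against is to check from spark-shell. This is a minimal sketch; the version string shown is only an example and will vary with your build:
$ spark-shell
scala> scala.util.Properties.versionString   // standard library call; reports the Scala version the shell runs on
res0: String = version 2.11.8                // a 2.10.x string here means the build is still on Scala 2.10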
Created 04-29-2016 04:42 AM
Thanks for quick response.
Created 11-23-2016 04:39 PM
How about for HDP 2.5?