Created 10-14-2015 02:47 PM
Has anyone tried manually upgrading to Spark 1.5.1 on Hortonworks Sandbox and faced any issues?
Created 11-16-2015 05:35 PM
I used steps from our blog and it worked:
https://hortonworks.com/hadoop-tutorial/apache-spark-1-5-1-technical-preview-with-hdp-2-3/
Created 10-14-2015 06:29 PM
I did not upgrade the existing Spark on the sandbox, but installed it in a separate location while playing with Zeppelin, and it worked fine. Below is the script I used to set it up (see the readme for the Zeppelin Ambari service for more info):
sudo useradd zeppelin
sudo su zeppelin
cd /home/zeppelin
wget http://d3kbcqa49mib13.cloudfront.net/spark-1.5.0-bin-hadoop2.6.tgz -O spark-1.5.0.tgz
tar -xzvf spark-1.5.0.tgz
export HDP_VER=`hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/'`
echo "spark.driver.extraJavaOptions -Dhdp.version=$HDP_VER" >> spark-1.5.0-bin-hadoop2.6/conf/spark-defaults.conf
echo "spark.yarn.am.extraJavaOptions -Dhdp.version=$HDP_VER" >> spark-1.5.0-bin-hadoop2.6/conf/spark-defaults.conf
exit
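The script derives hdp.version from `hdp-select` output; the sed expression strips the "hadoop-client - " prefix and keeps only the version string. Demonstrated here on a sample line (the version string itself is illustrative):

```shell
# Extract the HDP version from a sample hdp-select status line; the
# back-reference \1 keeps everything after "hadoop-client - ".
echo "hadoop-client - 2.3.2.0-2950" | sed 's/hadoop-client - \(.*\)/\1/'
# → 2.3.2.0-2950
```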
Created 10-15-2015 11:41 AM
Try this. Thanks to @Randy Gelhausen
Created 11-16-2015 06:13 PM
@Guilherme Braccialli how about upgrading an existing Spark install?
Created 11-16-2015 06:46 PM
@Neeraj it should be the same: add the updated HDP repo, yum install, and copy hive-site.xml.
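A rough sketch of those three steps, following the pattern in the technical-preview blog post linked above; the repo URL placeholder and the package name are assumptions, so check the post for the exact values:

```shell
# 1. Add the updated HDP repo (substitute the repo URL given in the
#    technical-preview post for your OS; the .repo file name is arbitrary).
sudo wget -nv <spark-1.5.1-tp-repo-url> -O /etc/yum.repos.d/HDP-TP.repo

# 2. Install the newer Spark packages from that repo ("spark" as the
#    package name is an assumption; list the repo's packages to confirm).
sudo yum install -y spark

# 3. Copy the cluster's Hive config so Spark SQL can reach the metastore.
sudo cp /etc/hive/conf/hive-site.xml /etc/spark/conf/
```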
Created 11-17-2015 07:02 PM
@Saptak Sen Why try Spark 1.5.1 when you can try 1.5.2?
Spark is sort of like a client, so you can easily run multiple versions of Spark on one cluster. Simply download Spark from the Apache website - https://spark.apache.org/downloads.html - and invoke it to run it (you can configure Zeppelin to use it, or just run it via the CLI).
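Side-by-side use comes down to which install `spark-submit` and `spark-shell` resolve from. A minimal sketch, with a hypothetical 1.5.2 install path:

```shell
# Point SPARK_HOME (and PATH) at the unpacked download; the path below
# is a hypothetical example, not a sandbox default.
export SPARK_HOME=/opt/spark-1.5.2-bin-hadoop2.6
export PATH="$SPARK_HOME/bin:$PATH"

# Sanity check that the new install's bin dir is now first on PATH:
echo "$PATH" | grep -q "$SPARK_HOME/bin" && echo "PATH updated"
# → PATH updated
```

Switching back to the stock Spark is just a matter of opening a fresh shell without these exports.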
Created 11-19-2015 12:25 AM
@Andrew Watson I understood Hortonworks is going to support 1.5.1 in December, not 1.5.2; that would be the reason to use 1.5.1 instead of 1.5.2.