
Steps to upgrade to Spark 1.5.1 on Sandbox

Rising Star

Has anyone tried manually upgrading to Spark 1.5.1 on Hortonworks Sandbox and faced any issues?

7 REPLIES


I did not upgrade the existing Spark on the sandbox, but I installed it in a separate location while playing with Zeppelin and it worked fine. Below is the script I used to set it up (see the README for the Ambari service for Zeppelin for more info).

# Create a dedicated user and work from its home directory
sudo useradd zeppelin
sudo su zeppelin
cd /home/zeppelin
# Download and unpack a prebuilt Spark 1.5.0 for Hadoop 2.6
wget http://d3kbcqa49mib13.cloudfront.net/spark-1.5.0-bin-hadoop2.6.tgz -O spark-1.5.0.tgz
tar -xzvf spark-1.5.0.tgz
# Pass the HDP version to the driver and the YARN AM so ${hdp.version} resolves in the cluster classpaths
export HDP_VER=`hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/'`
echo "spark.driver.extraJavaOptions -Dhdp.version=$HDP_VER" >> spark-1.5.0-bin-hadoop2.6/conf/spark-defaults.conf
echo "spark.yarn.am.extraJavaOptions -Dhdp.version=$HDP_VER" >> spark-1.5.0-bin-hadoop2.6/conf/spark-defaults.conf
# Return from the zeppelin user's shell
exit
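
To sanity-check the install, here is a minimal sketch of launching that Spark against YARN; the /etc/hadoop/conf path is the standard HDP location and is an assumption, not something stated in the post above:

cd /home/zeppelin/spark-1.5.0-bin-hadoop2.6
# Point the standalone Spark at the cluster's Hadoop/YARN configuration (standard HDP path, assumed)
export HADOOP_CONF_DIR=/etc/hadoop/conf
# yarn-client mode picks up the -Dhdp.version settings written to spark-defaults.conf above
./bin/spark-shell --master yarn-client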

Master Mentor

Try this. Thanks to @Randy Gelhausen.

Master Mentor

@Guilherme Braccialli how about an upgrade of the existing Spark install?


@Neeraj It should be the same: add the updated HDP repo, yum install, and copy hive-site.xml. A rough sketch of those steps follows below.
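
Sketch of those steps, assuming a CentOS sandbox; the repo URL, repo version, and package name below are placeholders, not details from this thread:

# Drop in the updated HDP repo definition (URL and version are placeholders)
sudo wget -nv http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.3.4.0/hdp.repo -O /etc/yum.repos.d/HDP.repo
# Install the newer Spark package from that repo (package name assumed)
sudo yum install -y spark
# Copy hive-site.xml so Spark SQL can see the Hive metastore (standard HDP conf paths, assumed)
sudo cp /etc/hive/conf/hive-site.xml /etc/spark/conf/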


@Saptak Sen Why try Spark 1.5.1 when you can try 1.5.2?

Spark is essentially a client, so you can easily run multiple versions of Spark on one cluster. Simply download Spark from the Apache website - https://spark.apache.org/downloads.html - and run it from wherever you unpack it (you can configure Zeppelin to use it or just run it via the CLI).
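
For example, here is a sketch of running 1.5.2 side by side on the sandbox, reusing the hdp.version workaround from the first reply; the 1.5.2 tarball URL is assumed to follow the same pattern as the 1.5.0 link above:

wget http://d3kbcqa49mib13.cloudfront.net/spark-1.5.2-bin-hadoop2.6.tgz
tar -xzvf spark-1.5.2-bin-hadoop2.6.tgz
cd spark-1.5.2-bin-hadoop2.6
# Same hdp.version workaround as the first reply, applied to this copy only
export HDP_VER=`hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/'`
echo "spark.driver.extraJavaOptions -Dhdp.version=$HDP_VER" >> conf/spark-defaults.conf
echo "spark.yarn.am.extraJavaOptions -Dhdp.version=$HDP_VER" >> conf/spark-defaults.conf
# Run this version directly; the HDP-managed Spark install is untouched
./bin/spark-shell --master yarn-client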


@Andrew Watson My understanding is that Hortonworks is going to support 1.5.1 in December, not 1.5.2; that would be the reason to use 1.5.1 instead of 1.5.2.