
Sandbox HDP2.5 Activate Spark 2.0.0

Rising Star

I have downloaded Sandbox HDP 2.5 and would like to activate Spark 2.0.0. By default it activated Spark 1.6.2.

Ambari Server 'start' completed successfully.                                                                     
[root@sandbox ~]# spark-shell                                                                                     
SPARK_MAJOR_VERSION is not set, choosing Spark automatically                                                      
16/08/23 21:29:14 INFO SecurityManager: Changing view acls to: root                                               
16/08/23 21:29:14 INFO SecurityManager: Changing modify acls to: root                                             
16/08/23 21:29:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/08/23 21:29:14 INFO HttpServer: Starting HTTP Server                                                           
16/08/23 21:29:14 INFO Server: jetty-8.y.z-SNAPSHOT                                                               
16/08/23 21:29:14 INFO AbstractConnector: Started SocketConnector@0.0.0.0:35616                                   
16/08/23 21:29:14 INFO Utils: Successfully started service 'HTTP class server' on port 35616.                     
Welcome to                                                                                                        
      ____              __                                                                                        
     / __/__  ___ _____/ /__                                                                                      
    _\ \/ _ \/ _ `/ __/  '_/                                                                                      
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2                                                                       
      /_/                                                                                                         

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_101)                                             
Type in expressions to have them evaluated.                                                                       
Type :help for more information.                                                                                  
16/08/23 21:29:18 INFO SparkContext: Running Spark version 1.6.2                                                  
16/08/23 21:29:18 INFO SecurityManager: Changing view acls to: root                                               
16/08/23 21:29:18 INFO SecurityManager: Changing modify acls to: root                                             
16/08/23 21:29:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/08/23 21:29:18 INFO Utils: Successfully started service 'sparkDriver' on port 46658.                           
16/08/23 21:29:18 INFO Slf4jLogger: Slf4jLogger started  
1 ACCEPTED SOLUTION

Super Collaborator

Hi Amit, there is some information in another HCC post; see

https://community.hortonworks.com/articles/53029/how-to-install-and-run-spark-20-on-hdp-25-sandbox.h....

SPARK_MAJOR_VERSION is an environment variable used to specify which version to use for a job when you already have two versions running on the same node.


6 REPLIES

Rising Star

How do we set SPARK_MAJOR_VERSION? In which conf file? Are there any other related conf files to maintain?

Contributor

SPARK_MAJOR_VERSION is an environment variable; you can set it in .bashrc or export it like any other environment variable. By default SPARK_MAJOR_VERSION=1, which means spark-shell picks Spark 1.6.2. If you want to use Spark 2, set SPARK_MAJOR_VERSION=2 before you run spark-shell.
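A minimal sketch of setting the variable, assuming a bash login shell on the sandbox (the .bashrc path and the inline-prefix form are standard shell usage, not something specific to HDP):

```shell
# Pick Spark 2 for the current shell session; the HDP launcher scripts
# (spark-shell, spark-submit, pyspark) read this variable at startup.
export SPARK_MAJOR_VERSION=2
echo "SPARK_MAJOR_VERSION=$SPARK_MAJOR_VERSION"

# To persist the choice across logins, append the export to ~/.bashrc:
#   echo 'export SPARK_MAJOR_VERSION=2' >> ~/.bashrc

# A one-off job can also select a version inline, leaving the session default alone:
#   SPARK_MAJOR_VERSION=2 spark-shell
```

Either way, the variable only affects which launcher scripts are dispatched to; both Spark installations remain on the node.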


Super Collaborator

P.S. In the GA release, installing both versions side by side will be integrated into the Ambari installation wizard.

Rising Star

Hi LGeorge, thank you for the answer and the pointers.


The current HDP 2.5 Technical Sandbox has Spark 1.6.2. When the HDP 2.5 Sandbox GAs (coming soon), it will include both Spark 1.6 and Spark 2.0. If you need a quick way to play with Spark 2.0, check out the Hortonworks Cloud for AWS Technical Preview and its getting-started docs. Stay tuned to the Spark Tutorial page for future Spark cloud tutorials.

FYI: @Robert Hryniewicz @rmolina @Paul Hargis