Support Questions


Use SparkLauncher with Spark2


Hello,

We're testing a programmatic way to launch Spark jobs on a cluster. We use the SparkLauncher class to do so. It needs SPARK_HOME to be set, so we point it at /opt/cloudera/parcels/SPARK2/ in order to use Spark 2.

Unfortunately, the command used to launch the job is hard-coded in the SparkLauncher class as "spark-submit", and the Cloudera distribution of Spark 2 only includes "spark2-xxx" scripts.

Is there any way to use SparkLauncher to launch Spark 2 jobs, or another way to launch jobs programmatically?
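One possible "another way", sketched here as an assumption and not taken from this thread: bypass SparkLauncher entirely and invoke the parcel's spark2-submit script via a subprocess. The parcel path, application jar, and main class below are hypothetical placeholders.

```python
import os
import subprocess

# Hypothetical locations -- adjust for your cluster.
SPARK2_SUBMIT = "/opt/cloudera/parcels/SPARK2/bin/spark2-submit"
APP_JAR = "/path/to/our-app.jar"  # hypothetical application jar

def build_submit_command(main_class, app_jar, *app_args):
    """Build a spark2-submit command line for a YARN cluster-mode job."""
    return [
        SPARK2_SUBMIT,
        "--master", "yarn",
        "--deploy-mode", "cluster",
        "--class", main_class,
        app_jar,
        *app_args,
    ]

cmd = build_submit_command("com.example.OurJob", APP_JAR, "arg1")

# Only actually launch when the script exists on this machine.
if os.path.exists(SPARK2_SUBMIT):
    subprocess.run(cmd, check=True)
```

This gives up SparkLauncher's handle on the child application, so you would have to monitor the job yourself (e.g. via the subprocess exit code or the YARN API).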

1 ACCEPTED SOLUTION


/opt/cloudera/parcels/SPARK2/ should not be used as SPARK_HOME.

The correct path to use is /opt/cloudera/parcels/SPARK2/lib/spark2/.

Our assumption that we had to go through the spark2-xxxx scripts was simply wrong.
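My reading of why the corrected path works (an assumption, not stated in the thread): SparkLauncher resolves its hard-coded launch script as bin/spark-submit relative to SPARK_HOME, and the parcel's lib/spark2/ directory contains an unprefixed spark-submit. A minimal sketch of that resolution:

```python
import os

# Wrong vs. corrected SPARK_HOME from the accepted answer.
WRONG_SPARK_HOME = "/opt/cloudera/parcels/SPARK2/"
CORRECT_SPARK_HOME = "/opt/cloudera/parcels/SPARK2/lib/spark2/"

def resolve_spark_submit(spark_home):
    """Mimic how a launcher would locate the hard-coded submit script."""
    return os.path.join(spark_home, "bin", "spark-submit")

print(resolve_spark_submit(CORRECT_SPARK_HOME))
# -> /opt/cloudera/parcels/SPARK2/lib/spark2/bin/spark-submit
```

With the wrong SPARK_HOME, the same resolution points at /opt/cloudera/parcels/SPARK2/bin/, which only has the spark2-prefixed wrappers.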


2 REPLIES

Champion
The only way I can think of would be to install the Spark2 gateway on a node that doesn't have the Spark1 gateway or any Spark1 roles, then create a spark-submit symlink pointing to spark2-submit.
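The symlink idea above, demonstrated in a throwaway temp directory rather than the real gateway's bin/ (on a real node this would just be an ln -s run with sufficient privileges; the stand-in script here is purely illustrative):

```python
import os
import tempfile

# Use a temp dir instead of the real gateway bin/ directory.
bin_dir = tempfile.mkdtemp()
spark2_submit = os.path.join(bin_dir, "spark2-submit")
spark_submit = os.path.join(bin_dir, "spark-submit")

# Stand-in for the real spark2-submit script.
with open(spark2_submit, "w") as f:
    f.write("#!/bin/sh\nexec echo spark2-submit \"$@\"\n")
os.chmod(spark2_submit, 0o755)

# spark-submit -> spark2-submit, so anything that invokes the
# Spark 1 name (like SparkLauncher) ends up running the Spark 2 script.
os.symlink(spark2_submit, spark_submit)
```

After this, a launcher that shells out to spark-submit from that directory transparently runs spark2-submit instead.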
