Created on 09-13-2017 02:40 AM - edited 09-16-2022 05:14 AM
Hello,
We're testing a programmatic way to launch Spark jobs on a cluster.
We use the SparkLauncher class to do so. It needs SPARK_HOME to be set, so we point it to /opt/cloudera/parcels/SPARK2/ in order to use Spark 2.
Unfortunately, the launch command is hard-coded in the SparkLauncher class as "spark-submit", and the Cloudera distribution of Spark 2 only includes "spark2-xxx" scripts.
Is there any way to use SparkLauncher to launch Spark 2 jobs, or another way to launch jobs programmatically? A minimal sketch of what we're doing is below.
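For concreteness, this is roughly our launcher code. The application jar path, main class, and master/deploy-mode settings are placeholders for illustration, not our real values:

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchSparkJob {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                // This is where it goes wrong for us: SparkLauncher looks for
                // bin/spark-submit under whatever directory we set here.
                .setSparkHome("/opt/cloudera/parcels/SPARK2/")
                .setAppResource("/path/to/our-app.jar")   // placeholder jar
                .setMainClass("com.example.OurApp")       // placeholder class
                .setMaster("yarn")
                .setDeployMode("cluster")
                .startApplication();

        // Poll until the application reaches a terminal state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Final state: " + handle.getState());
    }
}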
Created 10-23-2017 05:41 AM
/opt/cloudera/parcels/SPARK2/ should not be used as SPARK_HOME.
The correct path is /opt/cloudera/parcels/SPARK2/lib/spark2/, which contains the standard bin/spark-submit script.
We were wrong about the spark2-xxxx scripts being the problem; with the correct SPARK_HOME, SparkLauncher finds bin/spark-submit on its own.
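For reference, assuming the same launcher setup as in the question, the only change needed is the SPARK_HOME path:

// With the parcel's real Spark 2 home, SparkLauncher resolves
// bin/spark-submit under it as expected.
SparkLauncher launcher = new SparkLauncher()
        .setSparkHome("/opt/cloudera/parcels/SPARK2/lib/spark2/");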