We're testing a programmatic way to launch Spark jobs on a cluster.
We use the SparkLauncher class to do so. It needs SPARK_HOME to be set, so we point it to /opt/cloudera/parcels/SPARK2/ in order to use Spark 2.
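Here is a minimal sketch of what we are doing (the jar path, main class, master, and deploy mode are placeholders, not our real values):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchExample {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                // Point the launcher at the Spark 2 parcel
                .setSparkHome("/opt/cloudera/parcels/SPARK2/")
                // Placeholder application jar and main class
                .setAppResource("/path/to/our-app.jar")
                .setMainClass("com.example.OurSparkJob")
                .setMaster("yarn")
                .setDeployMode("cluster")
                .launch();
        spark.waitFor();
    }
}
```

When `launch()` runs, the launcher shells out to a `spark-submit` script under SPARK_HOME, which is where it fails for us.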
Unfortunately, the command used to launch the job is hard-coded in the SparkLauncher class as "spark-submit", while the Cloudera distribution of Spark 2 only includes "spark2-xxx" scripts.
Is there any way to use SparkLauncher to launch Spark 2 jobs, or another way to launch jobs programmatically?