I have both Spark 1.6 and Spark 2.2 installed on my cluster through CDH. Normally my Livy server starts with the default Spark 1.6, but now I want to start Livy with Spark 2.2. I figured that changing SPARK_HOME to point to SPARK2 would do the trick:
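Roughly, what I set in Livy's conf/livy-env.sh (the parcel path is from my CDH layout; yours may differ):

```shell
# livy-env.sh -- point Livy at the Spark2 parcel instead of the default Spark 1.6
export SPARK_HOME=/opt/cloudera/parcels/SPARK2
```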
But this led to an error, which is understandable, since the Spark2 launcher in Cloudera is named spark2-submit rather than spark-submit:
Exception in thread "main" java.io.IOException: Cannot run program "/opt/cloudera/parcels/SPARK2/bin/spark-submit": error=2, No such file or directory
Is there any way to configure Livy to find the right spark2-submit instead of the default name? I looked everywhere in the config and the code, but maybe I missed something.
Is there a way of doing this without making Spark2 the default? I searched among the Livy conf parameters but couldn't find anything.