Created on 07-28-2017 06:20 AM - edited 09-16-2022 04:59 AM
Hello,
I have both Spark 1.6 and Spark 2.2 installed in my cluster through CDH. Normally my Livy server starts with the default Spark 1.6, but now I want to start Livy with Spark 2.2. I figured that changing SPARK_HOME to point to SPARK2 might do the trick:
Previously:
export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
export SPARK_CONF_DIR=$SPARK_HOME/conf
New:
export SPARK_HOME=/opt/cloudera/parcels/SPARK2
export SPARK_CONF_DIR=$SPARK_HOME/meta
But this leads to an error, which is understandable since Spark2 in Cloudera uses spark2-submit:
Exception in thread "main" java.io.IOException: Cannot run program "/opt/cloudera/parcels/SPARK2/bin/spark-submit": error=2, No such file or directory
Is there any way to configure Livy to find the right spark2-submit rather than the default name? I looked everywhere in the config and the code, but maybe I missed something.
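For reference, a minimal sketch of what worked for me conceptually: on CDH the Spark2 parcel's actual Spark installation (with a standard bin/spark-submit) lives under lib/spark2 inside the parcel, not at the parcel root, so pointing SPARK_HOME there in livy-env.sh avoids the missing spark-submit error. The exact paths below are assumptions based on a default parcel layout and may differ on your cluster:

```shell
# livy-env.sh — point Livy at the Spark2 parcel's real Spark home
# (paths assume a default CDH parcel layout; adjust for your cluster)
export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2
export SPARK_CONF_DIR=/etc/spark2/conf

# sanity check: the standard spark-submit Livy invokes should exist here
ls "$SPARK_HOME/bin/spark-submit"
```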
Many thanks,
Maziyar
Created 07-28-2017 08:03 AM
Created 07-29-2017 02:56 PM
It worked! I just ran the script on the node that hosts my Livy server.
Thank you 🙂
Created 06-04-2019 10:11 AM
Is there a way of doing this without making Spark2 the default? I searched among the Livy conf parameters, but I couldn't find anything.