Hi
I was running Spark locally in IntelliJ (sending data from NiFi to Spark Streaming via site-to-site). I have now set up a Spark standalone cluster and want to run my application on it. I simply changed the master URL from local[*] to
.setMaster("spark://localhost:7077")
It seems to connect fine, but it then throws a ClassNotFoundException, since the executors do not have the NiFi jars.
One possible way is to build a standalone fat jar and then run the application with the spark-submit script.
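For reference, this is roughly how I would submit it (the jar path and main class below are placeholders for my app):

./bin/spark-submit \
  --class com.example.StreamingApp \
  --master spark://localhost:7077 \
  target/scala-2.11/my-app-assembly.jar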
Is it still possible to run the application from IntelliJ somehow?
Can I set any of the following on SparkConf to make it work? (A rough sketch of what I mean follows this list.)
- SparkConf().setJars
- SparkConf().set with any of the properties below
- spark.driver.extraClassPath
- spark.jars
- spark.jars.packages
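Here is the kind of thing I have in mind (the jar path is a placeholder for my assembled application jar containing the NiFi dependencies):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("nifi-streaming")
  .setMaster("spark://localhost:7077")
  // ship the application jar (with the NiFi classes) to the executors
  .setJars(Seq("target/scala-2.11/my-app-assembly.jar"))
  // or, equivalently, via the plain config property:
  //.set("spark.jars", "target/scala-2.11/my-app-assembly.jar")

val ssc = new StreamingContext(conf, Seconds(10))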
Can I create a fat jar and pass it to spark.driver.extraClassPath?
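I.e., something along these lines (again, the path is just a placeholder):

val conf = new SparkConf()
  .setMaster("spark://localhost:7077")
  // would the executors also pick these classes up, or only the driver?
  .set("spark.driver.extraClassPath", "/path/to/my-app-assembly.jar")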
Thanks