
Spark - Execute in the worker URL



How do I submit jobs to a slave (worker) node from the master node in Spark?

I started the nodes successfully, then executed the following commands:

./spark-submit --class "SimpleApp" --master "spark://worker@" /home/ssbatch4/.sbt/0.13/staging/b081b85b0b35b548b3ca/simpleproj/target/scala-2.10/simple-project_2.10-1.0.jar

./spark-submit --class "SimpleApp" --master "spark://" /home/ssbatch4/.sbt/0.13/staging/b081b85b0b35b548b3ca/simpleproj/target/scala-2.10/simple-project_2.10-1.0.jar

Both commands throw exceptions.

The master URL is shown on the master node, but there is no master URL on the slave node.
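For reference, a sketch of what a working invocation typically looks like in standalone mode: `--master` must point at the Spark *master's* `spark://host:port` URL (never a worker's address, and never an empty `spark://`). The host and port below are placeholders, not values from this thread.

```shell
# Hypothetical sketch: replace master-host:7077 with the URL shown at the
# top of the master web UI (http://master-host:8080 by default).
# The master schedules the job onto the workers; you do not submit to a
# worker directly.
./spark-submit \
  --class "SimpleApp" \
  --master spark://master-host:7077 \
  /home/ssbatch4/.sbt/0.13/staging/b081b85b0b35b548b3ca/simpleproj/target/scala-2.10/simple-project_2.10-1.0.jar
```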


Re: Spark - Execute in the worker URL

What exception are you getting? Can you explain a bit more and share the logs?

Also, can you check the Spark master UI and cross-check your spark://.... URI? Make sure you use the same URI in your spark-submit command.
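One way to find the exact URI, assuming a standalone master with default log locations (the path pattern below may differ on your installation):

```shell
# Hypothetical sketch: the standalone master logs the URI that workers and
# spark-submit must use. Default log directory is $SPARK_HOME/logs.
grep "Starting Spark master at" \
  "$SPARK_HOME"/logs/spark-*-org.apache.spark.deploy.master.Master-*.out
```

The matching log line contains the `spark://host:port` value to pass to `--master`.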
