Spark - Execute in the worker URL


Contributor

How do I submit jobs to the slave node (worker node) from the master node in Spark?

I started the nodes successfully and executed the following commands:

./spark-submit --class "SimpleApp" --master "spark://worker@192.168.117.141:35941" /home/ssbatch4/.sbt/0.13/staging/b081b85b0b35b548b3ca/simpleproj/target/scala-2.10/simple-project_2.10-1.0.jar

./spark-submit --class "SimpleApp" --master "spark://sridhar25.sridhar.com:35941" /home/ssbatch4/.sbt/0.13/staging/b081b85b0b35b548b3ca/simpleproj/target/scala-2.10/simple-project_2.10-1.0.jar

Both commands fail with exceptions.

The master URL is shown on the master node, but there is no master URL on the slave node.

1 REPLY

Re: Spark - Execute in the worker URL

What exception are you getting? Can you please explain a bit more and share the logs?

Also, can you check the Spark master web UI and cross-check your spark://.... URI? Make sure you use that same URI in your spark-submit command.
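
For reference, here is a rough sketch of how a standalone cluster is usually wired up and how the master URL is reused in spark-submit. The host name, port, and jar path below are placeholders, not values from the original post; the real value to pass to --master is the spark://... address printed at the top of the master's web UI (http://<master-host>:8080), which by default listens on port 7077.

# On the master node: start the master and note the URL it reports, e.g. spark://master-host:7077
./sbin/start-master.sh

# On each slave (worker) node: register the worker against that same master URL
./sbin/start-slave.sh spark://master-host:7077

# Submit the job with --master pointing at the master's URL (not at a worker's address)
./bin/spark-submit --class "SimpleApp" --master spark://master-host:7077 /path/to/simple-project_2.10-1.0.jar

The key point is that spark-submit always talks to the master; the master then schedules the work onto the registered workers, so you never point --master at a worker node.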
