Jar files using YARN in Spark

Hi,

I have a .jar file in one location (either local or on the cluster) and I want to run it using spark-shell in YARN mode. Can anyone help me with this?

Thanks

1 ACCEPTED SOLUTION


@Harshil Gala

spark-shell --jars <path-to-jar> --master yarn --deploy-mode client/cluster (choose client or cluster accordingly)
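
For example, assuming a hypothetical jar at /tmp/my-udfs.jar containing a class com.example.MyUdfs, the launch and a quick check from inside the shell could look like the sketch below. (In practice the interactive shell runs in client deploy mode; cluster deploy mode is used when submitting packaged applications with spark-submit.)

spark-shell --jars /tmp/my-udfs.jar --master yarn --deploy-mode client

scala> import com.example.MyUdfs   // hypothetical class from the jar; fails if the jar was not picked up
scala> sc.master                   // should report the YARN master, e.g. "yarn" or "yarn-client"
scala> sc.jars                     // should list the jar(s) passed via --jars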


4 REPLIES


Try a command like this:

spark-shell --jars /app/spark/a.jar,/app/spark/b.jar
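
As a quick sanity check (assuming the example paths above), you can confirm from inside the shell that the jars were registered, for instance:

scala> sc.jars                                  // should include /app/spark/a.jar and /app/spark/b.jar
scala> Class.forName("com.example.SomeClass")   // hypothetical class name from one of the jars; throws ClassNotFoundException if it is not on the classpath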


Thanks, Peter Kim.

Correct me if my understanding is wrong:

spark-shell --jars <location-of-jar-files> --master yarn

Is this command correct? Do I need to add anything else along with --master yarn to run on YARN?


If you're using a Hadoop cluster managed by Ambari (Hortonworks HDP), you don't have to pass the --master yarn parameter, because the Spark service on an HDP cluster is configured to run in YARN mode by default.
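
If you want to verify which master the shell actually picked up from the cluster defaults (on HDP these typically come from spark-defaults.conf in the Spark configuration directory), you can check from inside the shell, for example:

scala> sc.master   // expected to report the YARN master, e.g. "yarn" or "yarn-client" depending on the Spark version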
