Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Jar files using YARN in Spark

New Member

Hi,

I have a .jar file in one location (local or on the cluster) and I want to run it using spark-shell in YARN mode. Can anyone help me with this?

Thanks

1 ACCEPTED SOLUTION


@Harshil Gala

spark-shell --jars <path-to-jar> --master yarn --deploy-mode client

(Note: spark-shell itself only supports client deploy mode; for cluster deploy mode, package your code and submit it with spark-submit instead.)
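For example, a minimal sketch of launching the shell on YARN with a jar attached (the jar path and name below are hypothetical; substitute your own):

```shell
# Launch spark-shell against YARN in client deploy mode.
# --jars ships the listed jar(s) to the driver and executor classpaths,
# so classes from mylib.jar can be used directly inside the shell.
spark-shell --jars /app/spark/mylib.jar --master yarn --deploy-mode client
```

Multiple jars can be passed as a comma-separated list to --jars, as shown in the reply below.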


4 REPLIES

Rising Star

Try a command like this:

spark-shell --jars /app/spark/a.jar,/app/spark/b.jar

New Member

Thanks Peter Kim,

Correct me if my understanding is wrong:

spark-shell --jars <location-of-jar-files> --master yarn

Is this command correct? Do I need to add anything else along with --master yarn to run on YARN?

Rising Star

If you're using a Hadoop cluster managed by Ambari (Hortonworks HDP), you don't have to pass the --master yarn parameter, because Spark on an HDP cluster is configured to run in YARN mode by default.
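One way to check this (a sketch; /etc/spark/conf is the usual HDP client-config location, but the path may differ on your cluster) is to look at spark.master in spark-defaults.conf:

```shell
# Ambari typically writes the Spark client configuration here.
# If spark.master is already set to yarn (or yarn-client), the shell
# defaults to YARN and --master can be omitted.
grep spark.master /etc/spark/conf/spark-defaults.conf
```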
