Created 06-29-2017 04:42 AM
Hi,
I have a .jar file in a location (either local or on the cluster) and I want to run it using spark-shell in YARN mode. Can anyone help me with this?
Thanks
Created 06-29-2017 07:05 PM
spark-shell --jars <path-to-jar> --master yarn --deploy-mode client (note: spark-shell only supports client deploy mode; cluster deploy mode applies to spark-submit)
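For example, here is a minimal sketch assuming a hypothetical jar at /app/spark/mylib.jar that contains a class com.example.MyUtil (both names are placeholders, not from this thread):

spark-shell --jars /app/spark/mylib.jar --master yarn --deploy-mode client

Once the shell starts, the jar is on the driver and executor classpaths, so its classes can be used directly:

scala> import com.example.MyUtil          // hypothetical class shipped in mylib.jar
scala> val rdd = sc.parallelize(1 to 10)  // small test RDD
scala> MyUtil.process(rdd)                // hypothetical method, just to show the jar is usable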
Created 06-29-2017 08:21 AM
Try a command like this:
spark-shell --jars /app/spark/a.jar,/app/spark/b.jar
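Note that multiple jars go in a single comma-separated list after --jars (no spaces). A quick way to confirm they were picked up, assuming Spark 2.x, is to list the registered jars from inside the shell:

scala> sc.listJars().foreach(println)   // should print the URIs of a.jar and b.jar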
Created 06-29-2017 09:37 AM
Thanks Peter Kim,
Correct me if my understanding is wrong:
spark-shell --jars <location-of-jar-files> --master yarn
Is this command correct? Do I need to add anything else along with --master yarn to run it on YARN?
Created 06-30-2017 06:41 AM
If you're using a Hadoop cluster managed with Hortonworks Ambari (HDP), you don't have to pass the --master yarn parameter, because the Spark service on an HDP cluster is configured to run in YARN mode by default.
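If you want to confirm which master the shell actually picks up from the cluster defaults, one way (assuming the usual HDP Spark2 layout, where the defaults live in /etc/spark2/conf/spark-defaults.conf) is:

grep spark.master /etc/spark2/conf/spark-defaults.conf

or, from inside a running spark-shell:

scala> sc.master   // should print "yarn" if the cluster default was applied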