spark-shell vs spark-submit command

Explorer

We are running Spark jobs using the spark-shell command, e.g.:

spark-shell --conf config_file_details --driver-memory 4G --executor-memory 4G -i spark_job.scala

We are not using the spark-submit command to submit the job; instead we are using the spark-shell command.

Could you please advise which is faster in terms of performance?

Thanks,

1 ACCEPTED SOLUTION

avatar

@Avinash A The Spark shell is intended only for testing and perhaps the development of small applications - it is only an interactive shell and should not be used to run production Spark applications. For production application deployment you should use spark-submit. The latter will also allow you to run applications in yarn-cluster mode.
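For example, a job like the one above would typically be packaged into a jar and submitted roughly like this (a sketch only - the class name, jar name, and properties file are placeholders, not your actual values):

spark-submit --master yarn --deploy-mode cluster --driver-memory 4G --executor-memory 4G --properties-file spark_job.conf --class com.example.SparkJob spark-job.jar

Submitting in yarn-cluster mode this way means the driver runs inside the cluster rather than in your local shell session, which is what you want for unattended production jobs.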

HTH

*** If this answer addressed your question, please take a moment to log in and click the "accept" link on the answer.


3 REPLIES

@Avinash A If the above answer helped you, please take a moment to log in and click the "accept" link on the answer.

Explorer

Thanks a lot! The info was very useful.