Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

spark-shell vs spark-submit command

Frequent Visitor

We are running Spark jobs using the spark-shell command, e.g.:

spark-shell --conf config_file_details --driver-memory 4G --executor-memory 4G -i spark_job.scala

We are not using the spark-submit command to submit the job; instead we are using spark-shell.

Could you please advise which will be faster in terms of performance?

Thanks,

1 ACCEPTED SOLUTION


@Avinash A Spark shell is intended only for testing and perhaps development of small applications - it is only an interactive shell and should not be used to run production Spark applications. For production application deployment you should use spark-submit, which will also allow you to run applications in yarn-cluster mode.
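As a rough illustration of the difference, the spark-shell invocation from the question could be reworked for spark-submit roughly as follows. This is a sketch, not a drop-in command: it assumes the code in spark_job.scala has been packaged into an application jar (e.g. with sbt package), and the jar name and main class (com.example.SparkJob) are hypothetical placeholders.

```shell
# Hypothetical spark-submit equivalent of the spark-shell command above.
# Assumes spark_job.scala has been compiled and packaged into spark-job.jar
# with a main class com.example.SparkJob (both names are placeholders).
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf config_file_details \
  --driver-memory 4G \
  --executor-memory 4G \
  --class com.example.SparkJob \
  spark-job.jar
```

Note that --deploy-mode cluster (yarn-cluster mode) runs the driver inside the cluster, which spark-shell cannot do since it needs an interactive driver on the local machine.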

HTH

*** If you found this answer addressed your question, please take a moment to log in and click the "accept" link on the answer.


3 REPLIES


@Avinash A If the above answer helped you, please take a moment to log in and click the "accept" link on the answer.

Frequent Visitor

Thanks a lot! Info was very useful.