Created 06-15-2018 03:38 PM
We are running Spark jobs using the spark-shell command, e.g.:
spark-shell --conf config_file_details --driver-memory 4G --executor-memory 4G -i spark_job.scala
We are not using the spark-submit command to submit jobs; instead we are using spark-shell.
Could you please advise which will be faster in terms of performance?
Thanks,
Created 06-15-2018 04:16 PM
@Avinash A Spark shell is only intended to be used for testing and perhaps development of small applications. It is only an interactive shell and should not be used to run production Spark applications. For production application deployment you should use spark-submit, which will also allow you to run applications in yarn-cluster mode.
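For reference, a minimal sketch of the equivalent spark-submit invocation. The jar name and main class below are hypothetical: your spark_job.scala code would first need to be wrapped in an object with a main() method and compiled into an application jar (e.g. with sbt package):

```shell
# Hypothetical example: spark_job.jar and com.example.SparkJob are placeholder
# names for your packaged application and its fully-qualified main class.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4G \
  --executor-memory 4G \
  --class com.example.SparkJob \
  spark_job.jar
```

With --deploy-mode cluster the driver runs inside the YARN cluster rather than on the gateway host, which is what spark-shell cannot do.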
HTH
*** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
Created 06-18-2018 01:33 PM
@Avinash A If the above answer helped you, please take a moment to login and click the "accept" link on the answer.
Created 06-18-2018 02:02 PM
Thanks a lot! Info was very useful.