
Is there any way to find out whether a Spark session was started in local mode or YARN mode?

Gaurav Parmar (Explorer)

We are facing an issue where users are starting their Spark sessions in local mode instead of YARN mode. We need to identify those users and explain how to start sessions in YARN mode so they don't overload the local node.

Is there any document on Spark best practices that we can use to educate users on the best ways to work with Spark and how to launch a Spark session?

ACCEPTED SOLUTION


@Gaurav Parmar

Check the Spark History Server UI to see which applications were run in local mode:

[Screenshot: Spark History Server UI listing completed applications with local-* and application_* IDs]

In the screenshot above, applications whose IDs start with local-* were launched in local mode, while those starting with application_* were launched on the YARN master.
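
If you want to pull the same information without clicking through the UI, the Spark History Server also exposes a REST API that lists applications. The sketch below is a minimal Scala example; the history server address is a hypothetical placeholder, and it assumes event logging is configured so that local-mode runs appear in the history server at all. It fetches the application list and splits the IDs by prefix, mirroring the check described above:

import scala.io.Source

// Hypothetical History Server address; adjust to your environment.
val historyServer = "http://historyserver.example.com:18080"

// The History Server REST API returns a JSON array of applications.
val json = Source.fromURL(s"$historyServer/api/v1/applications").mkString

// Crude ID extraction with a regex; a real script would use a JSON library.
val idPattern = """"id"\s*:\s*"([^"]+)"""".r
val appIds = idPattern.findAllMatchIn(json).map(_.group(1)).toList

// local-* IDs were launched in local mode, application_* IDs on YARN.
val localApps = appIds.filter(_.startsWith("local-"))
val yarnApps  = appIds.filter(_.startsWith("application_"))

println(s"Local-mode applications (${localApps.size}):")
localApps.foreach(println)
println(s"YARN applications (${yarnApps.size}):")
yarnApps.foreach(println)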

If you would like to switch the default from local to YARN, you can add export MASTER=yarn to spark-env.sh so that users who forget to pass --master yarn will run on the YARN master by default.
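
To double-check which mode a particular session actually ended up in (for example, after changing the default), a user can inspect the master URL from inside the session. A minimal Scala sketch; in spark-shell the session already exists as spark, so getOrCreate() simply returns it:

import org.apache.spark.sql.SparkSession

// In spark-shell this session already exists as `spark`;
// getOrCreate() simply returns it.
val spark = SparkSession.builder().getOrCreate()

// The master URL reveals how the session was started:
//   "local", "local[*]", "local[N]" -> local mode
//   "yarn"                          -> YARN mode
val master = spark.sparkContext.master

if (master.startsWith("local"))
  println(s"Running in local mode (master = $master)")
else if (master == "yarn")
  println(s"Running on YARN (master = $master)")
else
  println(s"Running with master = $master")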

Please let me know if this helps you.

HTH




Gaurav Parmar (Explorer)

Thanks for sharing the detailed information. I will try updating spark-env so that YARN is the default mode.