Support Questions

Find answers, ask questions, and share your expertise

Spark Session timeout

Explorer

I can see a lot of PySpark jobs stuck at Progress (10%) with a fixed memory allocation. How do I enable dynamic memory deallocation? Developers are running PySpark jobs inside the Zeppelin interpreter and the Spark shell.

Any suggestions on the above issue?

1 REPLY

Guru

@Sushant, there are two main ways to enable Dynamic Resource Allocation (DRA).

1) Spark level

Enable DRA in Spark itself by following the setup instructions below. This way, all Spark applications will run with DRA enabled.

https:///content/supportkb/49510/how-to-enable-dynamic-resource-allocation-in-spark.html
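As a rough sketch of what cluster-wide DRA looks like, the following properties can go into `spark-defaults.conf` (the executor counts here are placeholder values, not recommendations; tune them for your cluster). DRA also requires the external shuffle service so executors can be released without losing shuffle data:

```
# Enable Dynamic Resource Allocation for all Spark applications
spark.dynamicAllocation.enabled          true

# Required so shuffle data survives when idle executors are removed
spark.shuffle.service.enabled            true

# Example bounds on executor scaling (placeholder values - tune per cluster)
spark.dynamicAllocation.minExecutors     1
spark.dynamicAllocation.initialExecutors 2
spark.dynamicAllocation.maxExecutors     20

# Release executors that have been idle for this long
spark.dynamicAllocation.executorIdleTimeout 60s
```

On YARN, the shuffle service additionally has to be registered as an auxiliary service in the NodeManager configuration; see the linked article for the full setup.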

2) Zeppelin level

Alternatively, you can enable DRA only for the Zeppelin interpreter. Use the Livy interpreter to run spark-shell or PySpark jobs.

https://zeppelin.apache.org/docs/0.6.1/interpreter/livy.html
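With the Livy interpreter, DRA can be turned on per interpreter via properties in the Zeppelin interpreter settings (Interpreter menu → livy). The Livy interpreter forwards `livy.spark.*` properties to the Spark session it launches; the executor counts below are illustrative placeholders:

```
# Zeppelin Livy interpreter properties (Interpreter settings UI)
livy.spark.dynamicAllocation.enabled                    true
livy.spark.shuffle.service.enabled                      true

# Example scaling bounds (placeholder values - tune per workload)
livy.spark.dynamicAllocation.minExecutors               1
livy.spark.dynamicAllocation.initialExecutors           2
livy.spark.dynamicAllocation.maxExecutors               10

# Release cached executors after they sit idle
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout  60s
```

After saving, restart the Livy interpreter so notebooks pick up the new session settings.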