
Spark Session timeout


I can see a lot of PySpark jobs stuck at Progress (10%) with a fixed memory allocation. How do I enable dynamic memory deallocation? Developers are running PySpark jobs inside the Zeppelin interpreter and the Spark shell.

Any suggestions on the above issue?



@Sushant, there are two main ways to enable Dynamic Resource Allocation (DRA):

1) Spark level

Enable DRA in Spark's cluster-wide configuration. With this approach, every Spark application runs with DRA enabled, so idle executors are released back to the cluster automatically.
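As a minimal sketch, the cluster-wide approach typically means setting the following properties in spark-defaults.conf (the executor counts and timeout below are illustrative values, not recommendations; on YARN, dynamic allocation also requires the external shuffle service to be running on each node):

```properties
# spark-defaults.conf — illustrative sketch of DRA settings
spark.dynamicAllocation.enabled              true
# External shuffle service so executors can be removed safely
spark.shuffle.service.enabled                true
# Example bounds on the executor pool (tune for your cluster)
spark.dynamicAllocation.minExecutors         1
spark.dynamicAllocation.maxExecutors         10
# Release executors that have been idle this long
spark.dynamicAllocation.executorIdleTimeout  60s
```

With these in place, applications that leave executors idle (for example, a Zeppelin notebook sitting between runs) should give that memory back after the idle timeout instead of holding it at a fixed allocation.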


2) Zeppelin level

Alternatively, you can enable DRA only for Zeppelin by setting it in the interpreter configuration. You can use the Livy interpreter to run spark-shell or PySpark jobs, which also lets Zeppelin release a session's resources when it goes idle.
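A sketch of what this looks like in Zeppelin's Livy interpreter settings, assuming you pass Spark properties through with the `livy.` prefix (values again illustrative):

```properties
# Zeppelin Livy interpreter properties — illustrative sketch
livy.spark.dynamicAllocation.enabled                    true
livy.spark.shuffle.service.enabled                      true
livy.spark.dynamicAllocation.minExecutors               1
livy.spark.dynamicAllocation.maxExecutors               10
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout  60s
```

This scopes DRA to notebooks using the Livy interpreter without changing behavior for other Spark applications on the cluster.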
