I can see a lot of PySpark jobs stuck at Progress (10%) with a fixed memory allocation. How do I enable dynamic memory deallocation? Developers are running PySpark jobs inside the Zeppelin interpreter and spark-shell.
Any suggestions on the above issue?
@Sushant, there are mainly two ways to enable Dynamic Resource Allocation (DRA).
1) Spark level
Enable DRA in Spark itself by following the setup instructions below. This way, every Spark application will run with DRA enabled.
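As a sketch of what the cluster-wide setup typically looks like, the following properties can be added to `spark-defaults.conf` (the executor counts and timeout below are illustrative values you should tune for your cluster):

```properties
# Turn on dynamic resource allocation
spark.dynamicAllocation.enabled              true
# DRA on YARN requires the external shuffle service so executors
# can be removed without losing shuffle data
spark.shuffle.service.enabled                true
# Illustrative bounds - tune these for your workload
spark.dynamicAllocation.minExecutors         1
spark.dynamicAllocation.initialExecutors     2
spark.dynamicAllocation.maxExecutors         10
# Release executors that sit idle for this long
spark.dynamicAllocation.executorIdleTimeout  60s
```

Note that on YARN the external shuffle service also has to be registered with the NodeManagers (the `spark_shuffle` aux service) before `spark.shuffle.service.enabled` takes effect.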
2) Zeppelin level
You can choose to enable DRA only at the Zeppelin interpreter level. You can use the Livy interpreter to run spark-shell or PySpark jobs.
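With the Livy interpreter, Spark properties are set in Zeppelin's interpreter settings using the `livy.spark.*` prefix. A minimal sketch (again, the executor counts are illustrative):

```properties
# In Zeppelin: Interpreter -> livy -> Properties
livy.spark.dynamicAllocation.enabled                       true
livy.spark.shuffle.service.enabled                         true
livy.spark.dynamicAllocation.minExecutors                  1
livy.spark.dynamicAllocation.initialExecutors              2
livy.spark.dynamicAllocation.maxExecutors                  10
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout     60s
```

After saving the properties, restart the Livy interpreter so new sessions pick up the DRA settings; this way only notebooks bound to the Livy interpreter run with DRA, while other Spark applications keep their existing fixed allocation.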