Contributor
Posts: 48
Registered: 07-05-2018

spark2-shell consuming YARN resources even if it's idle

Hello Team,

 

We have an issue where any user running the spark2-shell command triggers, by default, a 1-container job in the YARN UI.

 

If the user is not running any processing or code inside spark2-shell and keeps the session idle, the containers are not released.

 

Is there any way we can set a timeout on spark2-shell so that it exits after being idle for some time?

 

- Vijay M

Master
Posts: 426
Registered: 07-01-2015

Re: spark2-shell consuming YARN resources even if it's idle

I am not 100% sure, but try turning on YARN dynamic executor allocation:

spark.dynamicAllocation.enabled = true
spark.dynamicAllocation.minExecutors = 0

This should "remove" the unused containers.
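For reference, a minimal sketch of the relevant settings (assuming Spark 2.x on YARN; note that dynamic allocation also requires the external shuffle service, and `executorIdleTimeout` controls how long an idle executor is kept before release):

```properties
spark.dynamicAllocation.enabled=true
spark.shuffle.service.enabled=true
spark.dynamicAllocation.minExecutors=0
spark.dynamicAllocation.executorIdleTimeout=60s
```

Keep in mind this only releases executor containers; the driver / Application Master container of the shell session itself stays allocated until the shell exits.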
Contributor
Posts: 48
Registered: 07-05-2018

Re: spark2-shell consuming YARN resources even if it's idle

@Tomas79

 

On the cluster I am working with, both properties you mentioned are enabled by default. I tried disabling them, but that resulted in 3 containers being started when users ran the spark2-shell command.

 

I reverted the change and went back to the defaults.

 

The problem is:

 

My cluster has 40 cores. If 40 users each just run the spark2-shell command and do nothing, those 40 sessions consume all 40 cores and no resources are left in the cluster.

 

I need something like a timeout: if spark2-shell is idle for, say, 1 minute, the user should be exited from spark2-shell.
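Spark itself has no built-in shell idle timeout, so one workaround is an external watchdog cron job. Below is a hypothetical sketch (not a Spark feature): it scans `ps` output for spark2-shell processes whose total age exceeds a cap. Loud caveat: process age is not idleness, so an actively used long-running session would also be flagged; the `MAX_AGE_SECONDS` value and the whole approach are assumptions, not something from the thread.

```python
#!/usr/bin/env python3
"""Hypothetical watchdog sketch: list spark2-shell sessions older than a cap.

Caveat: this checks total session age, not true idleness.
"""
import subprocess

MAX_AGE_SECONDS = 3600  # assumption: cap shell sessions at 1 hour


def stale_spark_shells(ps_output, max_age=MAX_AGE_SECONDS):
    """Parse `ps -eo pid=,etimes=,args=` output and return the PIDs of
    spark2-shell processes whose elapsed time exceeds max_age seconds."""
    stale = []
    for line in ps_output.splitlines():
        parts = line.split(None, 2)
        if len(parts) < 3:
            continue
        pid, age, args = parts
        if "spark2-shell" in args and int(age) > max_age:
            stale.append(int(pid))
    return stale


if __name__ == "__main__":
    out = subprocess.check_output(["ps", "-eo", "pid=,etimes=,args="],
                                  text=True)
    for pid in stale_spark_shells(out):
        # A real cron job might send SIGTERM here, e.g.
        # os.kill(pid, signal.SIGTERM); this sketch only reports.
        print("stale spark2-shell session:", pid)
```

Run from cron every few minutes; switching the `print` to a `kill` would forcibly free the YARN containers of long-lived sessions.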

 

Kindly suggest.

 

- Vijay M