New Contributor
Posts: 2
Registered: ‎09-17-2015

Limit number of executors per node for Spark application


Hello,

we have a Spark application that should be executed only once per node (we are using YARN as the resource manager), i.e. run in only one JVM per node.

I know it is possible to define the number of executors for a Spark application with the --num-executors parameter, but that sets the number of executors for the whole cluster, and if my understanding is correct, the application could then end up running in two executors on the same node.
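
For illustration, a submission using that parameter might look like the following (a sketch only; the class name, jar, and resource values are placeholders):

    spark-submit --master yarn \
        --num-executors 8 \
        --executor-cores 4 \
        --executor-memory 8g \
        --class com.example.MyApp \
        myapp.jar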

 

So my question is: is it possible to limit the number of executors per node for a Spark application? In our case we want exactly one executor per node, e.g. with 8 nodes there should be 8 executors for this Spark application, but at most one executor on each node.

 

Thanks and kind regards, Jochen

 

 

Cloudera Employee
Posts: 366
Registered: ‎07-29-2013

Re: Limit number of executors per node for Spark application

This won't be possible; it's not how Spark's model works. Work
executes on available resources rather than on particular machines.
This isn't a job for Spark. It could instead be accomplished by
directly running the job on each machine, possibly by repeatedly
querying the YARN API until you find free resources on each of the
machines you want. But there's no guarantee that all the machines
will have YARN resources free.
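
For example, a rough sketch of that polling approach against the YARN ResourceManager REST API (the RM host, target node, and thresholds below are placeholders, and the JSON field names should be verified against the YARN REST docs for your Hadoop version):

    # Poll the RM until node1 reports enough free capacity, then start
    # the job there. Assumes jq is available for JSON parsing.
    until curl -s "http://resourcemanager:8088/ws/v1/cluster/nodes" \
        | jq -e '.nodes.node[]
                 | select(.nodeHostName == "node1"
                          and .availMemoryMB >= 8192
                          and .availableVirtualCores >= 4)' > /dev/null; do
      sleep 30
    done
    # ...then launch the per-node job on node1...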

New Contributor
Posts: 2
Registered: ‎09-17-2015

Re: Limit number of executors per node for Spark application

ok thanks!