
Limit number of executors per node for Spark application

New Contributor

Hello,

We have a Spark application that should run only once per node, i.e. in only one JVM per node (we are using YARN as the resource manager).

I know it is possible to set the number of executors for a Spark application with the --num-executors parameter, but that defines the number of executors for the whole cluster, so if my understanding is correct, two executors of the same application could still end up on one node.
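For reference, a minimal spark-submit invocation using that flag could look like the following sketch; the resource sizes and the JAR name are placeholders, not from the original post:

    spark-submit \
      --master yarn \
      --num-executors 8 \
      --executor-cores 4 \
      --executor-memory 8G \
      my-app.jar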

So my question is whether it is possible to limit the number of executors per node for a Spark application; in this case only one executor per node, so that with e.g. 8 nodes the application would get 8 executors, but at most one executor on each node.

Thanks and kind regards, Jochen

2 REPLIES

Re: Limit number of executors per node for Spark application

Master Collaborator
This won't be possible; it's not how Spark's model works. Work executes on available resources rather than on particular machines. This isn't a job for Spark. It could instead be accomplished by running the job directly on each machine, possibly by repeatedly querying the YARN API until you find free resources on each of the machines you want. But even then, there's no guarantee that all the machines have YARN resources free.
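A hedged sketch of that polling idea: the YARN ResourceManager exposes a REST endpoint that lists per-node resources, which a script could check in a loop before placing work on a given machine. The ResourceManager host below is a placeholder, and the JSON field names are assumptions to verify against your Hadoop version's documentation:

    # Poll the ResourceManager's cluster-nodes endpoint for free resources.
    # Field names (availMemoryMB, availableVirtualCores) are assumed from
    # the Hadoop 2.x REST API docs; verify against your cluster.
    RM=http://resourcemanager.example.com:8088
    curl -s "$RM/ws/v1/cluster/nodes" \
      | jq '.nodes.node[] | {id, availMemoryMB, availableVirtualCores}'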

Re: Limit number of executors per node for Spark application

New Contributor
ok thanks!