09-17-2015 06:22 PM - edited 09-17-2015 06:25 PM
We have a Spark application that should be executed only once per node (we are using YARN as the resource manager), i.e. in only one JVM per node.
I know it is possible to define the number of executors for a Spark application with the --num-executors parameter, but this sets the number of executors for the whole cluster, and if my understanding is correct, the application could end up with two executors on the same node.
Thus my question is whether it is possible to limit the number of executors per node for a Spark application (in this case one executor per node, so e.g. with 8 nodes the application should get 8 executors, but at most one executor per node).
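For context, one workaround people sometimes use (since Spark on YARN does not expose a direct "max executors per node" setting) is to size each executor so it consumes roughly the whole container capacity a NodeManager offers, so YARN can only fit one executor per node. Below is a minimal sketch, assuming hypothetical nodes that expose about 28 GB and 8 vcores to YARN (via yarn.nodemanager.resource.memory-mb and yarn.nodemanager.resource.cpu-vcores); the figures and the jar name are placeholders to adjust for your cluster:

  # Request 8 executors, each sized to (almost) fill one node's YARN capacity,
  # so the scheduler cannot place two executors on the same node.
  spark-submit \
    --master yarn \
    --num-executors 8 \
    --executor-cores 7 \
    --executor-memory 24G \
    --conf spark.yarn.executor.memoryOverhead=3072 \
    your-application.jar

Note that executor-memory plus the memoryOverhead must stay within the node's YARN allocation (here 24 GB + 3 GB < 28 GB), and this only discourages, not strictly guarantees, co-location if the nodes have spare capacity.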
Thanks and kr, Jochen