
spark dynamic allocation setting

Expert Contributor

Hi All,

I was testing Spark dynamic resource allocation. By default I see that "spark-thrift-sparkconf.conf" contains all the dynamic allocation properties. But when I run the job "spark-shell --master yarn --num-executors 5 --executor-memory 3G", I expected it to complain, since I've requested the number of executors in the job itself.

Then I modified the custom spark-defaults.conf and added the dynamic allocation properties:

spark.dynamicAllocation.enabled true
spark.dynamicAllocation.initialExecutors 1
spark.dynamicAllocation.maxExecutors 5
spark.dynamicAllocation.minExecutors 1
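
For reference, the same properties can also be passed per job with the standard --conf flag instead of editing the file. A minimal sketch:

spark-shell --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=5

(Note there is no --num-executors here, since that would override dynamic allocation again.)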

And when I run the original spark-shell command again, I see the message below:

16/05/23 09:18:54 WARN SparkContext: Dynamic Allocation and num executors both set, thus dynamic allocation disabled. 

It also prints the messages below when more resources are needed. My doubt is: is dynamic allocation enabled by default? In which config should we define the dynamic allocation properties?

16/05/23 09:39:47 INFO ExecutorAllocationManager: Requesting 2 new executors because tasks are backlogged (new desired total will be 4)
16/05/23 09:39:48 INFO ExecutorAllocationManager: Requesting 1 new executor because tasks are backlogged (new desired total will be 5)
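
For completeness, here is a minimal way to reproduce that backlog in spark-shell (the sizes are arbitrary):

// Run in spark-shell with dynamic allocation enabled and no --num-executors.
// Creating many more tasks than available executor slots produces a backlog,
// so the ExecutorAllocationManager requests executors up to maxExecutors.
val rdd = sc.parallelize(1 to 1000000, 200) // 200 tasks
rdd.map(_ * 2).count()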
3 REPLIES

Super Guru

Hi @nyadav

As per the doc here on running Spark on YARN:

"The number of executors. Note that this property is incompatible withspark.dynamicAllocation.enabled. If bothspark.dynamicAllocation.enabled and spark.executor.instances are specified, dynamic allocation is turned off and the specified number ofspark.executor.instances is used".

Expert Contributor

Yes @Jitendra Yadav, I can see the same in the logs: spark.executor.instances overrides the dynamic allocation properties. But my question is where we should define the dynamic allocation settings: in spark-defaults.conf or spark-thrift-sparkconf.conf?

Super Guru
@nyadav

By default, dynamic allocation is enabled only for the Spark Thrift Server, which is why you see those properties in spark-thrift-sparkconf.conf. For shell commands you need to define it in spark-defaults.conf. Please refer to this doc for more info.
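
As a sketch, a minimal spark-defaults.conf for shell jobs could look like this (assuming the YARN external shuffle service is already configured, which dynamic allocation requires):

spark.dynamicAllocation.enabled true
spark.shuffle.service.enabled true
spark.dynamicAllocation.initialExecutors 1
spark.dynamicAllocation.minExecutors 1
spark.dynamicAllocation.maxExecutors 5

Then launch spark-shell without --num-executors so dynamic allocation stays in effect.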