<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question spark dynamic allocation setting in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/spark-dynamic-allocation-setting/m-p/104909#M67800</link>
    <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;I was testing dynamic resource allocation in Spark. By default, I see that "spark-thrift-sparkconf.conf" contains all the dynamic allocation properties. But when I ran the Spark job "spark-shell --master yarn --num-executors 5 --executor-memory 3G", I expected it to complain, since I had requested the number of executors in the job itself.&lt;/P&gt;&lt;P&gt;Then I modified the custom spark-defaults.conf and added the dynamic allocation properties:&lt;/P&gt;&lt;PRE&gt;spark.dynamicAllocation.enabled true
spark.dynamicAllocation.initialExecutors 1
spark.dynamicAllocation.maxExecutors 5
spark.dynamicAllocation.minExecutors 1&lt;/PRE&gt;&lt;P&gt;And when I ran the same job, I saw the message below:&lt;/P&gt;&lt;PRE&gt;16/05/23 09:18:54 WARN SparkContext: Dynamic Allocation and num executors both set, thus dynamic allocation disabled. &lt;/PRE&gt;
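&lt;P&gt;For reference, a spark-shell invocation that keeps the other options from my command but omits the explicit executor count (so the spark.dynamicAllocation.* settings above can take effect) might look like this:&lt;/P&gt;&lt;PRE&gt;spark-shell --master yarn --executor-memory 3G&lt;/PRE&gt;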
&lt;P&gt;It also prints the messages below when more resources are needed. My doubt is: is dynamic allocation enabled by default? In which config file should we define the dynamic allocation properties?&lt;/P&gt;&lt;PRE&gt;16/05/23 09:39:47 INFO ExecutorAllocationManager: Requesting 2 new executors because tasks are backlogged (new desired total will be 4)
16/05/23 09:39:48 INFO ExecutorAllocationManager: Requesting 1 new executor because tasks are backlogged (new desired total will be 5)
&lt;/PRE&gt;</description>
    <pubDate>Fri, 27 May 2016 17:53:06 GMT</pubDate>
    <dc:creator>nyadav</dc:creator>
    <dc:date>2016-05-27T17:53:06Z</dc:date>
  </channel>
</rss>

