<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Spark max number of executor to 1 in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/335786#M232030</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/92928"&gt;@loridigia&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If &lt;STRONG&gt;dynamic allocation&lt;/STRONG&gt; is &lt;STRONG&gt;not&lt;/STRONG&gt; enabled for the cluster/application and you set &lt;STRONG&gt;--conf&amp;nbsp;&lt;/STRONG&gt;&lt;SPAN&gt;&lt;STRONG&gt;spark.executor.instances=1&lt;/STRONG&gt;, then Spark will launch only &lt;STRONG&gt;1 executor&lt;/STRONG&gt;. Apart from the executor, you will also see the AM/driver in the Executors tab of the Spark UI.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 08 Feb 2022 12:01:53 GMT</pubDate>
    <dc:creator>RangaReddy</dc:creator>
    <dc:date>2022-02-08T12:01:53Z</dc:date>
    <item>
      <title>Spark max number of executor to 1</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/334471#M231781</link>
      <description>&lt;P&gt;Hi everybody, I'm submitting jobs to a YARN cluster via SparkLauncher.&lt;BR /&gt;I'm on HDP 3.1.4.0.&lt;BR /&gt;I'd like to have only 1 executor for each job I run (since I often find 2 executors per job), with the resources that I decide (provided, of course, those resources are available on a machine).&lt;BR /&gt;So I tried to add&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;.setConf(&lt;SPAN&gt;"spark.executor.instances"&lt;/SPAN&gt;&lt;SPAN&gt;, &lt;/SPAN&gt;&lt;SPAN&gt;"1"&lt;/SPAN&gt;)&lt;/PRE&gt;&lt;PRE&gt;.setConf(&lt;SPAN&gt;"spark.executor.cores"&lt;/SPAN&gt;&lt;SPAN&gt;, &lt;/SPAN&gt;&lt;SPAN&gt;"3"&lt;/SPAN&gt;)&lt;/PRE&gt;&lt;P&gt;But even though I set spark.executor.instances to 1, I get 2 executors. Do you know why? (I read somewhere that the number of executors =&amp;nbsp;&amp;nbsp;&lt;SPAN&gt;spark.executor.instances *&amp;nbsp;spark.executor.cores.&lt;BR /&gt;I don't know if that's true, but it seems to be.)&lt;BR /&gt;Is there a way to achieve my goal of having MIN and MAX 1 executor for each job?&lt;BR /&gt;Could it be achieved with dynamicAllocation? (I'd prefer not to set that, since it's not designed for this and can do a lot of things I don't need.) Thanks in advance!&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 24 Jan 2022 20:00:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/334471#M231781</guid>
      <dc:creator>loridigia</dc:creator>
      <dc:date>2022-01-24T20:00:34Z</dc:date>
    </item>
    <item>
      <title>Re: Spark max number of executor to 1</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/335516#M231964</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/92928"&gt;@loridigia&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;I tried running the sample job below, and I see only one executor container and one driver container.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# cd /usr/hdp/current/spark2-client
# su spark

$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --num-executors 1 --driver-memory 512m --executor-memory 512m --executor-cores 2 examples/jars/spark-examples*.jar 10000&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;spark-shell also limits the executors to one when we pass &lt;STRONG&gt;--num-executors 1&lt;/STRONG&gt;:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;$ spark-shell --num-executors 1&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;- What is the spark-submit command you are trying to run?&lt;BR /&gt;- Are you seeing the same issue with the above sample job?&lt;/P&gt;</description>
      <pubDate>Wed, 02 Feb 2022 16:26:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/335516#M231964</guid>
      <dc:creator>Deepan_N</dc:creator>
      <dc:date>2022-02-02T16:26:43Z</dc:date>
    </item>
    <item>
      <title>Re: Spark max number of executor to 1</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/335786#M232030</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/92928"&gt;@loridigia&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If &lt;STRONG&gt;dynamic allocation&lt;/STRONG&gt; is &lt;STRONG&gt;not&lt;/STRONG&gt; enabled for the cluster/application and you set &lt;STRONG&gt;--conf&amp;nbsp;&lt;/STRONG&gt;&lt;SPAN&gt;&lt;STRONG&gt;spark.executor.instances=1&lt;/STRONG&gt;, then Spark will launch only &lt;STRONG&gt;1 executor&lt;/STRONG&gt;. Apart from the executor, you will also see the AM/driver in the Executors tab of the Spark UI.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 08 Feb 2022 12:01:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-max-number-of-executor-to-1/m-p/335786#M232030</guid>
      <dc:creator>RangaReddy</dc:creator>
      <dc:date>2022-02-08T12:01:53Z</dc:date>
    </item>
  </channel>
</rss>