<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark on Yarn - How to run multiple tasks in a Spark Resource Pool in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286996#M212800</link>
    <description>&lt;P&gt;By increasing the number of cores/executors and the driver/executor memory, I was able to verify that around 6 tasks run in parallel at a time.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and Regards,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Sudhindra&lt;/P&gt;</description>
    <pubDate>Tue, 07 Jan 2020 09:29:07 GMT</pubDate>
    <dc:creator>ssk26</dc:creator>
    <dc:date>2020-01-07T09:29:07Z</dc:date>
    <item>
      <title>Spark on Yarn - How to run multiple tasks in a Spark Resource Pool</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286984#M212790</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am running Spark jobs on YARN, on HDP 3.1.1.0-78.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have set the Spark scheduler mode to FAIR by setting the parameter "spark.scheduler.mode" to FAIR.&amp;nbsp; The fairscheduler.xml is shown in the screenshot below.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
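&lt;P&gt;(For reference: a minimal fairscheduler.xml that defines a "production" pool typically looks like the sketch below. This is an illustrative example, not necessarily the exact file used here; schedulingMode, weight, and minShare are the standard Spark pool properties.)&lt;/P&gt;
&lt;PRE&gt;&amp;lt;?xml version="1.0"?&amp;gt;
&amp;lt;allocations&amp;gt;
  &amp;lt;!-- pool names are matched against the spark.scheduler.pool property --&amp;gt;
  &amp;lt;pool name="production"&amp;gt;
    &amp;lt;schedulingMode&amp;gt;FAIR&amp;lt;/schedulingMode&amp;gt;
    &amp;lt;weight&amp;gt;2&amp;lt;/weight&amp;gt;
    &amp;lt;minShare&amp;gt;2&amp;lt;/minShare&amp;gt;
  &amp;lt;/pool&amp;gt;
&amp;lt;/allocations&amp;gt;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;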
&lt;P&gt;I have also configured my program to use the "production" pool.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
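&lt;P&gt;(For reference, a pool is selected per submitting thread; a minimal sketch in PySpark, assuming an existing SparkContext "sc":)&lt;/P&gt;
&lt;PRE&gt;# Jobs submitted from this thread after this call run in the "production" pool.
# "spark.scheduler.pool" is a thread-local property: threads that never
# set it submit their jobs to the "default" pool instead.
sc.setLocalProperty("spark.scheduler.pool", "production")

# Clear the property to return this thread to the default pool.
sc.setLocalProperty("spark.scheduler.pool", None)&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;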
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Spark_Fair_Scheduler_1.PNG" style="width: 486px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/25943i8951B456BA2B56F2/image-dimensions/486x340?v=v2" width="486" height="340" role="button" title="Spark_Fair_Scheduler_1.PNG" alt="Spark_Fair_Scheduler_1.PNG" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Upon running the job, I observed that although 4 stages are running, only 1 stage runs under the "production" pool and the remaining 3 run under the "default" pool.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;So, at any point in time, only 2 tasks are running in parallel.&amp;nbsp; If I want to make sure that 3 or more tasks run in parallel, then 2 tasks should run under "production" and the remaining 2 under "default".&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Is there a programmatic way to achieve this, for example by setting configuration parameters?&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
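&lt;P&gt;(For reference, the two scheduler settings involved can be supplied at submit time; an illustrative sketch, in which the allocation-file path and the application name are placeholders:)&lt;/P&gt;
&lt;PRE&gt;spark-submit \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.scheduler.allocation.file=/path/to/fairscheduler.xml \
  your_app.py&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;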
&lt;P&gt;Any inputs would be really helpful.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks and Regards,&lt;/P&gt;
&lt;P&gt;Sudhindra&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jan 2020 06:49:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286984#M212790</guid>
      <dc:creator>ssk26</dc:creator>
      <dc:date>2020-01-07T06:49:25Z</dc:date>
    </item>
    <item>
      <title>Re: Spark on Yarn - How to run multiple tasks in a Spark Resource Pool</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286986#M212791</link>
      <description>&lt;P&gt;Additional Information:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Active_Stages.PNG" style="width: 999px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/25944i5E16DD23A94E757E/image-size/large?v=v2&amp;amp;px=999" role="button" title="Active_Stages.PNG" alt="Active_Stages.PNG" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;As we can see, even though 3 stages are active, only 1 task each is running in the Production and Default pools.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My basic question is: how can we increase the parallelism within pools?&amp;nbsp; In other words, how can I make sure that Stage ID "8" in the above screenshot also runs in parallel with the other 2?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and Regards,&lt;/P&gt;&lt;P&gt;Sudhindra&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jan 2020 06:15:11 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286986#M212791</guid>
      <dc:creator>ssk26</dc:creator>
      <dc:date>2020-01-07T06:15:11Z</dc:date>
    </item>
    <item>
      <title>Re: Spark on Yarn - How to run multiple tasks in a Spark Resource Pool</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286996#M212800</link>
      <description>&lt;P&gt;By increasing the number of cores/executors and the driver/executor memory, I was able to verify that around 6 tasks run in parallel at a time.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and Regards,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Sudhindra&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jan 2020 09:29:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-How-to-run-multiple-tasks-in-a-Spark-Resource/m-p/286996#M212800</guid>
      <dc:creator>ssk26</dc:creator>
      <dc:date>2020-01-07T09:29:07Z</dc:date>
    </item>
  </channel>
</rss>

