<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark job keeps on running even after killing application using yarn kill in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213463#M175391</link>
    <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/44326/ravindranathoruganti.html" nodeid="44326"&gt;@Ravindranath Oruganti&lt;/A&gt;, you are running the Spark driver in yarn-client mode, i.e. on the machine where you initiated the spark-submit command. You must also kill this driver process on the machine where you issued the command.&lt;/P&gt;</description>
    <pubDate>Wed, 04 Oct 2017 20:03:13 GMT</pubDate>
    <dc:creator>smayani</dc:creator>
    <dc:date>2017-10-04T20:03:13Z</dc:date>
    <item>
      <title>Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213462#M175390</link>
      <description>&lt;P&gt;I am submitting a Spark job using a shell script as follows:&lt;/P&gt;&lt;P&gt;nohup spark-submit --master yarn-client --py-files libs.zip parser.py --jobname dicomparser &amp;gt;&amp;gt; out.log &amp;amp;&lt;/P&gt;&lt;P&gt;The job started the required processing; partway through I killed it using&lt;/P&gt;&lt;P&gt;yarn application -kill &amp;lt;application_id&amp;gt;&lt;/P&gt;&lt;P&gt;and the job disappeared from the pending list, but I could see the processing still going on.&lt;/P&gt;&lt;P&gt;What am I doing wrong?&lt;/P&gt;</description>
      <pubDate>Wed, 04 Oct 2017 18:34:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213462#M175390</guid>
      <dc:creator>ravindranath_or</dc:creator>
      <dc:date>2017-10-04T18:34:35Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213463#M175391</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/44326/ravindranathoruganti.html" nodeid="44326"&gt;@Ravindranath Oruganti&lt;/A&gt;, you are running the Spark driver in yarn-client mode, i.e. on the machine where you initiated the spark-submit command. You must also kill this driver process on the machine where you issued the command.&lt;/P&gt;</description>
      <pubDate>Wed, 04 Oct 2017 20:03:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213463#M175391</guid>
      <dc:creator>smayani</dc:creator>
      <dc:date>2017-10-04T20:03:13Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213464#M175392</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/220/smayani.html" nodeid="220"&gt;@Saumil Mayani&lt;/A&gt;, thanks for the reply. I tried that: I killed the process named org.apache.spark.deploy.SparkSubmit as well as the YARN application. The processing is still going on.&lt;/P&gt;</description>
      <pubDate>Wed, 04 Oct 2017 20:41:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213464#M175392</guid>
      <dc:creator>ravindranath_or</dc:creator>
      <dc:date>2017-10-04T20:41:51Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213465#M175393</link>
      <description>&lt;P&gt;Could you share which process is still running (driver, AM, or executors) and where (the local machine, or a YARN NodeManager server)?&lt;/P&gt;</description>
      <pubDate>Wed, 04 Oct 2017 21:20:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213465#M175393</guid>
      <dc:creator>smayani</dc:creator>
      <dc:date>2017-10-04T21:20:55Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213466#M175394</link>
      <description>&lt;P&gt;I am not sure which process is running, but my files (which are part of the job's processing) are still being processed after killing the job.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Oct 2017 01:31:11 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213466#M175394</guid>
      <dc:creator>ravindranath_or</dc:creator>
      <dc:date>2017-10-05T01:31:11Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213467#M175395</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/44326/ravindranathoruganti.html" nodeid="44326"&gt;@Ravindranath Oruganti&lt;/A&gt;, could you run the following on all servers and check?&lt;/P&gt;&lt;PRE&gt; ps -ef | grep -i parser.py&lt;BR /&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 05 Oct 2017 03:21:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213467#M175395</guid>
      <dc:creator>smayani</dc:creator>
      <dc:date>2017-10-05T03:21:23Z</dc:date>
    </item>
    <item>
      <title>Re: Spark job keeps on running even after killing application using yarn kill</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213468#M175396</link>
      <description>&lt;P&gt;Thanks a lot &lt;A rel="user" href="https://community.cloudera.com/users/220/smayani.html" nodeid="220"&gt;@Saumil Mayani&lt;/A&gt;. A process was still running for parser.py. After killing those processes, the job stopped. Thanks again.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Oct 2017 12:32:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-job-keeps-on-running-even-after-killing-application/m-p/213468#M175396</guid>
      <dc:creator>ravindranath_or</dc:creator>
      <dc:date>2017-10-05T12:32:09Z</dc:date>
    </item>
  </channel>
</rss>
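The resolution in this thread — a yarn-client driver process survives `yarn application -kill` and must be found with `ps -ef | grep` and killed on the submitting machine — can be sketched as the shell pattern below. It uses a background `sleep` as a stand-in for the `spark-submit ... parser.py` process (so it runs anywhere, with no cluster); on a real cluster the grep target would be `parser.py` or `org.apache.spark.deploy.SparkSubmit`, and the `yarn application -kill &lt;application_id&gt;` step from the thread would come first.

```shell
# Stand-in for the nohup'd spark-submit driver process from the thread.
nohup sleep 300 >/dev/null 2>&1 &
DRIVER_PID=$!

# Step 1 (from the accepted answer): find the lingering process by name.
# The [s] bracket trick keeps grep from matching its own command line.
ps -ef | grep -i "[s]leep 300"

# Step 2: kill the local driver process. On a real cluster this is the
# parser.py / SparkSubmit process on the machine that ran spark-submit.
kill "$DRIVER_PID"
sleep 1

# Step 3: confirm it is gone.
if ps -p "$DRIVER_PID" >/dev/null 2>&1; then
  echo "still running"
else
  echo "stopped"
fi
```

The key point from the thread: `yarn application -kill` tears down the AM and executors, but in yarn-client mode the driver is a plain local process outside YARN's control, so it must be killed separately.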

