<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question When Container is killed, Yarn/Spark doesn't give me a new container and my app is like a Zombie in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/When-Container-is-killed-Yarn-Spark-doesn-t-give-me-a-new/m-p/318763#M227535</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have a problem with my Spark app running in cluster mode. The app has 2 containers: one driver and one executor.&lt;/P&gt;&lt;P&gt;When the executor container is killed (by an error or otherwise), YARN doesn't allocate a replacement container, and the app doesn't kill the current attempt to restart the job. The job becomes a zombie: the driver doesn't notice the failure and continues without reporting any error.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Maybe I missed something in the YARN configuration or the spark-submit parameters.&lt;/P&gt;&lt;P&gt;Have you ever encountered this situation?&lt;/P&gt;</description>
    <pubDate>Wed, 16 Jun 2021 08:37:34 GMT</pubDate>
    <dc:creator>ChocoChoco</dc:creator>
    <dc:date>2021-06-16T08:37:34Z</dc:date>
  </channel>
</rss>