<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Previously working spark jobs only now throwing &amp;quot;java.io.IOException: Too many open files&amp;quot; error? in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Previously-working-spark-jobs-only-now-throwing-quot-java-io/m-p/295919#M218032</link>
    <description>&lt;P&gt;Hello &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/54172"&gt;@rvillanueva&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;You can check how many threads a user is running with ps -L -u &amp;lt;username&amp;gt; | wc -l, and how many files that user currently has open with lsof -u &amp;lt;username&amp;gt; | wc -l.&lt;/P&gt;&lt;P&gt;If the user's open-files limit (shown by running ulimit -n as that user) is hit, the user cannot open any more files or spawn further threads. The most likely causes in this case are:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;The same user is running other jobs and holding files open on the node where the container is being launched.&lt;/LI&gt;&lt;LI&gt;System threads may have been excluded from the count.&lt;/LI&gt;&lt;LI&gt;Another application running under the same user is consuming file descriptors; check which applications are running and how many files each has open.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Kindly check the application log (application_XXX), if available, to see in which phase the exception is thrown and on which node the issue occurs.&lt;/P&gt;</description>
    <pubDate>Thu, 14 May 2020 10:53:39 GMT</pubDate>
    <dc:creator>Madhur</dc:creator>
    <dc:date>2020-05-14T10:53:39Z</dc:date>
  </channel>
</rss>