<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Previously working spark jobs only now throwing &quot;java.io.IOException: Too many open files&quot; error? in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Previously-working-spark-jobs-only-now-throwing-quot-java-io/m-p/295850#M217993</link>
    <description>&lt;P&gt;Your problem may be caused by several factors.&lt;/P&gt;&lt;P&gt;First, I think a limit of 1024 is not enough; you should increase it.&lt;/P&gt;&lt;P&gt;The number of open files can also grow day after day: an application may stream more data from/into split files, a Spark application may import/open more libraries today than it did before, etc.&lt;/P&gt;&lt;P&gt;Please check the files opened by the user that runs the Spark jobs to find the likely cause:&lt;/P&gt;&lt;P&gt;&lt;EM&gt;lsof -u myUser&lt;/EM&gt; (pipe it to &lt;EM&gt;wc -l&lt;/EM&gt; to count them)&lt;/P&gt;&lt;P&gt;Also check &lt;EM&gt;lsof +D directory&lt;/EM&gt; to see how many files each job holds open, how many jobs are running, etc.&lt;/P&gt;
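&lt;P&gt;For example, a quick diagnostic session could look like the sketch below; &lt;EM&gt;myUser&lt;/EM&gt;, the directory path, and the 65536 limit are placeholder examples, not values taken from your cluster:&lt;/P&gt;&lt;PRE&gt;
# Show the current per-process open-file limit for this shell
ulimit -n

# Count every file currently opened by the user that runs the Spark jobs
lsof -u myUser | wc -l

# Break the count down per PID to spot a job that is leaking descriptors
lsof -u myUser | awk '{print $2}' | sort | uniq -c | sort -rn | head

# Count open files under a given directory (e.g. a Spark work/shuffle dir)
lsof +D /path/to/directory | wc -l

# Raise the limit for this session; a persistent change usually goes in
# /etc/security/limits.conf, e.g. "myUser - nofile 65536"
ulimit -n 65536
&lt;/PRE&gt;</description>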
    <pubDate>Wed, 13 May 2020 11:56:55 GMT</pubDate>
    <dc:creator>rachid-berkane</dc:creator>
    <dc:date>2020-05-13T11:56:55Z</dc:date>
  </channel>
</rss>

