<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question ListFile on a huge directory in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/ListFile-on-a-huge-directory/m-p/296740#M218391</link>
    <description>&lt;P&gt;Hi, I'm trying to do a recursive file listing on a massive directory. We're talking close to 100 GB. It seems like ListFile is walking through every single file before it will spit out the first flow file. I assumed it would emit flow files one at a time as it finds them.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It seems as though there's simply too much data. It just kind of sits there and spins. I let it run over the weekend and it never got through everything or spit anything out. I can't tell if it's still working, if it's overloaded, or what is going on.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How would I go about walking through a directory like this, returning one file at a time without duplicates?&lt;/P&gt;</description>
    <pubDate>Wed, 27 May 2020 19:25:44 GMT</pubDate>
    <dc:creator>ischuldt24</dc:creator>
    <dc:date>2020-05-27T19:25:44Z</dc:date>
  </channel>
</rss>