<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Block size in PutHDFS in Nifi preventing HDFS files being written in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134976#M56043</link>
    <description>&lt;P&gt;Perfect.  Thank you.  Still learning how to rethink data ingestion when moving from Flume to NiFi.&lt;/P&gt;</description>
    <pubDate>Fri, 03 Mar 2017 04:11:41 GMT</pubDate>
    <dc:creator>elloyd</dc:creator>
    <dc:date>2017-03-03T04:11:41Z</dc:date>
    <item>
      <title>Block size in PutHDFS in Nifi preventing HDFS files being written</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134972#M56039</link>
      <description>&lt;P&gt;We have a NiFi flow which uses PutHDFS.&lt;/P&gt;&lt;P&gt;We want it to put files into HDFS of a specific size each time (similar to how Flume does it with hdfs.rollInterval).  I thought maybe it was the Block Size attribute, but that seems to be breaking my file writing completely.&lt;/P&gt;&lt;P&gt;When I set Block Size to any value (I have tried 10kb and the syntax 10 KB, as well as a very small size like 500b), it runs without errors but no files show up in HDFS.  If I remove the value from the Block Size attribute, it will put files in HDFS that are correct, except I want to specify their size.&lt;/P&gt;&lt;P&gt;Any insight is appreciated.&lt;/P&gt;&lt;P&gt;** As an update to this, I realized I was setting the Block Size lower than the minimum block size required, so it wasn't writing to HDFS.  That being said, when I changed it to the minimum of 1 GB, it seems to still write files to HDFS of a size less than 1 KB, so maybe I'm not understanding how Block Size works?  How does one specify the rollover size for files being written to HDFS? **&lt;/P&gt;</description>
      <pubDate>Fri, 03 Mar 2017 03:44:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134972#M56039</guid>
      <dc:creator>elloyd</dc:creator>
      <dc:date>2017-03-03T03:44:28Z</dc:date>
    </item>
    <item>
      <title>Re: Block size in PutHDFS in Nifi preventing HDFS files being written</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134973#M56040</link>
      <description>&lt;P&gt;I take that back.  I am receiving this error on the PutHDFS processor:&lt;/P&gt;&lt;P&gt;Specified block size is less than configured minimum value.&lt;/P&gt;&lt;P&gt;This seems to persist even if I change the Block Size to 200 KB, which is larger than the size of the files it writes if I put nothing into the Block Size attribute.&lt;/P&gt;</description>
      <pubDate>Fri, 03 Mar 2017 03:52:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134973#M56040</guid>
      <dc:creator>elloyd</dc:creator>
      <dc:date>2017-03-03T03:52:56Z</dc:date>
    </item>
    <item>
      <title>Re: Block size in PutHDFS in Nifi preventing HDFS files being written</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134974#M56041</link>
      <description>&lt;P&gt;It's clear I misunderstand what Block Size is, because when I set it to 1 GB it is not below the minimum and it generates HDFS files - but they are all less than 1 KB.  How the heck do you specify the file sizes for the HDFS writes?&lt;/P&gt;</description>
      <pubDate>Fri, 03 Mar 2017 03:56:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134974#M56041</guid>
      <dc:creator>elloyd</dc:creator>
      <dc:date>2017-03-03T03:56:45Z</dc:date>
    </item>
    <item>
      <title>Re: Block size in PutHDFS in Nifi preventing HDFS files being written</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134975#M56042</link>
      <description>&lt;P&gt;You should use a MergeContent processor before PutHDFS to merge flow files together based on a minimum size, then let PutHDFS write the merged files.&lt;/P&gt;</description>
      <pubDate>Fri, 03 Mar 2017 04:06:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134975#M56042</guid>
      <dc:creator>bbende</dc:creator>
      <dc:date>2017-03-03T04:06:00Z</dc:date>
    </item>
    <item>
      <title>Re: Block size in PutHDFS in Nifi preventing HDFS files being written</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134976#M56043</link>
      <description>&lt;P&gt;Perfect.  Thank you.  Still learning how to rethink data ingestion when moving from Flume to NiFi.&lt;/P&gt;</description>
      <pubDate>Fri, 03 Mar 2017 04:11:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Block-size-in-PutHDFS-in-Nifi-preventing-HDFS-files-being/m-p/134976#M56043</guid>
      <dc:creator>elloyd</dc:creator>
      <dc:date>2017-03-03T04:11:41Z</dc:date>
    </item>
  </channel>
</rss>