Member since: 01-05-2017
Posts: 153
Kudos Received: 10
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3433 | 02-20-2018 07:40 PM
 | 2362 | 05-04-2017 06:46 PM
03-02-2017
07:56 PM
It's clear I misunderstand what Block Size is, because when I set it to 1 GB it is no longer below the minimum and it generates HDFS files, but they are all less than 1 KB. How the heck do you specify the file sizes for the HDFS writes?
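For context, HDFS block size is the maximum size of each block a file is split into, not a target file size: a 1 KB file written with a 1 GB block size still occupies only 1 KB, in a single under-full block. A rough sketch of that relationship (plain Python, illustrative numbers only, not NiFi or HDFS code):

```python
import math

def blocks_used(file_size_bytes: int, block_size_bytes: int) -> int:
    """Number of HDFS blocks a file of this size would occupy."""
    return max(1, math.ceil(file_size_bytes / block_size_bytes))

ONE_GB = 1024 ** 3

# A 1 KB flowfile written with a 1 GB block size: one block,
# and that block takes up 1 KB on disk, not 1 GB.
print(blocks_used(1024, ONE_GB))        # 1 block
print(blocks_used(3 * ONE_GB, ONE_GB))  # a 3 GB file spans 3 blocks
```

So raising Block Size never makes the written files bigger; it only changes how large each file could grow before being split into another block.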
03-02-2017
07:52 PM
I take that back. I am receiving this error on the PutHDFS processor: "Specified block size is less than configured minimum value". This persists even if I change the Block Size to 200 KB, which is larger than the size of the files it writes when I put nothing into the Block Size attribute.
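That error matches the namenode-side minimum-block-size check: in stock Hadoop, `dfs.namenode.fs-limits.min-block-size` defaults to 1 MB, so any requested block size below that (500 B, 10 KB, 200 KB) is rejected regardless of how small the files themselves are. A minimal sketch of that validation, where `parse_size` and the loop are my own illustrative code, not NiFi or HDFS internals:

```python
# dfs.namenode.fs-limits.min-block-size defaults to 1 MB (1048576 bytes)
# in stock Hadoop; everything below is an illustrative sketch.
MIN_BLOCK_SIZE = 1_048_576  # 1 MB

UNITS = {"B": 1, "KB": 1024, "MB": 1024 ** 2, "GB": 1024 ** 3}

def parse_size(text: str) -> int:
    """Parse a size string like '10 KB' or '1 GB' into bytes."""
    number, unit = text.strip().split()
    return int(float(number) * UNITS[unit.upper()])

for requested in ["500 B", "10 KB", "200 KB", "1 GB"]:
    verdict = "accepted" if parse_size(requested) >= MIN_BLOCK_SIZE else "below configured minimum"
    print(f"{requested}: {verdict}")
```

Under those defaults, only the 1 GB value clears the minimum, which is consistent with 1 GB being the first setting that stopped the error.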
03-02-2017
07:44 PM
We have a NiFi flow which uses PutHDFS. We want it to put files into HDFS of a specific size each time (similar to how Flume does it with hdfs.rollInterval). I thought maybe it was the Block Size attribute, but that seems to be breaking my file writing completely. When I set Block Size to any value (I have tried 10kb and the syntax "10 KB", as well as a very small size like 500b), it runs without errors but no files show up in HDFS. If I remove the value from the Block Size attribute, it will put files in HDFS that are correct, except I want to specify their size. Any insight is appreciated.

** As an update to this, I realized I was setting the Block Size lower than the minimum block size required, so it wasn't writing to HDFS. That being said, when I changed it to the minimum of 1 GB, it still writes files to HDFS of a size less than 1 KB, so maybe I'm not understanding how Block Size works? How does one specify the roll-over size for files being written to HDFS? **
Labels:
- Apache Hadoop
- Apache NiFi
03-02-2017
04:34 PM
We have also tried creating another Files View instance, and the same error occurs.
03-02-2017
04:33 PM
We are having an issue previewing and downloading files from Files View. I noticed one other user on here had a similar problem, and their solutions did not work for us. When attempting to preview a file in Files View, it is blank and we get an error in the logs:

02 Mar 2017 16:21:57,867 ERROR [ambari-client-thread-8502] [FILES 1.0.0 AUTO_FILES_INSTANCE] FilePreviewService:96 - Error occurred while previewing /topics/xyz/2017/02/15/xyz.1487116808176.log :

And when we attempt to download a file, it returns:

{
  "status": 500,
  "message": "Server Error"
}

If anyone has any insight or has had this problem, please assist. Thank you.
Labels:
- Apache Ambari
01-30-2017
02:35 PM
Thanks. We did check out NiFi and like it. Unfortunately, we only have the budget for one cluster, which had HDP on it. So now we have to decide if we want to wipe HDP and get HDF, or stay with HDP and use a Flume ingestion scheme. I'm kind of disappointed that NiFi cannot be included as part of HDP. I'm sure there are reasons. For people who can only have one cluster, they are forced to decide between the power of the analytics in HDP or the power of the data stream control of HDF, but not both.
01-27-2017
01:55 PM
Thanks, I gathered that. So it requires two clusters to have both HDP and HDF. Unfortunate. I'm still struggling to understand why Kafka and Storm are on both and not NiFi...
01-26-2017
04:15 PM
Yeah, we installed the default HDP mpack... I guess now we have to wipe all of the HDP stuff and reinstall with HDF...