I'm trying to decompress a .gz file from an HDFS location and put it in another location, but FetchHDFS is not fetching the file at all.
This is the ListHDFS warning I'm getting:
And my configuration:
As I understand it, the listing part works: "Successfully created listing with 1 new files from HDFS". But FetchHDFS is not doing anything. Did I miss something in the configuration?
I managed to get the file to the decompress processor, but got an error saying the file is not in gzip format, even though it is.
I tried on my local instance and everything works as expected.
If you have the .gz file on the local FS, then try fetching it with ListFile + FetchFile from your local FS (instead of HDFS) and check whether you are able to fetch the whole file without any issues.
Then move the local file to HDFS using the command below:
hadoop fs -put <local_File_path> <hdfs_path>
Then check whether the file size shows as 371 kB in HDFS.
If yes, then try running the ListHDFS + FetchHDFS processors to fetch the newly moved file from the HDFS directory.
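The steps above can be sketched as a quick shell check. The file names and HDFS path here are placeholders, and the `hadoop fs` calls are guarded so the sketch still runs on a machine without the hadoop CLI (on a real cluster, run them on an edge node):

```shell
#!/bin/sh
# Hypothetical sample file -- replace with your real 371 kB .gz file.
LOCAL_FILE=/tmp/sample.gz
HDFS_DIR=/data/incoming    # hypothetical HDFS target directory

# Create a small sample archive so this sketch is self-contained.
printf 'hello hdfs\n' | gzip > "$LOCAL_FILE"

# 1. Confirm the local copy is a valid gzip archive and note its size.
gzip -t "$LOCAL_FILE" && echo "local gzip OK"
ls -l "$LOCAL_FILE"

# 2. Push it to HDFS and verify the size there matches the local size.
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -put -f "$LOCAL_FILE" "$HDFS_DIR"/
    hadoop fs -ls "$HDFS_DIR"/
else
    echo "hadoop CLI not found; run the fs -put / fs -ls steps on the cluster edge node"
fi
```

If the `hadoop fs -ls` size matches the local `ls -l` size, the file landed intact and ListHDFS + FetchHDFS should have the full content to work with.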
Some threads related to similar issues:
@Shu I was able to decompress it in the shell; it's not corrupt.
I've also noticed that when I moved the file with NiFi (ListHDFS + MoveHDFS), the file was moved but at 0 kB, whereas when I copied it in the shell with my own user it was 371 kB. I was not able to fetch the 371 kB file because of the errors shown at https://snag.gy/jExmbl.jpg and https://snag.gy/eVEjG7.jpg.
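For what it's worth, a 0 kB copy would explain the "not in gzip format" error by itself: an empty file has no gzip magic bytes, so any decompressor rejects it regardless of how valid the original archive is. A quick local illustration (file names made up):

```shell
#!/bin/sh
# An empty file is not a valid gzip stream -- this mimics what a
# decompress step sees when it is handed a 0-byte file.
: > /tmp/empty.gz          # create a 0-byte "archive"
if gzip -t /tmp/empty.gz 2>/dev/null; then
    echo "empty.gz is valid gzip"
else
    echo "empty.gz is NOT valid gzip"
fi

# A real gzip file passes the same test.
printf 'data\n' | gzip > /tmp/real.gz
gzip -t /tmp/real.gz && echo "real.gz is valid gzip"
```

So the decompress error is most likely a symptom of the 0 kB transfer, not of the archive itself; the permissions errors in the screenshots would be the thing to chase first.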