Failed to write to parent HDFS directory.
Labels: Apache Ambari, Apache Hadoop, Apache NiFi
Created 11-05-2016 11:13 PM
Attachments: error-puthdfs1.png, ambari-main.jpg, error.jpg, error-puthdfs.png
I am not able to write my files to the parent HDFS directory /tmp/tweets_staging. Please have a look; I am attaching snapshots of the error I am facing.
Note: I have followed every bit of this page.
http://hortonworks.com/hadoop-tutorial/how-to-refine-and-visualize-sentiment-data/
Created 11-07-2016 01:36 PM
The HDFS client does not currently support the LzoCodec, and the core-site.xml file you are using includes it.
It should work after you remove “com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec” from the “io.compression.codecs” property in the core-site.xml file referenced in your PutHDFS processor.
Thanks,
Matt
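For reference, the edit described above looks like this inside core-site.xml (the non-LZO codec classes shown are typical Hadoop defaults used for illustration, not values taken from this thread):

```xml
<!-- Before: io.compression.codecs includes the unsupported LZO codecs -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

<!-- After: the two com.hadoop.compression.lzo entries removed -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```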
Created 11-08-2016 02:57 AM
Right on! Thanks @mclark, that did the trick.
Created 03-03-2017 02:56 PM
I have the same issue. How can I get to this file? I can't find the way to this path or this file.
Best Regards,
Martin
Created 03-03-2017 03:00 PM
The core-site.xml file is copied from your HDFS node to your NiFi node. You modify the local copy on NiFi as described above and point to this file using the "Hadoop Configuration Resources" property in the NiFi HDFS processor.
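As a concrete sketch of that edit from a terminal: the sample file below is a stand-in for a real core-site.xml (which has many more properties), and every host name and path here is an assumption, not something given in this thread:

```shell
# Fetch the file from the HDFS node first, e.g. (host and path are assumptions):
#   scp root@hdfs-node:/etc/hadoop/conf/core-site.xml .
# For this sketch we create a minimal stand-in so the edit can be shown end to end:
cat > core-site.xml <<'EOF'
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
EOF

# Strip both LZO codec class names (and the comma in front of each, if any):
sed -i -e 's/,\{0,1\}com\.hadoop\.compression\.lzo\.LzopCodec//g' \
       -e 's/,\{0,1\}com\.hadoop\.compression\.lzo\.LzoCodec//g' \
       core-site.xml

# The io.compression.codecs value should now list only the remaining codecs:
grep '<value>' core-site.xml
```

After editing, set the absolute path of this local copy in the processor's "Hadoop Configuration Resources" property.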
Created 03-03-2017 03:05 PM
Thanks for your fast answer. Yes, I understand the steps to modify the file and where to put the path in the processor. But where can I find this XML file? What is the NiFi node? I am working in the sandbox. Can I reach this file from the Ambari manager?
Created 03-03-2017 03:09 PM
I don't know where HDFS files are placed in the sandbox, but I know you cannot copy the file from HDFS to NiFi from within Ambari. You will need to do this via a command/terminal window. You could use the "locate" command if you are running on Linux and have the "mlocate" package installed:
# yum -y install mlocate
# updatedb
# locate core-site.xml
Created 03-03-2017 04:37 PM
I found the file now, downloaded it, and changed the entry. How do I get this file back into the right place in my sandbox?
Created 03-03-2017 03:59 PM
No, I am running the sandbox on Windows 7. OK, I will keep searching for the XML file. Thank you.
