Created 01-11-2017 02:42 PM
I am trying to build an HDF flow that takes a feed from a ListenSMTP processor and puts files into a folder in HDFS with PutHDFS. For some reason, PutHDFS is not writing to HDFS and is instead putting the files in a local folder on the HDF host. I've confirmed the hdfs-site.xml file is referenced correctly. I've also tried the FetchHDFS and ListHDFS processors, and they behave the same way. Does anyone have ideas on this? Why would PutHDFS write locally instead of to HDFS?
Created 01-11-2017 05:03 PM
You need to provide both core-site.xml and hdfs-site.xml. The core-site.xml should define a default filesystem, like this:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hostname</value>
</property>
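For context, without a default filesystem set in core-site.xml, the Hadoop client falls back to the local filesystem (file:///), which is why the files end up in a local folder. A minimal core-site.xml would look something like the sketch below; the host namenode.example.com and port 8020 are placeholders, so substitute your cluster's actual NameNode address.
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- placeholder NameNode host and RPC port; use the values from your own cluster -->
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>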
Created 01-11-2017 09:04 PM
The NiFi processors for HDFS only seem to have a spot for one XML file, and it wants the hdfs-site.xml. Where do I reference the core-site.xml?
Created 01-11-2017 09:24 PM
It takes a comma-separated list of files, so you can specify:
/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Obviously, use the appropriate paths on your file system.
Created 01-12-2017 04:40 PM
Thank you! That was the solution! I fought with that for two weeks with no error message telling me anything was missing.