
NiFi not using HDFS

Contributor

I am trying to build an HDF flow that takes a feed from the ListenSMTP processor and puts files (PutHDFS) into a folder in HDFS. For some reason, PutHDFS is not writing to HDFS; instead it is putting the files in a local folder on the HDF node. I've confirmed that the hdfs-site.xml file is referenced correctly. I've also tried the FetchHDFS and ListHDFS processors, and they behave the same way. Does anyone have ideas? Why would PutHDFS write locally instead of to HDFS?

1 ACCEPTED SOLUTION

Master Guru

You need to provide both core-site.xml and hdfs-site.xml; core-site.xml should define the default filesystem like this:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hostname</value>
</property>

Without fs.defaultFS, Hadoop falls back to its default filesystem, file:///, which is why your files land on the local disk.


4 REPLIES


Contributor

The NiFi processors for HDFS only have a spot for one XML file, and it wants the hdfs-site.xml. Where do I reference the core-site.xml?

Master Guru

It actually takes a comma-separated list of files, so you can specify:

/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml

Obviously, use the appropriate paths for your file system.
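For reference, here is a minimal core-site.xml sketch that PutHDFS can pick up alongside hdfs-site.xml via the Hadoop Configuration Resources property. The hostname and port 8020 (a common NameNode default) are placeholders; substitute your own cluster's values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- Default filesystem for Hadoop clients. If this is missing, the
       built-in default is file:///, so writes go to the local disk. -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```

With this file in place, set Hadoop Configuration Resources on the processor to the comma-separated list of both files, as shown above.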

Contributor

Thank you! That was the solution! I fought with that for two weeks with no error message telling me anything was missing.