Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

NiFi not using HDFS

New Member

I am trying to build an HDF (NiFi) flow that takes a feed from a ListenSMTP processor and puts the files (PutHDFS) into a folder in HDFS. For some reason, PutHDFS is not using HDFS and is instead writing to a local folder on the HDF host. I've confirmed that the hdfs-site.xml file is referenced correctly. I've also tried the FetchHDFS and ListHDFS processors, and they behave the same way. Anyone have ideas on this? Why would PutHDFS write locally instead of to HDFS?


4 REPLIES

Master Guru

You need to provide both core-site.xml and hdfs-site.xml; the core-site.xml should define a default filesystem, like this:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hostname</value>
</property>
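
For context: if core-site.xml (or its fs.defaultFS entry) is missing, the Hadoop client configuration falls back to the local filesystem (file:///), which is exactly why the processor ends up writing to a local folder. A minimal core-site.xml sketch, assuming a hypothetical NameNode hostname and the commonly used RPC port 8020:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- hostname and port are placeholders; use your own NameNode address -->
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>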

New Member

The NiFi HDFS processors only seem to have a spot for one XML file, and it wants the hdfs-site.xml. Where do I reference the core-site.xml?

Master Guru

It takes a comma-separated list of files, so you can specify:

/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml

Obviously, use the appropriate paths for your file system.
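
As a quick sanity check (just a sketch, assuming the same hypothetical /etc/hadoop/conf paths as above), you can ask the Hadoop client which filesystem those files resolve to; if it prints file:/// rather than an hdfs:// URI, the default filesystem is still missing:

# Point the client at the same config directory the processor uses
export HADOOP_CONF_DIR=/etc/hadoop/conf
hdfs getconf -confKey fs.defaultFS   # should print hdfs://<namenode>:<port>, not file:///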

New Member

Thank you! That was the solution! I fought with this for two weeks, with no error message telling me anything was missing.