Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

How can I write to a remote HDFS using NiFi? (Assume NiFi and HDP are running on separate nodes, or HDFS is in the cloud)

1 ACCEPTED SOLUTION

Master Guru

@milind pandit If I understand your question correctly, you simply want to write to HDFS, where the HDFS cluster is running on different nodes, which is typical. Use the PutHDFS processor. In the Hadoop Configuration Resources property, simply point to core-site.xml and hdfs-site.xml. You can download these files from your target cluster. NiFi then knows where the cluster is.


5 REPLIES


Hi @milind pandit,

Yes, you just need to have the core-site.xml and hdfs-site.xml configuration files available on your NiFi nodes for your PutHDFS processor, and obviously the network connection between NiFi and HDP must be working.
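Once the files are copied over, it can be worth sanity-checking that the core-site.xml on the NiFi node is readable and actually points at the right cluster. A minimal sketch (the file path is an example; use wherever you placed the file):

```python
import xml.etree.ElementTree as ET

def read_default_fs(core_site_path):
    """Return the fs.defaultFS value from a Hadoop core-site.xml, or None."""
    root = ET.parse(core_site_path).getroot()
    for prop in root.iter("property"):
        if prop.findtext("name") == "fs.defaultFS":
            return prop.findtext("value")
    return None

# Example (path is hypothetical):
# print(read_default_fs("/etc/hadoop/conf/core-site.xml"))
```

If this prints something like `hdfs://namenode.example.com:8020`, that is the address PutHDFS will use.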

Hope this helps.

Master Guru

@milind pandit If I understand your question correctly, you simply want to write to HDFS, where the HDFS cluster is running on different nodes, which is typical. Use the PutHDFS processor. In the Hadoop Configuration Resources property, simply point to core-site.xml and hdfs-site.xml. You can download these files from your target cluster. NiFi then knows where the cluster is.
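For reference, the relevant PutHDFS properties would look roughly like this (the file locations and target directory are examples; adjust them to where you placed the downloaded files and where you want data written):

```
Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Directory                      : /data/nifi
Conflict Resolution Strategy   : fail
```

Multiple configuration files are given as a comma-separated list in the single Hadoop Configuration Resources property.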

Master Guru

@milind pandit If you are running nodes in the cloud, make sure the required ports are accessible from both sides. You will need to allow both clusters to talk to each other.
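A quick way to verify connectivity from a NiFi node is to check that the NameNode RPC port (8020 by default on HDP) is reachable. A minimal sketch; the host name is hypothetical:

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (host is hypothetical; substitute your NameNode address):
# port_open("namenode.example.com", 8020)
```

If this returns False, fix security groups or firewall rules before troubleshooting the PutHDFS processor itself.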


Thanks Sunile, the scenario is HDP in the cloud and HDF on premises. Let me give it a try.

Master Guru

@milind pandit That is straightforward. Simply download the two files from the cluster and place them on your NiFi cluster.