Created 06-23-2016 03:32 PM
Created 06-23-2016 03:40 PM
@milind pandit If I understand your question correctly, you simply want to write to HDFS. The HDFS cluster running on different nodes is typical. Use the PutHDFS processor (info here). In the Hadoop Configuration Resources property, simply point to core-site.xml and hdfs-site.xml; you can download these files from your target cluster. NiFi then knows where the cluster is.
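As a rough sketch, the PutHDFS configuration boils down to two properties. The file paths and the HDFS directory below are examples only; use wherever you actually copied the files on your NiFi node and your own target directory:

```
PutHDFS processor properties (example values):

Hadoop Configuration Resources : /etc/nifi/conf/core-site.xml,/etc/nifi/conf/hdfs-site.xml
Directory                      : /landing/nifi
```

Hadoop Configuration Resources takes a comma-separated list of file paths, so both XML files go in the one property.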
Created 06-23-2016 03:39 PM
Hi @milind pandit,
Yes, you just need to have the core-site.xml and hdfs-site.xml configuration files available on your NiFi nodes for your PutHDFS processor, and obviously the network connection between NiFi and HDP must be working.
Hope this helps.
Created 06-23-2016 03:42 PM
@milind pandit If you are running nodes in the cloud, make sure the required ports are accessible from both clusters; you will need to allow the two clusters to talk to each other.
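A quick way to confirm the ports are actually open is a plain TCP connection test from a NiFi node. The sketch below is a minimal Python check; the hostnames are placeholders for your cluster, and 8020 (NameNode RPC) and 50010 (DataNode) are the HDP 2.x defaults, so adjust them if your cluster overrides the standard ports:

```python
import socket

def can_connect(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical hostnames -- substitute your own NameNode / DataNode hosts.
for host, port in [("namenode.example.com", 8020),
                   ("datanode1.example.com", 50010)]:
    status = "open" if can_connect(host, port, timeout=1.0) else "blocked/unreachable"
    print(f"{host}:{port} -> {status}")
```

If the NameNode port checks out but DataNode ports are blocked, PutHDFS will connect but fail on writes, so it is worth checking both.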
Created 06-23-2016 03:42 PM
Thanks Sunile, the scenario is HDP in the cloud and HDF on-prem. Let me give it a try.
Created 06-23-2016 03:51 PM
@milind pandit That is straightforward. Simply download the two files from the cluster and place them on your NiFi cluster.