I have deployed Cloudbreak 2.5.0 on Azure and created two clusters: HDP 2.6 and HDF 3.1. All the services are running fine. Now I want to build a data flow in NiFi (on the HDF cluster) that writes into HDFS on my HDP cluster. How can I do that? Any step-by-step procedure or blog posts would be a great help.
I just want to build a simple flow using the GetFile and PutHDFS processors. What properties should I configure on those processors?
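For context on what an answer might look like, a minimal configuration sketch for the two processors could be something like the following. The directory paths and the conf-file location are placeholders I made up, not values from any actual cluster; the property names themselves are the standard GetFile/PutHDFS properties:

```
GetFile
  Input Directory:                /tmp/nifi-input        (local dir on the NiFi node)
  Keep Source File:               false

PutHDFS
  Hadoop Configuration Resources: /tmp/hdp-conf/core-site.xml,/tmp/hdp-conf/hdfs-site.xml
  Directory:                      /tmp/nifi-output       (target dir in HDP's HDFS)
  Conflict Resolution Strategy:   replace
```

The key point is that PutHDFS has no "cluster address" property of its own: the target cluster is inferred from the Hadoop config files listed in Hadoop Configuration Resources.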
Where should I define the cluster address in the PutHDFS processor? NiFi runs in the HDF cluster, and I want to write data into the HDP cluster. How will NiFi understand that data is picked up on the HDF cluster and dropped off in the other HDP cluster? How should I configure the processors, and do I need to copy the hdfs-site.xml and core-site.xml files from the HDP cluster to the HDF cluster? If so, where should I copy them?
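A rough sketch of how the copy step might look, assuming a made-up HDP NameNode hostname (`hdp-master.example.com`), the usual Ambari-managed client-config path (`/etc/hadoop/conf`), and a hypothetical `/tmp/hdp-conf` directory on the NiFi node (any directory readable by the NiFi user would do). The `fs.defaultFS` value inside core-site.xml is what actually points PutHDFS at the HDP NameNode:

```shell
# On the NiFi (HDF) node: make a directory for the HDP client configs.
mkdir -p /tmp/hdp-conf

# Pull core-site.xml and hdfs-site.xml from the HDP cluster
# (hostname and path are assumptions; adjust to your environment):
#   scp hdp-master.example.com:/etc/hadoop/conf/core-site.xml /tmp/hdp-conf/
#   scp hdp-master.example.com:/etc/hadoop/conf/hdfs-site.xml /tmp/hdp-conf/

# For illustration, a minimal core-site.xml showing the one property
# that identifies the target cluster:
cat > /tmp/hdp-conf/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hdp-master.example.com:8020</value>
  </property>
</configuration>
EOF

# PutHDFS reads this file via its Hadoop Configuration Resources property,
# so this value is how NiFi "knows" which cluster to write to.
grep fs.defaultFS /tmp/hdp-conf/core-site.xml
```

After the copy, you would set PutHDFS's Hadoop Configuration Resources property to the comma-separated paths of those two files; no separate cluster address is entered anywhere else.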