It's my understanding that HDF 2 can be managed with Ambari, but it must be a different Ambari instance from the one that manages the main HDP cluster. Is there a way to have the client files (e.g. *-site.xml) from the main cluster automatically pushed to the HDF cluster, to support HDP-specific processors such as PutHiveStreaming? Without HDF Ambari I could just install the HDP clients on the NiFi nodes, but that doesn't appear to work here, as the HDF ambari-agents would need to point to both Ambari instances.
Hi @Sebastian Carroll. Right now I don't believe there is an automatic way to push site files from the HDP cluster to the HDF cluster (nor to pull them from the HDF side). I believe you would need to manually download the client files from the HDP cluster and upload them to the nodes in the HDF cluster.
A couple of things here.
1. It's the same Ambari. During installation, Ambari gives you a choice of what you want to install, and those options include supported versions of HDP as well as HDF (for now, only version 2.0).
2. HDF has its own cluster and its own Ambari managing it, almost always independent of HDP (on different machines). In this case there is no automatic way of pushing client files from the HDP cluster to the HDF cluster, but it should be pretty easy: you just have to copy the files under /etc/hadoop/conf to your HDF cluster.
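A minimal sketch of that copy step. To keep it runnable anywhere, temp directories stand in for /etc/hadoop/conf on the HDP side and for a target directory on the HDF node; on real clusters you would scp the tarball to each NiFi node instead (the hostname and target path in the comment are assumptions, not anything fixed by HDP/HDF):

```shell
#!/bin/sh
set -e

# Stand-ins so the sketch runs locally; on a real HDP edge node the
# source would be /etc/hadoop/conf.
SRC_CONF=$(mktemp -d)            # stand-in for /etc/hadoop/conf
HDF_CONF=$(mktemp -d)/hdp-conf   # stand-in for a dir on the HDF/NiFi node

printf '<configuration/>\n' > "$SRC_CONF/core-site.xml"
printf '<configuration/>\n' > "$SRC_CONF/hdfs-site.xml"
printf '<configuration/>\n' > "$SRC_CONF/hive-site.xml"

# Bundle the client configs. On a real cluster you would then ship it, e.g.
#   scp /tmp/hdp-conf.tar.gz nifi-node:/etc/hdp-clients/   (hypothetical host/path)
tar -C "$SRC_CONF" -czf /tmp/hdp-conf.tar.gz .

# Unpack on the "HDF node" side.
mkdir -p "$HDF_CONF"
tar -C "$HDF_CONF" -xzf /tmp/hdp-conf.tar.gz

ls "$HDF_CONF"
```

The processors then need to be pointed at the unpacked files, e.g. PutHiveStreaming's Hive Configuration Resources property would list the copied hive-site.xml (and core-site.xml/hdfs-site.xml where required).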
3. Simply copying the files should work as long as NiFi knows where to pick them up from. One catch to watch for (not a hundred percent sure for NiFi, but I have seen it with other applications) is that the path you provide to the client files should not be a symbolic link.
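One way to sidestep that catch is to resolve the symlink to its canonical path first and use the resolved path in the NiFi configuration. A small runnable illustration (the directories here are temp stand-ins; on HDP the symlink in question is typically /etc/hadoop/conf itself):

```shell
#!/bin/sh
set -e

# Simulate a config dir that is reached through a symlink.
REAL_DIR=$(mktemp -d)
LINK_DIR=$(mktemp -u)      # unused temp name to use for the symlink
ln -s "$REAL_DIR" "$LINK_DIR"

# readlink -f follows every symlink and prints the canonical path;
# give NiFi this resolved path rather than the symlink.
RESOLVED=$(readlink -f "$LINK_DIR")
echo "$RESOLVED"
```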
If it still doesn't work, please share your error logs.