Best practice for updating NiFi's 'Hadoop Configuration Resources'


Hey everyone,

I have an HDF cluster storing data into an HDP cluster using NiFi. To use the PutHDFS processor, I had to store copies of core-site.xml and hdfs-site.xml on each of my HDF worker nodes. What is the suggested method for automating updates to those files? For example, if I changed the configuration in either of those files, would I need to manually copy them over to the worker nodes again, or is there a better, more automated solution?

1 ACCEPTED SOLUTION


You can use a ListSFTP -> FetchSFTP -> PutFile flow in NiFi to pull the files from wherever you keep the master copies of the configs. That way NiFi keeps itself up to date, and you can point 'Hadoop Configuration Resources' at the directory PutFile writes to.
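For illustration, here is a minimal sketch of how the three processors in that flow might be configured; the hostnames, usernames, and local paths below are hypothetical, and the property names are the standard ones for these processors:

```
ListSFTP                                  # lists the master config files; schedule e.g. every 5 min
  Hostname:    configmaster.example.com   # hypothetical host holding the master copies
  Username:    nifi
  Remote Path: /etc/hadoop/conf

FetchSFTP                                 # fetches each file listed above
  Hostname:    configmaster.example.com
  Username:    nifi
  Remote File: ${path}/${filename}        # the default; uses attributes set by ListSFTP

PutFile                                   # writes the files to local disk on the NiFi node
  Directory:                    /opt/nifi/hadoop-conf
  Conflict Resolution Strategy: replace   # overwrite stale copies on each run

PutHDFS (in your data flow)
  Hadoop Configuration Resources: /opt/nifi/hadoop-conf/core-site.xml,/opt/nifi/hadoop-conf/hdfs-site.xml
```

One caveat: in a clustered NiFi, each worker reads its own local copy of the configs, so you would want the fetched files to land on every node rather than only on the node that happens to process the flowfile.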




Clever, thank you for the suggestion!