Archives of Support Questions (Read Only)

This is an archived board kept for historical reference. Information and links may no longer be available or relevant.

Best practice for updating NiFi's 'Hadoop Configuration Resources'

New Member

Hey everyone,

I have an HDF cluster storing data into an HDP cluster using NiFi. To use the PutHDFS processor I needed to store copies of core-site.xml and hdfs-site.xml on each of my HDF worker nodes. What is the suggested method for automating the updating of those files? For example, if I were to change the configuration in either of those files, would I need to manually copy them over to the worker nodes again? Or is there a better, more automated solution?
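For context, the manual approach described above might look like the sketch below. The worker hostnames, source path, and target directory are assumptions for illustration only, and the scp commands are echoed as a dry run rather than executed:

```shell
#!/bin/sh
# Hypothetical sketch: push the HDP client configs out to each HDF
# worker node. Hostnames and paths are assumptions, not taken from
# this thread; the scp commands are echoed as a dry run.
CONFS="core-site.xml hdfs-site.xml"
WORKERS="hdf-worker1 hdf-worker2 hdf-worker3"

for host in $WORKERS; do
  for f in $CONFS; do
    # drop the leading 'echo' to actually copy, once paths are confirmed;
    # -p preserves timestamps so a stale copy is easy to spot
    echo scp -p "/etc/hadoop/conf/$f" "$host:/opt/nifi/hadoop-conf/$f"
  done
done
```

This has to be re-run (or wired into cron or your config-management tool) every time the configs change, which is exactly the chore the question is asking how to avoid.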

1 ACCEPTED SOLUTION


You can use a ListSFTP -> FetchSFTP -> PutFile flow in NiFi to grab the files from wherever you store the master copies of the configs. This lets NiFi keep itself up to date, and you can point the Hadoop Configuration Resources property at the location where PutFile writes.
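Concretely, assuming PutFile writes the fetched configs to a local directory such as /opt/nifi/hadoop-conf (a hypothetical path, not from this thread), the PutHDFS processor's Hadoop Configuration Resources property would then be a comma-separated list pointing at those copies:

```
/opt/nifi/hadoop-conf/core-site.xml,/opt/nifi/hadoop-conf/hdfs-site.xml
```

Whenever the master configs change on the SFTP host, ListSFTP picks up the new versions and PutFile overwrites the local copies, so no manual copying is needed.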


2 REPLIES


New Member

Clever, thank you for the suggestion!