Best practice for updating NiFi's 'Hadoop Configuration Resources'
Labels: Apache NiFi
Created ‎02-23-2018 05:00 PM
Hey everyone,
I have an HDF cluster storing data into an HDP cluster using NiFi. To use the PutHDFS processor I needed to store copies of core-site.xml and hdfs-site.xml on each of my HDF worker nodes. What is the suggested method for automating updates to those files? For example, if I were to change the configuration in either file, would I need to manually copy it over to the worker nodes again? Or is there a better, more automated solution?
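For context, PutHDFS locates those copied files through its Hadoop Configuration Resources property, which takes a comma-separated list of local paths on each NiFi node. A minimal sketch of that setting (the paths here are hypothetical examples, not from the original post):

```
# PutHDFS processor property (paths are hypothetical examples)
Hadoop Configuration Resources = /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
```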
Created ‎02-27-2018 04:18 AM
You can use a ListSFTP -> FetchSFTP -> PutFile flow in NiFi to grab the files from wherever you keep the master copy of the configs. That way NiFi keeps itself up to date, and you can point your Hadoop Configuration Resources at the location where PutFile writes.
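The flow above can be sketched as three processors. This is a minimal configuration sketch: the hostname, remote path, and local directory below are hypothetical placeholders, not values from the thread.

```
# ListSFTP — polls the node holding the master copy of the configs
Hostname:           master-node.example.com   # hypothetical
Remote Path:        /etc/hadoop/conf          # hypothetical
File Filter Regex:  (core|hdfs)-site\.xml

# FetchSFTP — pulls each file that ListSFTP reported
Hostname:           master-node.example.com   # hypothetical
Remote File:        ${path}/${filename}

# PutFile — writes the fetched configs locally on every NiFi node
Directory:                     /opt/nifi/hadoop-conf   # hypothetical
Conflict Resolution Strategy:  replace
```

With a flow like this, PutHDFS's Hadoop Configuration Resources would point at the PutFile directory, so changes on the master copy propagate automatically on the next poll.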
Created ‎02-27-2018 02:54 PM
Clever, thank you for the suggestion!
