10-19-2021 02:04 PM
@AA24 The easiest way to accomplish this is to use the PutDistributedMapCache processor in one flow to write the attribute values you want to share to a cache server, and in your other flow use the FetchDistributedMapCache processor to retrieve those cached attributes and add them to the FlowFiles that need them.

Another option is to use the MergeContent processor. In flow one, where it looks like you are extracting your session_id and job_id, you would use the ModifyBytes processor to zero out the content, leaving you with a FlowFile that has only attributes, and then use MergeContent to combine this FlowFile with the FlowFile in your second flow. In the MergeContent processor you would configure "Attribute Strategy" to use "Keep All Unique Attributes".

If you found this response assisted with your query, please take a moment to login and click on "Accept as Solution" below this post.

Thank you,
Matt
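To make the put/fetch pattern concrete, here is a toy sketch in Python. This is not NiFi code: the plain dict stands in for the DistributedMapCacheServer, and the function and attribute names are hypothetical, chosen only to mirror the two processors described above.

```python
# Toy model of the PutDistributedMapCache / FetchDistributedMapCache pattern.
# In NiFi the cache is a DistributedMapCacheServer; a dict stands in here.

cache = {}

def put_distributed_map_cache(key, value):
    """Flow 1: write an attribute value to the shared cache."""
    cache[key] = value

def fetch_distributed_map_cache(key, flowfile_attributes):
    """Flow 2: look up the cached value and add it as an attribute
    on a FlowFile from the other flow."""
    if key in cache:
        flowfile_attributes[key] = cache[key]
    return flowfile_attributes

# Flow 1 extracts session_id and caches it.
put_distributed_map_cache("session_id", "abc-123")

# Later, flow 2's FlowFile picks the cached attribute up.
attrs = fetch_distributed_map_cache("session_id", {"filename": "data.csv"})
print(attrs)  # {'filename': 'data.csv', 'session_id': 'abc-123'}
```

The key point the sketch illustrates is that the two flows never exchange FlowFiles directly; they only share the cache, keyed by an agreed-upon attribute name.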
10-19-2021 01:18 PM
@AA24 NiFi was designed as an always-on dataflow tool. As such, NiFi processor components support the "Timer Driven" and "Cron Driven" Scheduling Strategy types. That being said, the ability to tell a processor to "Run Once" does exist within NiFi. You can do this manually from within the UI by right-clicking on the NiFi processor component and selecting "Run Once" from the pop-up context menu.

The next thing to keep in mind is that anything you can do via the UI, you can also do via a curl command against the REST API. So it is possible to build a dataflow that triggers the "run once" API call against the processor you want to fetch from the appropriate DB. You cannot execute "run once" against a process group (PG), nor would I recommend doing so. You want to trigger only the processor responsible for ingesting your data and leave all the other processors running all the time, so they process whatever data they have queued at any time.

First you need to create your trigger flow: you could use a GetFile processor to consume the trigger file, then perhaps a RouteOnContent processor to send the FlowFile to either an InvokeHTTP processor configured to invoke run-once on your Oracle-configured processor, or an InvokeHTTP processor configured to invoke run-once on your MySQL-configured processor. Using your browser's developer tools is an easy way to capture the REST API calls that are made when you manually perform the action via the UI.

If you found this response assisted with your query, please take a moment to login and click on "Accept as Solution" below this post.

Thank you,
Matt
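As a rough sketch of what InvokeHTTP would send, the snippet below builds the run-status request the UI issues when you click "Run Once". The host, processor id, and revision version are placeholders (capture the real values with your browser's developer tools as suggested above), and no network call is made here; this is an assumption-laden illustration, not a tested NiFi client.

```python
import json
from urllib import request

def build_run_once_request(nifi_api_url, processor_id, revision_version=0):
    """Build (but do not send) the PUT /processors/{id}/run-status call
    that triggers 'run once'. The revision version must match the
    processor's current revision on a live NiFi instance."""
    payload = {
        "revision": {"version": revision_version},
        "state": "RUN_ONCE",
    }
    return request.Request(
        url=f"{nifi_api_url}/processors/{processor_id}/run-status",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

# Hypothetical ids; send with urllib.request.urlopen(req) against a live NiFi.
req = build_run_once_request("http://localhost:8080/nifi-api", "1234-abcd")
print(req.get_method(), req.full_url)
```

In a real flow you would configure InvokeHTTP with the equivalent URL, method, and JSON body rather than running a script, but the shape of the request is the same.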