Generic Execution Flows
Labels: Apache NiFi
Hi,
I'm trying to figure out whether the following scenarios are possible:
1. I have hundreds of tables that need to use the same flow, but with different intervals, different source hostnames, and different destinations. How do I build such a flow? I also can't figure out how to use dynamic hosts/schemas/table names. We maintain a table with all of this information, but how do I drive the flow from it with NiFi?
2. If I need to load a file onto multiple clusters (a different set of clusters per table) in parallel, how can this be achieved?
Thanks!
Created 11-12-2019 02:36 AM
So, as I understand it, it is not possible to send a parameter to a flow via the REST API?
Created 11-17-2019 02:19 AM
The solution I found is to use an external scheduler (such as Airflow) together with a ListenHTTP processor. You can then send any data you wish to that listener, parse it, and use it as parameters/attributes in the rest of the flow.
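Not from the original thread, but here is a minimal sketch of that approach, assuming a ListenHTTP processor reachable at http://nifi-host:8081/contentListener (the hostname, port, base path, and config field names are all assumptions). The script plays the role of the external scheduler: it reads per-table parameters and POSTs one JSON message per table to the listener.

```python
# Hypothetical sketch: push per-table parameters from an external scheduler
# (e.g. an Airflow task or a plain cron job) into a NiFi flow via ListenHTTP.
# The endpoint URL and config fields below are assumptions, not from the thread.
import json
import urllib.request

# In practice this would come from the config table mentioned in the question
# (source hostname, schema, table name, destination cluster, interval, ...).
TABLE_CONFIGS = [
    {"source_host": "db-host-1", "schema": "sales", "table": "orders",
     "destination": "cluster-a"},
    {"source_host": "db-host-2", "schema": "hr", "table": "employees",
     "destination": "cluster-b"},
]

NIFI_LISTENER = "http://nifi-host:8081/contentListener"  # assumed ListenHTTP endpoint

def trigger_flow(config: dict) -> int:
    """POST one table's parameters as JSON; NiFi receives the body as flowfile content."""
    req = urllib.request.Request(
        NIFI_LISTENER,
        data=json.dumps(config).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    for cfg in TABLE_CONFIGS:
        status = trigger_flow(cfg)
        print(f"{cfg['schema']}.{cfg['table']} -> HTTP {status}")
```

Inside NiFi, a processor such as EvaluateJsonPath placed after ListenHTTP can turn each JSON field into a flowfile attribute, so a single generic flow can serve all tables with per-table hosts, schemas, and destinations.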
