Support Questions


Generic Execution Flows

Explorer

Hi,

 

I'm trying to figure out if the following scenarios are possible:
1. I have hundreds of tables that need to use the same flow, but with different intervals, different source hostnames, and different destinations. How do I build such a flow? I also can't figure out how to use dynamic hosts/schemas/table names...

We maintain a table with all this info, but how do I execute it with NiFi?
2. If I need to load a file onto multiple clusters (a different set of clusters for each table) in parallel - how can this be achieved?

 

Thanks!

1 ACCEPTED SOLUTION

Explorer

The solution I found is to use an external scheduler (such as Airflow) together with a ListenHTTP processor. You can then send any data you wish to that listener, parse it, and use it as parameters/attributes in the rest of the flow.
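As an illustration of that approach, here is a minimal sketch of the scheduler side: one Airflow DAG per row of a hypothetical control table (the interval/host/schema/table/destination list the question mentions), each POSTing its row to a NiFi ListenHTTP endpoint. The hostnames, port, base path, and column names are assumptions for illustration, not the original poster's setup.

```python
# Sketch only: generates one Airflow DAG per control-table row and POSTs that
# row's parameters to a NiFi ListenHTTP endpoint.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical control table; in practice this would be read from the table
# the question describes (interval, source host, schema, table, destination).
CONTROL_TABLE = [
    {"source_host": "db-a.example.com", "schema": "sales", "table": "orders",
     "destination": "cluster1", "interval": "@hourly"},
    {"source_host": "db-b.example.com", "schema": "hr", "table": "employees",
     "destination": "cluster2", "interval": "@daily"},
]

# Assumed ListenHTTP settings: Listening Port 8081, Base Path "ingest".
NIFI_LISTEN_URL = "http://nifi-host:8081/ingest"


def push_params(row):
    """POST one table's parameters; the JSON body becomes the FlowFile content."""
    resp = requests.post(NIFI_LISTEN_URL, json=row, timeout=30)
    resp.raise_for_status()


# One dynamically generated DAG per table, each with its own schedule
# (Airflow 2.4+ uses `schedule`; older releases use `schedule_interval`).
for row in CONTROL_TABLE:
    dag_id = f"nifi_{row['schema']}_{row['table']}"
    with DAG(dag_id=dag_id, start_date=datetime(2023, 1, 1),
             schedule=row["interval"], catchup=False) as dag:
        PythonOperator(
            task_id="push_params_to_nifi",
            python_callable=push_params,
            op_kwargs={"row": row},
        )
    globals()[dag_id] = dag  # make each generated DAG visible to the scheduler
```

On the NiFi side, ListenHTTP turns each POST into a FlowFile whose content is the JSON body; an EvaluateJsonPath processor (Destination set to flowfile-attribute) can lift fields such as source_host, schema, and table into attributes, and downstream processors can then reference them with Expression Language (for example ${schema}.${table}), so a single flow can serve every table and destination.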


2 REPLIES

Explorer

So I understand it is not possible to send a parameter to a flow via the REST API?
