Created 12-29-2025 12:46 AM
Hello team & community 🙂
I am looking into deploying many instances of a flow that is versioned in a git registry via the REST API. My current issue is that I would like to configure each instance of this flow dynamically; however, if I version the flow with a Parameter Context attached to it, every instance I deploy ends up attached to the same single Parameter Context.
I understand this is the expected behaviour as described in the documentation. Therefore, I'd like to know if there is any recommended practice for situations such as mine.
The best solution I could think of was to configure the processors in the flow to use parameters but not to actually attach a PC to the process group. This way I could deploy flows in two steps: first importing them, then creating a new parameter context and configuring/attaching it.
I would greatly appreciate any advice for this use-case,
Thank you.
Created 01-06-2026 09:19 AM
@Green_
When you deploy a Dataflow (that has a parameter context assigned to it) from NiFi A via NiFi-Registry to another NiFi B, the parameter context will be added to NiFi B if a parameter context with that exact name does NOT already exist on NiFi B. If a Parameter Context with the same name already exists, that local parameter context will be used instead.
Additionally, if the same-named parameter context in the original flow from NiFi A contains a parameter name that is not present in the pre-existing context on NiFi B, that additional name/value pair will be added to the existing same-named parameter context on NiFi B.
So NiFi / NiFi-Registry was designed with the intent of handling different parameter values per NiFi deployment.
Now, the first time you deploy a flow from NiFi A to NiFi B, you end up with the parameter context from NiFi A being added to NiFi B. You'll need to update values as needed on NiFi B before starting the dataflow(s) in that process group. New versions after that will not be an issue (unless additional new parameter name/value pairs are added; those would need to be updated after the version change, or you could add the new parameters manually on NiFi B before updating the version).
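If you want to script that "update values on NiFi B" step rather than do it in the UI, parameter context changes go through NiFi's asynchronous update-requests endpoint. Below is a minimal sketch in Python, assuming an unsecured NiFi at http://localhost:8080; the context ID, revision number, and parameter names are placeholders, not values from this thread:

```python
def pc_update_payload(context_id, revision_version, new_values):
    """Build the body for POST /nifi-api/parameter-contexts/{id}/update-requests.

    Only the parameters listed here are changed; parameters omitted from the
    list keep their existing values, which is what you want when syncing just
    the environment-specific values on NiFi B after the first deployment.
    """
    return {
        "id": context_id,
        "revision": {"version": revision_version},
        "component": {
            "id": context_id,
            "parameters": [
                {"parameter": {"name": name, "value": value, "sensitive": False}}
                for name, value in new_values.items()
            ],
        },
    }

# The update itself is asynchronous:
#   1. POST   /nifi-api/parameter-contexts/{id}/update-requests   (payload above)
#   2. GET    /nifi-api/parameter-contexts/{id}/update-requests/{req-id}
#      and poll until the request reports complete (NiFi stops and restarts
#      affected components for you)
#   3. DELETE /nifi-api/parameter-contexts/{id}/update-requests/{req-id}
```

The current revision version for step 1 comes from a plain GET on the parameter context.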
I think the above solution is better since you'll have all the parameter name/value pairs when you import the new dataflow from NiFi-Registry; you'll just need to update some values before starting the new dataflow.
Please help our community grow. If you found any of the suggestions/solutions provided helped you with solving your issue or answering your question, please take a moment to login and click "Accept as Solution" on one or more of them that helped.
Thank you,
Matt
Created 01-11-2026 12:15 AM
Hi @MattWho , thank you for the detailed reply 🙂
I might not have been very clear in my original question, so I'll rephrase with more context:
My use-case is not deploying a single instance of a flow from NiFi A to NiFi B; rather, I've got just one NiFi instance and I'd like to deploy thousands of instances of the same versioned flow (and perhaps in the future, I'll have more clusters in different regions and will want to deploy the flows there too).
In this context, it is problematic for me that Parameter Contexts get created/picked based on the name of the original PC at commit time, since I want each flow instance to have its own unique values for the parameters.
Therefore, my current plan is committing the flows without a PC attached (though with parameters referenced in the processors), and at creation time (via my codebase) creating a new PC (uniquely named per flow instance) with the specific values for that instance.
My question was whether there is a preferable way to do this: creating many instances of a parameterized versioned flow in the same NiFi environment while ensuring each instance gets its own unique set of parameters.
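For reference, the plan above maps onto three REST calls per instance: create a uniquely named parameter context, instantiate the versioned flow as a new process group, then bind the context to that group. A minimal sketch in Python, assuming an unsecured NiFi at http://localhost:8080; the registry/bucket/flow IDs, parameter names, and naming scheme are placeholder assumptions:

```python
import json
import urllib.request

NIFI = "http://localhost:8080/nifi-api"  # assumed unsecured single instance

def parameter_context_body(name, params):
    """POST /parameter-contexts body: one uniquely named context per instance."""
    return {
        "revision": {"version": 0},
        "component": {
            "name": name,
            "parameters": [
                {"parameter": {"name": k, "value": v, "sensitive": False}}
                for k, v in params.items()
            ],
        },
    }

def import_flow_body(registry_id, bucket_id, flow_id, version):
    """POST /process-groups/{parentId}/process-groups body: instantiate a
    versioned flow straight from NiFi-Registry (no context attached yet)."""
    return {
        "revision": {"version": 0},
        "component": {
            "position": {"x": 0.0, "y": 0.0},  # canvas placement, required by the UI-style call
            "versionControlInformation": {
                "registryId": registry_id,
                "bucketId": bucket_id,
                "flowId": flow_id,
                "version": version,
            },
        },
    }

def bind_context_body(pg_entity, pc_id):
    """PUT /process-groups/{pgId} body: attach the per-instance context,
    reusing the revision returned when the group was created."""
    return {
        "revision": pg_entity["revision"],
        "component": {"id": pg_entity["id"], "parameterContext": {"id": pc_id}},
    }

def post_json(path, body):
    """Helper: POST JSON to the NiFi API and return the parsed response."""
    req = urllib.request.Request(
        NIFI + path, data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Per instance, the sequence is then roughly:
#   pc = post_json("/parameter-contexts", parameter_context_body("inst-0001-ctx", {...}))
#   pg = post_json(f"/process-groups/{root_id}/process-groups", import_flow_body(...))
#   PUT f"{NIFI}/process-groups/{pg['id']}" with bind_context_body(pg, pc["id"])
```

Since nothing matches on the original PC's name (there is none in the committed flow), each instance ends up with its own independent context, which also keeps later version upgrades from cross-contaminating values between instances.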
Thanks for helping out,
Green