Created 09-10-2025 12:29 PM
I am running into a problem with my DBCPConnectionPool service in NiFi and am hoping that someone has experience with this. I'm using NiFi 2.0.0-M2.
My current Database Connection URL is
jdbc:postgresql://myurl:5432/mydatabase?currentSchema=public&stringtype=unspecified
This works great with no issues. However, I want to add another schema so I don't have to specify the new schema in my queries throughout all my NiFi flows. For example, I was hoping this would work:
jdbc:postgresql://myurl:5432/mydatabase?currentSchema=myschema,public&stringtype=unspecified
But when I do a query like "select * from mytable" with ExecuteSQL, I get an error:
org.postgresql.util.PSQLException: ERROR: relation "mytable" does not exist
If I change the query to "select * from myschema.mytable", it works.
ChatGPT recommended I try this, but it also didn't work:
jdbc:postgresql://myurl:5432/mydatabase?options=-c%20search_path=myschema,public&stringtype=unspecifi...
Has anyone been able to successfully specify a different schema other than just public? I would just include the schema name in all my queries (which works), but we are going to have separate prod and dev instances, and I don't want to have to change all my queries depending on whether I'm using myschema or myschema-dev.
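For anyone debugging the same thing, a quick sanity check (assuming you can run it through the same ExecuteSQL processor and connection pool) is to ask the session for its effective setting:
SHOW search_path;
That shows whether the schema list from the connection URL is actually reaching the session.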
Created 09-11-2025 04:33 PM
Hello @ShellyIsGolden,
Glad to see you in our community. Welcome!
ChatGPT was not that wrong (😆); in fact, that approach makes sense.
The PostgreSQL JDBC documentation lists that method as a possibility:
String url = "jdbc:postgresql://localhost:5432/postgres?options=-c%20search_path=test,public,pg_catalog%20-c%20statement_timeout=90000";
https://jdbc.postgresql.org/documentation/use/#connection-parameters
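For completeness, here is a minimal standalone sketch along those lines (the host, database, schema, and the myuser/mypassword credentials are placeholders) that opens a connection with the options parameter and prints the search_path the server actually applied:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SearchPathCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host/database/schema/credentials -- adjust to your environment.
        String url = "jdbc:postgresql://myurl:5432/mydatabase"
                + "?options=-c%20search_path=myschema,public"
                + "&stringtype=unspecified";
        try (Connection conn = DriverManager.getConnection(url, "myuser", "mypassword");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SHOW search_path")) {
            rs.next();
            // If the options parameter was honored, this prints something like: myschema, public
            System.out.println("search_path = " + rs.getString(1));
        }
    }
}

If this prints the expected schemas outside NiFi but ExecuteSQL still fails, the problem is on the NiFi/DBCP side rather than in the URL itself.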
Have you tested the JDBC connection outside of NiFi?
Maybe with a psql command like this (quoting the URI so the shell does not split it, and dropping stringtype=unspecified, which is a JDBC-driver parameter that libpq does not recognize):
psql -d "postgresql://myurl:5432/mydatabase?options=-c%20search_path=myschema,public"
Also, check with your PostgreSQL team to see if that connection string is possible and do more testing on that side.
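If the connection opens, a quick follow-up check with the same URI would be:
psql -d "postgresql://myurl:5432/mydatabase?options=-c%20search_path=myschema,public" -c "SHOW search_path"
If that prints myschema, public, the search_path option works at the libpq level, and the remaining question is whether the JDBC driver and NiFi pass it through the same way.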
Created 09-15-2025 10:41 AM
@ShellyIsGolden
What you describe here sounds like the exact use case for NiFi's parameter contexts. Parameters can be used in any NiFi component property. They make it easy to build a dataflow in your dev environment and then move that dataflow to test or prod environments that have the same parameter contexts but with different values. This even works when using a shared NiFi-Registry to version control dataflows that are ready for another environment.
Let's say you create a "Parameter Context" and associate it with one or more Process Groups. Now you can configure a property in a processor, for example, and click the "convert to parameter" icon to convert that value into a parameter within your parameter context.
Let's say you create a Parameter Context named "PostgreSQL parameters".
Then you configure your Process Group (PG) to use that parameter context. Now you can configure or convert the component properties that are unique to each NiFi deployment environment to use parameters, as in the example below.
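As a purely illustrative example, the Database Connection URL property of the DBCPConnectionPool could reference parameters like this (db.host, db.name, and db.schema are made-up parameter names):
jdbc:postgresql://#{db.host}:5432/#{db.name}?currentSchema=#{db.schema},public&stringtype=unspecified
In dev the db.schema parameter would hold myschema-dev and in prod it would hold myschema, so the queries and the flow itself never change between environments. Note that the controller service must live in (or under) a Process Group that is assigned the parameter context for the #{...} references to resolve.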
Let's say you are ready to move that flow to another environment like prod. You version control that PG on dev to NiFi-Registry, then on prod you connect to that same NiFi-Registry and import the dataflow. When the flow is loaded in prod, if a parameter context with the exact same name "PostgreSQL parameters" already exists, the imported flow will use that parameter context's values. This eliminates the need to manage these configurations all over the place in your dataflows. You can also open your parameter context and edit values, and NiFi will take care of stopping and starting all the affected components.
Please help our community grow. If you found any of the suggestions/solutions provided helped you with solving your issue or answering your question, please take a moment to login and click "Accept as Solution" on one or more of them that helped.
Thank you,
Matt
Created 09-15-2025 10:55 AM
Thank you for replying, that's the exact solution I eventually settled on.
Best,
Shelly