
Is calling Dynamic Attributes possible?

New Contributor

New to NiFi, but here is my thought: I am trying to make a template for a flow. I would prefer to have as many of the processors as possible read everything from variables at the flow level. Essentially, the template would be used to copy data from one SQL DB to another; the only changes would be the server, the copy-from/copy-to tables (upserting), table columns, etc. I will also be adding additional fields for business and content hashes. Because of the SQL and the dynamic-ness, I have to use sql.args.N.value attributes in the insert statement. So the thought is to grab the values and store each in an attribute called "Field##", where ## is the number from the "SourceField" variable (must be in order), then use this "Field##" to populate the sql.args value, and use another variable to populate the destination table (most likely a different column name).

So, in the attributes of the flow file, I would have the following (set from the initial source DB query):

Field1 "test1"

Field2 "test2"

In a Groovy script I would get the number of "source variables", then loop and assign all of the sql.args from the Field## attributes. Something like:

// assumes 'ff' is the flowfile from session.get() and 'countargs' was read
// from an attribute holding the number of source fields; use <= so the
// last argument is included
for (int i = 1; i <= countargs; i++) {

    def myType = "sql.args." + i + ".type"

    ff = session.putAttribute(ff, myType, '-9') // -9 = java.sql.Types.NVARCHAR (string)

    def myValue = "sql.args." + i + ".value"

    // Expression Language like '${Field1}' is not evaluated inside a script
    // body, so read the FieldN attribute directly instead:
    ff = session.putAttribute(ff, myValue, ff.getAttribute("Field" + i))
}

Creating the sql.args.N names works, as it's just string concatenation.

But I would like the for loop to set the following:

sql.args.1.value = test1

sql.args.2.value = test2
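Outside NiFi, the mapping can be sketched in plain Groovy, using a Map to stand in for the flowfile's attribute map (the FieldCount attribute here is an assumed stand-in for however the count of source fields is tracked):

```groovy
// Standalone sketch, no NiFi required: a Map simulates the flowfile
// attributes, and plain map puts simulate session.putAttribute.
def attrs = [Field1: 'test1', Field2: 'test2', FieldCount: '2']

int countargs = attrs['FieldCount'] as int
for (int i = 1; i <= countargs; i++) {
    // string concatenation yields plain String keys (GString keys would
    // not match later String lookups, a common Groovy gotcha)
    attrs['sql.args.' + i + '.type']  = '-9'               // -9 = NVARCHAR
    attrs['sql.args.' + i + '.value'] = attrs['Field' + i] // copy FieldN over
}

println attrs['sql.args.1.value']
println attrs['sql.args.2.value']
```

This is only the attribute-shuffling logic; inside an ExecuteScript processor the puts would go through session.putAttribute on the flowfile instead.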

Any thoughts appreciated.


Super Guru

Have you tried PutDatabaseRecord? The reader provides the schema so there is no need to set the sql.args attributes. As of NiFi 1.9 the reader can also infer the datatypes, so you wouldn't have to specify the schema either.

If you have an older NiFi, you can try ExecuteSQL -> ConvertRecord (Avro to JSON) -> InferAvroSchema -> PutDatabaseRecord; that's a heavy-handed way of getting the schema inferred.

New Contributor

I have thought about it, but I am using ExecuteSQL because the statement is effectively a glorified upsert, comparing on a business key hash value that is created during the flow.

The person who initially created this flow did a good job, but there are many manual entries that must be completed when copying and pasting the template of this flow. I was looking to remove as many of those manual entries as possible.
