Support Questions

Elaborate query from JSON

New Contributor

Hello everybody, I'm new to NiFi, I've been using it for a week now, and I got stuck with something.
The problem I have is this: in order to perform CRUD operations on Postgres, I'm extracting XML files from the filesystem, transforming them into simple flat JSON (10 key/value pairs), then sending them to a ConvertJSONToSQL processor and finally to a PutSQL. Now I'm struggling to understand how the last two processors operate. To build the query, I have also set up an UpdateAttribute processor (before ConvertJSONToSQL) that sets attributes for the query, like this:

name: "sql.args.<positional-number>.value"

value: "$.<key-name-in-JSON>"

This is the JSON in the flowfile:

{
  "master_identifier" : "mfn_m02_medico",
  "record_level_event" : "MAD",
  "prg" : "",
  "codice" : "150189",
  "descrizione" : "Descrizione",
  "data_inizio_validita" : "20180506",
  "data_fine_validita" : "",
  "utente_operazione" : "operatore562",
  "nome" : "Riccio",
  "cognome" : "Francesco",
  "sesso" : "M"
}

Question: How does the ConvertJSONToSQL/PutSQL duo work? Am I supposed to set attributes like I wrote, so that ConvertJSONToSQL formulates the query? Or can I just give that JSON to ConvertJSONToSQL directly, since all its key/value pairs are exactly the ones I need to put in the query?

Thanks in advance. I might be a little confused about how this is supposed to work, so any explanation would be greatly appreciated!

Note: I know I can make it work using a ReplaceText processor, but I'm trying to handle "different JSON", meaning different tables; that's why I'm looking for something that automatically maps the JSON content to the query columns and values.
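For readers with the same confusion, here is a minimal Python sketch (using sqlite3; the table name and the trimmed field set are hypothetical, but the sql.args.N.value attribute convention is the one PutSQL documents) of what the ConvertJSONToSQL → PutSQL pair conceptually does: ConvertJSONToSQL reads the flowfile content, emits a parameterized INSERT, and stores each value in a positional sql.args.N.value attribute; PutSQL then binds those attributes as JDBC parameters:

```python
import json
import sqlite3

# Hypothetical flowfile content (trimmed to three of the ten fields).
content = '{"codice": "150189", "nome": "Riccio", "cognome": "Francesco"}'
record = json.loads(content)

# What ConvertJSONToSQL conceptually produces: a parameterized INSERT
# plus sql.args.N.value attributes holding the values in column order.
columns = list(record.keys())
statement = "INSERT INTO medico ({}) VALUES ({})".format(
    ", ".join(columns), ", ".join("?" for _ in columns))
attributes = {"sql.args.%d.value" % (i + 1): record[c]
              for i, c in enumerate(columns)}

# What PutSQL conceptually does: bind the attributes positionally
# and execute the statement against the database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE medico (codice TEXT, nome TEXT, cognome TEXT)")
params = [attributes["sql.args.%d.value" % (i + 1)]
          for i in range(len(columns))]
conn.execute(statement, params)
print(conn.execute("SELECT codice, nome FROM medico").fetchone())
# → ('150189', 'Riccio')
```

The point is that PutSQL never looks at the JSON itself; it only sees the statement and the sql.args.* attributes that ConvertJSONToSQL generated from the content.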


Re: Elaborate query from JSON


You can go through it like this: flatten the XML to JSON -> pass the JSON to EvaluateJsonPath to extract the values you need to insert -> pass those EvaluateJsonPath attributes to the ConvertJSONToSQL processor to build the INSERT statement for that particular table -> PutSQL.
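As a concrete illustration of the EvaluateJsonPath step (the property names below are hypothetical examples taken from the asker's JSON; Destination would be set to flowfile-attribute so each match becomes an attribute):

```
Destination: flowfile-attribute

codice                 $.codice
descrizione            $.descrizione
data_inizio_validita   $.data_inizio_validita
nome                   $.nome
cognome                $.cognome
```

Each dynamic property maps an attribute name to a JsonPath expression evaluated against the flowfile content.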

For a better understanding, go through this link:

Re: Elaborate query from JSON

New Contributor

Thanks for your answer @Gourav Bhattacharya. So you think the best solution is to set up EvaluateJsonPath (one per table?) and set attributes? So using the content directly to produce the query is not advised?

I'm asking you this because at the moment I'm able to produce queries based on the JSON content, but I'm facing an issue with insert_date: PutSQL is not able to convert it to a Timestamp.

EDIT: I tried your configuration, but I'm still getting the same error:

ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.standard.PutSQL PutSQL[id=6f30edf4-0164-1000-7481-f207d6b133b3] Failed to update database for StandardFlowFileRecord[uuid=a51252a4-e4f0-467f-8790-10e7c706aa11,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1531731910037-2, container=default, section=2], offset=34618, length=149],offset=0,name=mfn_m02_medico_16_07_2018_11_07_00_884.xml,size=149] due to java.sql.SQLDataException: The value of the sql.args.1.value is '2018-07-16 13:06:49', which cannot be converted into the necessary data type; routing to failure: java.sql.SQLDataException: The value of the sql.args.1.value is '2018-07-16 13:06:49', which cannot be converted into the necessary data type
java.sql.SQLDataException: The value of the sql.args.1.value is '2018-07-16 13:06:49', which cannot be converted into the necessary data type
at org.apache.nifi.processors.standard.util.JdbcCommon.setParameters(
at org.apache.nifi.processors.standard.PutSQL.lambda$null$5(
at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(
at org.apache.nifi.processors.standard.PutSQL.lambda$new$6(
at org.apache.nifi.processors.standard.PutSQL.lambda$new$9(
at org.apache.nifi.processor.util.pattern.PutGroup.putFlowFiles(
at org.apache.nifi.processor.util.pattern.Put.onTrigger(
at org.apache.nifi.processors.standard.PutSQL.lambda$onTrigger$20(
at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(
at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(
at org.apache.nifi.processors.standard.PutSQL.onTrigger(
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$
at java.util.concurrent.Executors$
at java.util.concurrent.FutureTask.runAndReset(
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(
at java.util.concurrent.ScheduledThreadPoolExecutor$
at java.util.concurrent.ThreadPoolExecutor.runWorker(
at java.util.concurrent.ThreadPoolExecutor$
Caused by: java.lang.NumberFormatException: For input string: "2018-07-16 13:06:49"
at java.lang.NumberFormatException.forInputString(
at java.lang.Integer.parseInt(
at java.lang.Integer.parseInt(
at org.apache.nifi.processors.standard.util.JdbcCommon.setParameter(
at org.apache.nifi.processors.standard.util.JdbcCommon.setParameters(
... 20 common frames omitted
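The "Caused by: NumberFormatException" at Integer.parseInt suggests PutSQL is treating sql.args.1.value as an integer, which usually means the matching sql.args.1.type attribute carries a numeric JDBC type code rather than TIMESTAMP. A minimal Python sketch of this failure mode (the type codes 4 and 93 are the standard java.sql.Types.INTEGER and java.sql.Types.TIMESTAMP constants; the bind function is only a rough analogue of JdbcCommon's dispatch):

```python
from datetime import datetime

# Standard java.sql.Types constants (values fixed by the JDBC spec).
TYPES_INTEGER = 4
TYPES_TIMESTAMP = 93

def bind(value, jdbc_type):
    """Rough analogue of JdbcCommon.setParameters' type dispatch."""
    if jdbc_type == TYPES_INTEGER:
        return int(value)  # fails like Integer.parseInt(...) in the trace
    if jdbc_type == TYPES_TIMESTAMP:
        return datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    raise NotImplementedError(jdbc_type)

value = "2018-07-16 13:06:49"

try:
    bind(value, TYPES_INTEGER)   # what the stack trace indicates is happening
except ValueError as e:
    print("conversion failed:", e)

ts = bind(value, TYPES_TIMESTAMP)  # what should happen for a timestamp column
print(ts.year)  # → 2018
```

If that reading is right, checking which sql.args.1.type the flow sets (or which column ConvertJSONToSQL mapped to position 1) would be the next step.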

Re: Elaborate query from JSON


Okay, use ${now()} in the Expression Language; it will give you the current date and time. If that doesn't work, please post screenshots of your processor configuration.
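If the target column expects a specific timestamp string, the value can also be formatted explicitly; now() and format() are standard NiFi Expression Language functions, and the pattern below is just an example matching the format seen in the error message:

```
${now():format('yyyy-MM-dd HH:mm:ss')}
```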

Re: Elaborate query from JSON

Super Guru

Does each XML file have a single record? If not, I recommend keeping all the records together (i.e. a JSON array of objects, each with 10 key/value pairs), and in either case I recommend PutDatabaseRecord as a replacement for ConvertJSONToSQL and PutSQL. It will take your JSON (or even your XML, as of NiFi 1.7.0, via the XMLReader added under NIFI-4185) and do all the conversion and SQL generation for you, so you don't need to worry about positional attributes, data types, etc.
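To illustrate the record-oriented approach, here is a minimal Python sketch (using sqlite3; the table name and the trimmed field set are hypothetical) of what PutDatabaseRecord conceptually does with an array of flat JSON records: derive the column list from the record schema and insert every record in a single pass, with no per-value attributes involved:

```python
import json
import sqlite3

# Hypothetical flowfile content: a JSON array of flat records.
content = '''[
  {"codice": "150189", "nome": "Riccio",  "cognome": "Francesco"},
  {"codice": "150190", "nome": "Bianchi", "cognome": "Maria"}
]'''
records = json.loads(content)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE medico (codice TEXT, nome TEXT, cognome TEXT)")

# The column list comes from the record schema, not from attributes,
# so the same flow handles different tables without reconfiguration.
columns = list(records[0].keys())
sql = "INSERT INTO medico ({}) VALUES ({})".format(
    ", ".join(columns), ", ".join("?" for _ in columns))
conn.executemany(sql, [tuple(r[c] for c in columns) for r in records])

print(conn.execute("SELECT COUNT(*) FROM medico").fetchone()[0])  # → 2
```

In NiFi the schema would come from the configured RecordReader (e.g. a JsonTreeReader) rather than from the first record, but the mapping idea is the same.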