
Apache NiFi: Insert JSON data into a table as a single column after concatenating all the keys and values of the JSON

Explorer

Please help me design a data flow in NiFi. I need to insert the incoming JSON data into a table that has a single column; the condition is that I need to concatenate all the keys and values and insert the resulting string.


5 REPLIES

Super Guru

Hi,

Can you provide sample/example data of the JSON, and how do you expect it to be saved?

Explorer

Hi SAM, thank you for the response. I am using ListenHTTP to get the JSON data --> ConvertRecord (to validate the JSON) --> SplitJson (to split). Now I have individual JSON objects. Next I need to concatenate all the keys and values of each JSON object and insert them into a table that has a single column called JSON_DATA.
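For reference, SplitJson emits one flowfile per array element based on its JsonPath Expression property; a minimal sketch, assuming the JSON array arrives as the root of the flowfile, would be:

SplitJson
    JsonPath Expression: $.*    (addresses every element of the top-level array; $[*] also works)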

 

This is my input data:

[ {"Name": "Joh ", "Age": 2, "Occupation": "Developer", "Location": "New York", "Salary": 75780},
  {"Name": "Ja", "Age": 9, "Occupation": "Designer", "Location": "San Francisco", "Salary": 87800},
  {"Name": "Bb kkkk", "Age": 5, "Occupation": "Manager", "Location": "London", "Salary": 90870} ]

I want to insert each JSON object into a new row as a string in a Snowflake table, like this:

'{"Name": "Joh ", "Age": 2, "Occupation": "Developer", "Location": "New York", "Salary": 75780}'

Super Guru (Accepted Solution)

Hi,

Thanks for the information. I came across this situation before, and I'm not sure if there is a better way, but you can use the following processors after you split the data into individual records:

ExtractText -> PutSQL

For ExtractText:

1 - Add a dynamic property to capture the entire JSON content of the incoming flowfile, as follows:

[Screenshot: ExtractText configured with a dynamic property named JsonRecord capturing the full flowfile content]

Note: be careful if each record can be large (> 1024 characters); in that case you need to increase the "Maximum Buffer Size" and "Maximum Capture Group Length" properties accordingly, otherwise the data will be truncated.
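For illustration, the ExtractText configuration could look like the sketch below. The dynamic property name JsonRecord matches the attribute referenced in the SQL statement; the regex and the enlarged limits are assumptions to adapt to your data:

ExtractText
    JsonRecord (dynamic property): (?s)(^.*$)    <- DOTALL regex capturing the whole flowfile content
    Maximum Buffer Size: 1 MB                    <- raise this if a single record can exceed it
    Maximum Capture Group Length: 1000000        <- the default is 1024, which truncates longer records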

For PutSQL: once you have configured the JDBC Connection Pool, you can set the SQL Statement property to something like this:

insert into myTable (jsonCol) values ('${JsonRecord}')
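For illustration, the JDBC Connection Pool (DBCPConnectionPool) for Snowflake might be configured along these lines; the account, database, schema, and warehouse names are placeholders:

Database Connection URL: jdbc:snowflake://&lt;account&gt;.snowflakecomputing.com/?db=MY_DB&amp;schema=PUBLIC&amp;warehouse=MY_WH
Database Driver Class Name: net.snowflake.client.jdbc.SnowflakeDriver
Database Driver Location(s): /path/to/snowflake-jdbc.jar

Also note that a single quote inside any JSON value will break the statement above, since the value is wrapped in single quotes; if your data can contain one, escape it before insertion.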

 

If that helps, please accept the solution.

Explorer

Hi SAM, thank you for the response. I am facing one more error while inserting data into the table. Even though I have a table named test4 in my Snowflake database, the PutSQL processor keeps reporting "Table 'test4' doesn't exist or not authorized" (I did grant permissions too). This is my SQL statement: INSERT INTO test4 (JSON_DATA) VALUES ('${JsonRecord}')

[Screenshot: PutSQL error message reporting that table 'test4' does not exist or is not authorized]

Any help will be appreciated. 
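One common cause of this error is that the JDBC session is not pointed at the database and schema that hold the table, or that the table was created with a quoted lowercase name (an unquoted test4 resolves to TEST4 in Snowflake). A hedged check, where MY_DB, PUBLIC, and MY_ROLE are placeholders for your own names:

SHOW TABLES LIKE 'test4' IN SCHEMA MY_DB.PUBLIC;
GRANT INSERT ON TABLE MY_DB.PUBLIC.test4 TO ROLE MY_ROLE;

Or fully qualify the table in the PutSQL statement:

INSERT INTO MY_DB.PUBLIC.test4 (JSON_DATA) VALUES ('${JsonRecord}')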

Explorer

Hi SAM, it actually worked for me now, no more errors. Thank you for your help.