
How to load JSON records into Postgres as a JSON datatype?

Contributor

I have a list of JSON records: [{}, {}, {}, {}]

I split them to get the individual records: {}, {}, {}, {}
I need to load them into Postgres as a JSON datatype.

The result in Postgres should look like this:

| data                                 |
|--------------------------------------|
| { "name":"alex", "surname":"rolfi" } |
| { "name":"rolf", "surname":"syon" }  |
| { "name":"med", "surname":"akil" }   |

I did that with the following processors:

[screenshot: Ghilani_0-1655373269238.png]

 

but PutDatabaseRecord did not solve the problem.


7 REPLIES


Hi,

Can you explain what happens when you use the PutDatabaseRecord processor?

Contributor

[screenshot: Ghilani_0-1655384841812.png]

 


The way PutDatabaseRecord works is that once it reads the JSON, it tries to match the JSON keys to the database columns, hence the error you are getting about not being able to map fields to columns. In your case it will look for columns named name and surname to store their values, so it won't store the whole JSON in one field.

I had a similar situation where I needed to store JSON, and the only way I can think of is to first load the JSON into an attribute using the ExtractText processor, then use the PutSQL processor and write the insert statement against the target table, where the column is the data column and the value is the attribute containing the JSON, for example ${JsonDataAttribute}.

You have to be careful how big your JSON is: if it is too big you will have to find another way, but if it is small there is no harm in saving it into an attribute.
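As a rough sketch of that approach, assuming ExtractText has copied the whole flowfile content into an attribute named JsonDataAttribute and the target table/column are the hypothetical records/data from above, the statement in PutSQL's SQL Statement property could look like this:

-- Hypothetical names: records.data is a json/jsonb column and
-- ${JsonDataAttribute} is the NiFi attribute holding the JSON text.
-- NiFi expands the expression before the statement reaches Postgres,
-- so the value arrives as an ordinary quoted string literal; any single
-- quotes inside the JSON would need to be escaped first.
INSERT INTO records (data) VALUES ('${JsonDataAttribute}'::jsonb);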

Contributor

Can you please explain in more detail how to do this?
I tried this solution, but I think I missed something in the configuration.

[screenshot: Ghilani_0-1655386690579.png]


ExtractText
Is this the correct way to load the text into the data variable?

[screenshot: Ghilani_1-1655386716586.png]

 

then PutSQL:

[screenshot: Ghilani_2-1655386819733.png]

 


I'm not familiar with Postgres SQL, but shouldn't it be like this:

INSERT INTO table(column1, column2, …)
VALUES
    (value1, value2, …);

 

Also, for the value you need to provide the attribute name, so it should be something like this:

INSERT INTO table(jsonDataColumn) VALUES(${data});
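One likely adjustment, assuming the target column is of type json or jsonb: the attribute value generally has to be wrapped in single quotes so Postgres receives it as a string literal, and it can be cast explicitly, for example:

-- Hypothetical table/column names; ${data} is the attribute set by ExtractText
-- and is expanded by NiFi before the statement is sent, so it must be quoted
-- like any other string literal.
INSERT INTO mytable (jsonDataColumn) VALUES ('${data}'::jsonb);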

 

Contributor

[screenshot: Ghilani_0-1655389023138.png]

I did that. PutSQL accepts the flowfile, as you can see, but it gives no output, and no data is inserted into the DB.


Sorry, I forgot to mention: in the PutSQL processor, set the Batch Size property to 1.