Support Questions

Find answers, ask questions, and share your expertise

Split JSON after Convert Record (CSVtoJSON) creating 10,000 duplicate split records

Contributor

Sample Data:-

1,Michael,Jackson

2,Jim,Morrisson

3,John,Lennon

4,Freddie,Mercury

5,Elton,John

Refer to image CSVtoJSON: JSON created successfully.

43902-csvtojson.png

Result After Convert Record (CSVtoJSON)

[ { "id" : 1, "firstName" : "Michael", "lastName" : "Jackson" },

{ "id" : 2, "firstName" : "Jim", "lastName" : "Morrisson" },

{ "id" : 3, "firstName" : "John", "lastName" : "Lennon" },

{ "id" : 4, "firstName" : "Freddie", "lastName" : "Mercury" },

{ "id" : 5, "firstName" : "Elton", "lastName" : "John" } ]

Applying SplitJson to the ConvertRecord (CSVtoJSON) output with JsonPath Expression = $.* creates 10,000 splits in the queue.

Refer to image SplitJSON10000splits.

43903-splitjson10000splits.png

I need to split the JSON array into individual JSON records and apply some transformation to those records.
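Outside NiFi, the intended split-then-transform behavior can be sketched in plain Python. The fullName field below is a made-up example transformation, not something from the question:

```python
import json

# ConvertRecord output from the question: one JSON array of five records.
flowfile_content = (
    '[ { "id" : 1, "firstName" : "Michael", "lastName" : "Jackson" },'
    ' { "id" : 2, "firstName" : "Jim", "lastName" : "Morrisson" },'
    ' { "id" : 3, "firstName" : "John", "lastName" : "Lennon" },'
    ' { "id" : 4, "firstName" : "Freddie", "lastName" : "Mercury" },'
    ' { "id" : 5, "firstName" : "Elton", "lastName" : "John" } ]'
)

# SplitJson with JsonPath Expression $.* emits one FlowFile per
# top-level array element; parsing the array gives the same five records.
splits = [json.dumps(record) for record in json.loads(flowfile_content)]

# A hypothetical per-record transformation (adding a fullName field):
transformed = []
for split in splits:
    record = json.loads(split)
    record["fullName"] = f"{record['firstName']} {record['lastName']}"
    transformed.append(record)

print(len(splits))                  # 5 splits, one per record
print(transformed[0]["fullName"])   # Michael Jackson
```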

1 ACCEPTED SOLUTION

Master Guru

@Rohan Naidu

You have connected both the failure and original relationships of the SplitJson processor back to the same processor.

What does the original relationship mean? "The original FlowFile that was split into segments. If the FlowFile fails processing, nothing will be sent to this relationship." That is, your original FlowFile is transferred to this relationship.

Example of Original Flowfile:-

This message will be the original FlowFile:

[{"id":1,"fname":"Michael","lname":"Jackson"},{"id":2,"fname":"Jim","lname":"Morrisson"},{"id":3,"fname":"John","lname":"Lennon"},{"id":4,"fname":"Freddie","lname":"Mercury"},{"id":5,"fname":"Elton","lname":"John"}]

In your case, the original relationship loops back to SplitJson, generating duplicates.
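The runaway growth can be reproduced with a toy queue simulation in Python. This is a sketch of the relationship routing only, not actual NiFi code:

```python
from collections import deque

# Stand-in for the 5-record JSON array FlowFile.
original_flowfile = ["rec1", "rec2", "rec3", "rec4", "rec5"]

input_queue = deque([original_flowfile])  # SplitJson's incoming queue
splits_queue = []                         # the "split" relationship queue

# With the "original" relationship looped back to the processor's own
# input, every run re-enqueues the unchanged FlowFile, so the splits
# queue grows without bound; here we stop at the 10,000 seen in the
# screenshot.
while len(splits_queue) < 10_000:
    flowfile = input_queue.popleft()
    splits_queue.extend(flowfile)     # "split" relationship: 5 new splits
    input_queue.append(flowfile)      # "original" looped back: reprocessed

print(len(splits_queue))  # 10000 duplicate splits of the same 5 records
```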

To resolve this issue, auto-terminate the original relationship:

  1. Right-click the SplitJson processor.
  2. Go to the Settings tab.
  3. Check the box next to the original relationship.
  4. Click the Apply button at the bottom right of the screen.
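A minimal Python sketch (a toy queue model, not NiFi code) of the fixed routing: with original auto-terminated, the FlowFile is dropped after the split instead of re-queued, so the processor runs exactly once per incoming FlowFile:

```python
from collections import deque

# Stand-in for the 5-record JSON array FlowFile.
original_flowfile = ["rec1", "rec2", "rec3", "rec4", "rec5"]

input_queue = deque([original_flowfile])
splits_queue = []

while input_queue:
    flowfile = input_queue.popleft()
    splits_queue.extend(flowfile)  # "split" relationship
    # "original" auto-terminated: the FlowFile is dropped, not re-queued.

print(len(splits_queue))  # 5 — one split per record, no duplicates
```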

Auto terminate original Relationship:-

43905-original.png

Splitjson Configs:-

43904-splitjson.png

As the above screenshot shows, only the failure relationship loops back to the processor, and the original relationship has been auto-terminated.



Contributor

@shu Thanks. It worked.