Member since: 05-22-2024
Posts: 3
Kudos Received: 0
Solutions: 0
02-10-2025
10:08 AM
Hi again, I managed to split the records into individual records using a JOLT spec like this:
[
  {
    "operation": "shift",
    "spec": {
      "records": {
        "*": {
          "@(2,messageId)": "[&1].messageId",
          "@(2,markerId)": "[&1].markerId",
          "@(2,dateFrom)": "[&1].dateFrom",
          "@(2,dateTo)": "[&1].dateTo",
          "recordId": "[&1].recordId",
          "account": "[&1].account",
          "data": {
            "email": "[&2].email",
            "firstName": "[&2].firstName",
            "lastName": "[&2].lastName"
          },
          "city": "[&1].city"
        }
      }
    }
  }
]
Now my output looks like this:
[
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 1, "account": "152739203233" },
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 2, "email": "jsmith@gmail.com", "firstName": "John", "lastName": "Smith" },
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 3, "city": "Los Angeles" },
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 4 },
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 5 },
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 6, "account": "6789189790191" },
  { "messageId": 1234, "markerId": "T", "dateFrom": 6436058131202690000, "dateTo": -3840351829778683400, "recordId": 7, "city": "San Fransisco" }
]
But I still don't know how to remove/filter the records that originally had the idNumber and accountNumber fields (in this case records 4 and 5). Can someone help me?
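One direction I've been thinking about (just an untested sketch, so the wildcard levels like "&2" and "@(3,...)" and the final "*": "[]" shift are my own assumptions about how Jolt resolves them, not something I've verified): nest the copies of the common fields under the keys I actually want (account, data, city), so that a record containing none of them emits nothing, write everything into a temporary map keyed by the original array index, and then re-pack that map into an array with a second shift:
[
  {
    "operation": "shift",
    "spec": {
      "records": {
        "*": {
          "account": {
            "@": "&2.account",
            "@(1,recordId)": "&2.recordId",
            "@(3,messageId)": "&2.messageId",
            "@(3,markerId)": "&2.markerId",
            "@(3,dateFrom)": "&2.dateFrom",
            "@(3,dateTo)": "&2.dateTo"
          },
          "city": {
            "@": "&2.city",
            "@(1,recordId)": "&2.recordId",
            "@(3,messageId)": "&2.messageId",
            "@(3,markerId)": "&2.markerId",
            "@(3,dateFrom)": "&2.dateFrom",
            "@(3,dateTo)": "&2.dateTo"
          },
          "data": {
            "*": "&2.&",
            "@(1,recordId)": "&2.recordId",
            "@(3,messageId)": "&2.messageId",
            "@(3,markerId)": "&2.markerId",
            "@(3,dateFrom)": "&2.dateFrom",
            "@(3,dateTo)": "&2.dateTo"
          }
        }
      }
    }
  },
  {
    "operation": "shift",
    "spec": {
      "*": "[]"
    }
  }
]
The reason for keying the intermediate result by the array index instead of writing straight to "[&n]" is that skipping indices would leave null holes where records 4 and 5 used to be, while re-packing the map with "[]" should give a compact array.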
02-10-2025
03:21 AM
Hi, I have JSON input like this:
{
"messageId": 1234,
"markerId": "T",
"dateFrom": 6436058131202690000,
"dateTo": -3840351829778683400,
"records": [
{
"recordId": 1,
"account": "152739203233"
},
{
"recordId": 2,
"data": {
"email": "jsmith@gmail.com",
"firstName": "John",
"lastName": "Smith"
}
},
{
"recordId": 3,
"city": "Los Angeles"
},
{
"recordId": 4,
"idNumber": "12345"
},
{
"recordId": 5,
"accountNumber": "55671"
},
{
"recordId": 6,
"account": "6789189790191"
},
{
"recordId": 7,
"city": "San Fransisco"
}
]
}
And I would like to have output like this:
[ {
"messageId" : 1234,
"markerId" : "T"
"dateFrom" : 6436058131202690000,
"dateTo" : -3840351829778683400,
"recordId" : 1,
"account": "152739203233"
}, {
"messageId" : 1234,
"markerId" : "T"
"dateFrom" : 6436058131202690000,
"dateTo" : -3840351829778683400,
"recordId" : 2,
"email": "jsmith@gmail.com",
"firstName": "John",
"lastName": "Smith"
}, {
"messageId" : 1234,
"markerId" : "T"
"dateFrom" : 6436058131202690000,
"dateTo" : -3840351829778683400,
"recordId" : 3,
"city": "Los Angeles"
}, {
"messageId" : 1234,
"markerId" : "T"
"dateFrom" : 6436058131202690000,
"dateTo" : -3840351829778683400,
"recordId" : 6,
"account": "6789189790191"
}, {
"messageId" : 1234,
"markerId" : "T"
"dateFrom" : 6436058131202690000,
"dateTo" : -3840351829778683400,
"recordId" : 7,
"city": "San Fransisco"
} ]
I need to keep only the records that have account, data (email, firstName, lastName) or city; I don't need the records with idNumber and accountNumber. Additionally, I need each record to carry the common part: messageId, markerId, dateFrom and dateTo. Is it possible to do something like that with a JOLT transformation?
Labels:
- Apache NiFi
05-22-2024
10:16 AM
Hi, I have a problem with NiFi. I consume JSON records from Kafka and split them into single records (using a round-robin strategy), add attributes, assign an id attribute, and then divide the flow into two paths. On one path I call InvokeHTTP to fetch some data and turn it into attributes, add more attributes, and clear the content; then I want to merge the two paths back together, keeping the JSON from one path and the attributes from the other. I use the MergeContent processor for this, correlating on the id attribute I assigned at the beginning. It works, but with a larger number of records (e.g. 200) it fails with a BIN_MANAGER_FULL error. Minimum Number of Entries is 2 and Maximum Number of Entries is 2, while Maximum Number of Bins is 10, and I would prefer not to increase it. Is there any way to work around this so it also works for a larger number of records? For example, applying a limit so it only merges when there is space, or dividing the content and waiting? I have no idea how to resolve this problem.
Labels:
- Apache NiFi