Member since: 09-24-2019
Posts: 4
Kudos Received: 0
Solutions: 0
02-26-2020
04:08 AM
JSON Input:
{
"value_schema_id": "GBM_KF_DH_ClientFilter",
"system_code": "LogMessage",
"event_type": "PROCESSOR",
"metric_name": "ERROR",
"metric_id": "1",
"global_person_profile_id": "2",
"create_ts": "3"
}
JOLT Specification:
[
{
"operation": "shift",
"spec": {
"value_schema_id": "value_schema_id",
"system_code": "records[1].value.system_code",
"event_type": "records[1].value.event_type",
"metric_name": "records[1].value.event_detail.metric_name",
"metric_id": "records[1].value.event_detail.metric_id",
"global_person_profile_id": "records[1].value.event_detail.global_person_profile_id",
"create_ts": "records[1].value.event_detail.create_ts"
}
}
]
JSON Output:
{
"value_schema_id" : "GBM_KF_DH_ClientFilter",
"records" : [ null, {
"value" : {
"system_code" : "LogMessage",
"event_type" : "PROCESSOR",
"event_detail" : {
"metric_name" : "ERROR",
"metric_id" : "1",
"global_person_profile_id" : "2",
"create_ts" : "3"
}
}
} ]
}
Can you please help me remove the null entry from the output JSON?
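One likely cause, sketched but untested: JOLT shift writes to explicit array indices, and "records[1]" targets the second slot, so JOLT pads "records[0]" with null. Pointing every output path at "records[0]" instead should drop the null element:
[
  {
    "operation": "shift",
    "spec": {
      "value_schema_id": "value_schema_id",
      "system_code": "records[0].value.system_code",
      "event_type": "records[0].value.event_type",
      "metric_name": "records[0].value.event_detail.metric_name",
      "metric_id": "records[0].value.event_detail.metric_id",
      "global_person_profile_id": "records[0].value.event_detail.global_person_profile_id",
      "create_ts": "records[0].value.event_detail.create_ts"
    }
  }
]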
Labels: Apache Hadoop
02-17-2020
04:32 AM
Hi,
Can someone please help me transform the JSON below? I tried following the NiFi JOLT transformation guide available in the community, but I couldn't get it working. Thanks in advance.
[
  {
    "value_schema_id": "XXXXXXX",
    "system_code": "XXXXXXXXX",
    "event_type": "XXXXXXXXX",
    "metric_name": "XXXXXXX",
    "metric_id": "1",
    "global_person_profile_id": "2",
    "create_ts": "3"
  },
  {
    "value_schema_id": "XXXXXXX",
    "system_code": "XXXXXXX",
    "event_type": "XXXXXXXXX",
    "metric_name": "XXXXXXX",
    "metric_id": "1",
    "global_person_profile_id": "2",
    "create_ts": "3"
  }
]
Into the format below:
{
"value_schema_id": "XXXXXX",
"records": [
{
"value": {
"system_code": "XXXXXX",
"event_type": "XXXXXX",
"event_detail": {
"metric_name": "XXXXXX",
"metric_id": "XXXXXX",
"global_person_profile_id": "XXXXXX",
"create_ts": "XXXXXX"
}
}
}
]
}
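A sketch of one possible JOLT spec for this (untested): match each array element with "*", and reuse the element's index via "&1" as the index into "records". Because every element writes its value_schema_id to the same top-level key, shift collects them into a list, so a follow-up cardinality operation collapses it back to a single value (this assumes all input records share the same schema id):
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "value_schema_id": "value_schema_id",
        "system_code": "records[&1].value.system_code",
        "event_type": "records[&1].value.event_type",
        "metric_name": "records[&1].value.event_detail.metric_name",
        "metric_id": "records[&1].value.event_detail.metric_id",
        "global_person_profile_id": "records[&1].value.event_detail.global_person_profile_id",
        "create_ts": "records[&1].value.event_detail.create_ts"
      }
    }
  },
  {
    "operation": "cardinality",
    "spec": {
      "value_schema_id": "ONE"
    }
  }
]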
Labels: Apache NiFi
09-26-2019
08:13 AM
Hi Dennis, thanks for the response. I have already set the Fetch Size property to 20000 and Max Rows Per Flow File to 20000. My question is: is NiFi capable of handling 526 columns? I ask because I can see that NiFi is pulling the data, but the performance is not up to the mark. Is there a better approach, other than splitting and re-joining the columns? I think that if I go with around 200 columns, NiFi will be faster. Thanks, Vamcy
09-24-2019
05:04 AM
Hi,
I have a table with 526 columns. The QueryDatabase processor is not able to fetch all the rows and is failing. I am stuck on how to proceed, or on which processor to use to fetch the rows. Can someone please guide me?
Labels: Apache NiFi