Member since: 07-31-2017
7 Posts
0 Kudos Received
1 Solution

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 6128 | 07-31-2017 08:41 PM
09-22-2017 03:08 PM
Thanks Ryan. Can you please verify that the following would work? The flow file attribute grok.expression holds:

(?<severity>.{1}) (?<time>.{8}) (?<sequence>.{8}) (?<source>.{12}) (?<destination>.{12}) (?<action>.{30}) %{GREEDYDATA:data}

In the ExtractGrok processor's configuration, the Grok Expression property is set to ${grok.expression}. The expected behavior is that ExtractGrok would continue to work as though the Grok Expression were hardcoded to the pattern above.
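Outside NiFi, the pattern itself can be sanity-checked with plain Python regex. Grok's `(?<name>…)` named groups become `(?P<name>…)` in Python, and `%{GREEDYDATA:data}` is roughly `(?P<data>.*)`; the sample line below is invented to fit the fixed widths:

```python
import re

# The grok expression above, rewritten in Python's named-group syntax.
pattern = re.compile(
    r"(?P<severity>.{1}) (?P<time>.{8}) (?P<sequence>.{8}) "
    r"(?P<source>.{12}) (?P<destination>.{12}) (?P<action>.{30}) (?P<data>.*)"
)

# Made-up sample: 1-char severity, 8-char time, 8-char sequence,
# 12-char source/destination, 30-char action, then free-form data.
sample = (
    "E 12:30:45 00000001 host-a-01234 host-b-56789 "
    + "DENY".ljust(30)
    + " payload text"
)

match = pattern.match(sample)
```

If the pattern matches here, the remaining question is purely whether ExtractGrok evaluates NiFi Expression Language in its Grok Expression property.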
09-21-2017 10:27 PM
The following works great in the NiFi ReplaceText processor.

Flow file content:

US0706003247984600Z1Z000123371K
US0706003247984600Z1Z000125491K
US0706003247984600Z1Z000125596K

Search Value: (.{2})(?:.{4})(.{6})(.{2})(.{4})(.{6})(.{6})(.{1})
Replacement Value: {col_foo1:$1,col_foo3:$2,col_foo4:$3,col_foo5:$4,col_foo6:$5,col_foo7:$6,col_foo8:$7},

Output:

{col_foo1:US,col_foo3:003247,col_foo4:98,col_foo5:4600,col_foo6:Z1Z000,col_foo7:123371,col_foo8:K},
{col_foo1:US,col_foo3:003247,col_foo4:98,col_foo5:4600,col_foo6:Z1Z000,col_foo7:125491,col_foo8:K},
{col_foo1:US,col_foo3:003247,col_foo4:98,col_foo5:4600,col_foo6:Z1Z000,col_foo7:125596,col_foo8:K},

However, I need to store the Search Value in an attribute (e.g. search.value) and the Replacement Value in an attribute (e.g. replacement.value), both of which will be passed in via a configuration file.

Flow file content:

US0706003247984600Z1Z000123371K
US0706003247984600Z1Z000125491K
US0706003247984600Z1Z000125596K

Search Value: ${search.value}
search.value attribute: (.{2})(?:.{4})(.{6})(.{2})(.{4})(.{6})(.{6})(.{1})
Replacement Value: ${replacement.value}
replacement.value attribute: {col_foo1:$1,col_foo3:$2,col_foo4:$3,col_foo5:$4,col_foo6:$5,col_foo7:$6,col_foo8:$7},

Output:

US0706003247984600Z1Z000123371K
US0706003247984600Z1Z000125491K
US0706003247984600Z1Z000125596K

The content passes through unchanged, which appears to indicate that the regex in the attribute values is not being evaluated properly. Any ideas are greatly appreciated.
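As a side check that the search/replacement pair itself is sound, independent of NiFi's attribute evaluation, the same substitution can be reproduced with Python's `re` module; NiFi-style `$1` backreferences become `\g<1>`:

```python
import re

# The Search Value regex from the post, verbatim.
search = r"(.{2})(?:.{4})(.{6})(.{2})(.{4})(.{6})(.{6})(.{1})"

# The Replacement Value, with $1..$7 rewritten as Python backreferences.
replace = (
    r"{col_foo1:\g<1>,col_foo3:\g<2>,col_foo4:\g<3>,col_foo5:\g<4>,"
    r"col_foo6:\g<5>,col_foo7:\g<6>,col_foo8:\g<7>},"
)

line = "US0706003247984600Z1Z000123371K"
result = re.sub(search, replace, line)
```

This reproduces the expected output for the first line, so the pattern and template are fine; the issue is on the NiFi attribute-evaluation side.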
Labels:
- Apache NiFi
07-31-2017 08:41 PM
Thanks for your help. I finally figured out that the NiFi ConvertJSONToAvro processor can only accept one record at a time, not an array of records. The following worked for me: <JSON array containing multiple rows of data> --> SplitJson --> ConvertJSONToAvro --> MergeContent. Thanks again for your help.
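Why the split step is needed can be sketched outside NiFi with the standard `json` module (sample records invented): ConvertJSONToAvro expects one record per flow file, so an incoming array has to become one JSON object per flow file first.

```python
import json

# Incoming flow file content: a JSON array of records, which
# ConvertJSONToAvro cannot consume directly.
payload = '[{"businessentityid": 1518}, {"businessentityid": 1520}]'

# What SplitJson does conceptually: one flow file per array element.
records = json.loads(payload)
flow_files = [json.dumps(record) for record in records]
# Each entry is now a single JSON object, ready for per-record conversion;
# MergeContent reassembles the converted pieces afterwards.
```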
07-31-2017 06:34 PM
Hmm, the following schema doesn't validate. Any ideas?

{
"type":"record",
"name":"vendors",
"fields":
[
"name":"fields",
"type":
{
"type":"array",
"items":
{
"name":"vendor",
"type":"record",
"fields":
[
{
"name":"businessentityid",
"type":["int","null"]
},
{
"name":"accountnumber",
"type":["string","null"]
},
{
"name":"name",
"type":["string","null"]
},
{
"name":"creditrating",
"type":["int","null"]
}
]
}
}
]
}
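For what it's worth, the likely reason the schema above fails validation is that the entry inside the outer fields array is a bare name/type pair rather than a JSON object; each field declaration in an Avro record must be an object. A sketch with the missing braces added, keeping the field names from the post:

```json
{
  "type": "record",
  "name": "vendors",
  "fields": [
    {
      "name": "fields",
      "type": {
        "type": "array",
        "items": {
          "type": "record",
          "name": "vendor",
          "fields": [
            { "name": "businessentityid", "type": ["int", "null"] },
            { "name": "accountnumber", "type": ["string", "null"] },
            { "name": "name", "type": ["string", "null"] },
            { "name": "creditrating", "type": ["int", "null"] }
          ]
        }
      }
    }
  ]
}
```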
07-31-2017 05:50 PM
So the schema should look something like this? ...

{
"type":"record",
"name":"vendor",
"fields":
[
"name":"children",
"type":
{
"type":"array",
"items":
{
"name":"Child",
"type":"record",
"fields":
[
{
"name":"businessentityid",
"type":["int","null"]
},
{
"name":"accountnumber",
"type":["string","null"]
},
{
"name":"name",
"type":["string","null"]
},
{
"name":"creditrating",
"type":["int","null"]
}
]
}
}
]
}
07-31-2017 03:04 PM
I'm trying to ingest JSON data and store it in Avro format for downstream consumption, using the ConvertJSONToAvro processor (NiFi 1.3). The flow files are routed to success, but they contain no data. Here is the incoming JSON:

{
"type" : "record",
"name" : "vendor",
"fields" : [ {
"businessentityid" : 1518,
"accountnumber" : "INTERNAT0004",
"name" : "International Trek Center",
"creditrating" : 1
}, {
"businessentityid" : 1520,
"accountnumber" : "G&KBI0001",
"name" : "G & K Bicycle Corp.",
"creditrating" : 1
}, {
"businessentityid" : 1546,
"accountnumber" : "GREENLA0001",
"name" : "Green Lake Bike Company",
"creditrating" : 1
}, {
"businessentityid" : 1574,
"accountnumber" : "JEFFSSP0001",
"name" : "Jeff's Sporting Goods",
"creditrating" : 1
}, {
"businessentityid" : 1594,
"accountnumber" : "FITNESS0001",
"name" : "Fitness Association",
"creditrating" : 1
}, {
"businessentityid" : 1636,
"accountnumber" : "INTEGRAT0001",
"name" : "Integrated Sport Products",
"creditrating" : 1
}, {
"businessentityid" : 1676,
"accountnumber" : "TEAMATH0001",
"name" : "Team Athletic Co.",
"creditrating" : 1
} ]
}
Here is the Avro schema:

{
"type":"${schema.type}",
"name":"${schema.name}",
"fields":
[
{
"name":"businessentityid",
"type":["int","null"]
},
{
"name":"accountnumber",
"type":["string","null"]
},
{
"name":"name",
"type":["string","null"]
},
{
"name":"creditrating",
"type":["int","null"]
}
]
}
Here is the output after converting the Avro back to JSON:

{
"businessentityid" : null,
"accountnumber" : null,
"name" : "vendor",
"creditrating" : null
}
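A guess at why only `name` survived, based on the output shown: the top-level JSON object (with its `type`, `name`, and `fields` keys) is being matched against the schema directly, rather than the row objects inside the `fields` array, and `name` is the only schema field present at that level, so every other field defaults to null. A stdlib-only sketch of the mismatch:

```python
import json

# Top-level keys of the incoming JSON: the record wrapper, not the rows.
incoming = json.loads('{"type": "record", "name": "vendor", "fields": []}')

schema_fields = ["businessentityid", "accountnumber", "name", "creditrating"]

# Only "name" overlaps, matching the observed single populated field.
present = [field for field in schema_fields if field in incoming]
```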
Any assistance is greatly appreciated. BTW, raising the logging level in NiFi above INFO has not yielded any different logging behavior. Thanks, M
Labels:
- Apache NiFi