Member since
11-16-2015
892
Posts
649
Kudos Received
245
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5361 | 02-22-2024 12:38 PM |
| | 1347 | 02-02-2023 07:07 AM |
| | 3016 | 12-07-2021 09:19 AM |
| | 4163 | 03-20-2020 12:34 PM |
| | 13977 | 01-27-2020 07:57 AM |
05-30-2019
02:19 PM
The JsonPath Expression is meant to identify an array, then SplitJson will split each element into its own flow file. Try "$.data" as your JsonPath Expression, and use the "splits" relationship to send things downstream. The "original" relationship will contain the incoming flow file, which doesn't sound like what you want.
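As an illustration (not the processor itself), here is a minimal Python sketch of what SplitJson does with "$.data": each element of the matched array becomes its own flow file payload on the "splits" relationship. The input shape is an assumption.

```python
import json

# Hypothetical input flow file content with a top-level "data" array.
payload = '{"data": [{"id": 1}, {"id": 2}, {"id": 3}]}'

# SplitJson with JsonPath Expression "$.data" emits one flow file per
# array element; this list comprehension mimics that split.
splits = [json.dumps(element) for element in json.loads(payload)["data"]]

for s in splits:
    print(s)
```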
05-29-2019
04:39 PM
In nifi-assembly/target you'll find the built system as you mention, including a "conf" folder that contains (among other things) a file called bootstrap.conf. In that file there's a commented-out JVM property to enable attachment by a debugger (the preceding line says "Enable Remote Debugging"). When you uncomment that argument and start NiFi, it will listen on port 8000 for a debugger to attach. You can then attach a debugger from your IDE (Eclipse, NetBeans, IntelliJ, etc.). You can change the port and/or set "suspend=y" if you want NiFi to wait until a debugger is attached before continuing startup; the latter is helpful if you are debugging something early in the startup sequence. Otherwise you can wait for NiFi to finish starting up and then attach whenever you like.
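For reference, the relevant section of conf/bootstrap.conf typically looks like this (the exact port may differ by version); uncommenting the java.arg.debug line enables the debug listener:

```
# Enable Remote Debugging
#java.arg.debug=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000
```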
05-24-2019
02:37 AM
2 Kudos
You were so close! By using the [] syntax it just adds to the outgoing array, but you wanted to associate them with the same index, namely the one matched by the * "above" the fields. Put #2 inside your braces (#2 is a reference to the array index you're iterating over, "two levels up" from where you are in the spec):
[{
"operation": "shift",
"spec": {
"nummer": "Nummer",
"table": {
"*": {
"zn": "Positionen.[#2].ZeileNr",
"datum": "Positionen.[#2].Datum"
}
}
}
}]
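For example, given an input shaped like the following (field values are made up for illustration), the spec above pairs each array element's fields under the same Positionen index:

```json
{
  "nummer": "A-100",
  "table": [
    { "zn": 1, "datum": "2019-05-01" },
    { "zn": 2, "datum": "2019-05-02" }
  ]
}
```

which should produce:

```json
{
  "Nummer": "A-100",
  "Positionen": [
    { "ZeileNr": 1, "Datum": "2019-05-01" },
    { "ZeileNr": 2, "Datum": "2019-05-02" }
  ]
}
```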
05-10-2019
06:18 PM
1 Kudo
To address your comment below: I missed the part where you want to call the outgoing field "color". Change this line (line 8 of the spec):

"$": "colorsLove[].&2"

to this:

"$": "colorsLove[].color"
05-10-2019
04:27 PM
1 Kudo
This Chain spec will add the hardcoded value 20190905 into the array (after removing empty values): [
{
"operation": "shift",
"spec": {
"color_*": {
"": "TRASH",
"*": {
"$": "colorsLove[].&2"
}
},
"*": "&"
}
},
{
"operation": "shift",
"spec": {
"colorsLove": {
"*": {
"#20190905": "colorsLove[#2].date",
"*": "colorsLove[#2].&"
}
},
"*": "&"
}
},
{
"operation": "remove",
"spec": {
"TRASH": ""
}
}
]

You should be able to replace "#20190905" with a NiFi Expression Language statement, maybe something like: "#${now():toNumber():format('yyyyddMM')}" ... but I didn't try that part.
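As a side note on the format pattern: Java's yyyyMMdd corresponds to Python's %Y%m%d. If the hardcoded 20190905 is meant to represent 2019-09-05, a quick check (in Python, purely as an illustration, not NiFi code) shows which pattern reproduces it:

```python
from datetime import datetime

# If 20190905 is meant to be 2019-09-05, the pattern that reproduces
# it is year-month-day (yyyyMMdd), not year-day-month (yyyyddMM).
d = datetime(2019, 9, 5)
print(d.strftime("%Y%m%d"))  # -> 20190905
print(d.strftime("%Y%d%m"))  # -> 20190509
```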
05-07-2019
01:58 PM
1 Kudo
What does the generated SQL coming from ConvertJSONToSQL look like? Are the fields correctly uppercased? Does your database lowercase the column names? Did you try setting the "Translate Field Names" property to "true" in ConvertJSONToSQL? Does the case of the table name in the SQL match the case of the table name in the DB? If you're using "dbo.xxxx" as the Table Name property in ConvertJSONToSQL, instead try using just "xxxx" as the Table Name, and setting either Catalog Name or Schema Name (depending on your DB) to "dbo" (or DBO if necessary).
05-07-2019
01:37 PM
You should be able to use SplitXml -> PutHDFS, using the Split Depth property to specify where to do the tag splitting. Each tag at that depth will be output as a separate flow file which you can send to HDFS via the PutHDFS processor. You may need to use UpdateAttribute to set the filename attribute, which is used by PutHDFS as the target filename.
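For example, the UpdateAttribute property could look like this (the naming scheme is up to you; UUID() is a built-in NiFi Expression Language function that guarantees unique names):

```
# UpdateAttribute dynamic property (illustrative)
filename = split-${UUID()}.xml
```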
05-07-2019
01:30 PM
1 Kudo
If you are not obtaining keys from the database, not using fragmented transactions, and not rolling back on failure, then you should see the failed flow files in a batch being routed to the failure relationship. If you must configure the processor differently (obtaining keys, using fragmented transactions, or rolling back on failure), then each batch of flow files is treated as a single transaction. In that case, in order to handle individual failures, you'll want to avoid batching by setting PutSQL's Batch Size property to 1.
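Concretely, the PutSQL configuration described above would look something like this (property names as they appear in the processor; the values are illustrative):

```
Obtain Generated Keys           = false
Support Fragmented Transactions = false
Rollback On Failure             = false
Batch Size                      = 1    # only if you need per-flow-file failure handling
```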
05-02-2019
08:51 PM
1 Kudo
I don't think your desired output is valid JSON, as the root object only has an array in it, not a key/value pair. If you want a key in there (let's call it root) the following spec will work in JoltTransformJSON: [
{
"operation": "shift",
"spec": {
"*": "root.[]"
}
}
]

Otherwise, if you just want to add braces around the array, you can use ReplaceText, replacing the entire content with {$1}.
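For the ReplaceText route, a configuration along these lines should work (the Search Value here is ReplaceText's stock default pattern, which captures the whole content; double-check it against your input):

```
Search Value         = (?s)(^.*$)
Replacement Value    = {$1}
Evaluation Mode      = Entire text
Replacement Strategy = Regex Replace
```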
04-29-2019
01:49 PM
1 Kudo
You can use the SiteToSiteProvenanceReportingTask for this. Filter the reporting task to only emit events at the "certain point" you mention above. Each event has a "timestampMillis" and a "lineageStart" field; you should be able to route on the difference of the two using QueryRecord, with something like:

SELECT * FROM FLOWFILE WHERE timestampMillis - lineageStart > 60000

This should emit a flow file containing all events for which the associated entity (in this case, the flow file in the system) has been in the flow for over a minute.
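The QueryRecord filter amounts to a simple predicate over each event record; here is a Python sketch using made-up events (only the timestampMillis and lineageStart field names come from the reporting task's output):

```python
# Made-up provenance events; timestampMillis and lineageStart are the
# field names emitted by SiteToSiteProvenanceReportingTask.
events = [
    {"eventId": 1, "timestampMillis": 1_000_000, "lineageStart": 970_000},
    {"eventId": 2, "timestampMillis": 1_000_000, "lineageStart": 900_000},
]

# Keep events whose flow file has been in the flow for over a minute,
# mirroring: SELECT * FROM FLOWFILE WHERE timestampMillis - lineageStart > 60000
slow = [e for e in events if e["timestampMillis"] - e["lineageStart"] > 60_000]
print([e["eventId"] for e in slow])  # -> [2]
```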