Member since: 08-18-2019
Posts: 56
Kudos Received: 11
Solutions: 18
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1479 | 11-15-2023 05:38 AM |
| | 5580 | 07-12-2023 05:38 AM |
| | 1278 | 01-10-2023 01:50 AM |
| | 1756 | 12-06-2022 12:08 AM |
| | 6270 | 12-02-2022 12:55 AM |
01-13-2021 06:08 AM
You can use the PutSQL processor to execute the SQL statements from your flow file content against your database. Alternatively, instead of ConvertJSONToSQL, you can write the records directly with the PutDatabaseRecord processor.
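As a rough illustration of the two-step approach, this sketch shows the kind of parameterized INSERT that a ConvertJSONToSQL-style step produces and that PutSQL then executes (the table name `services` and the record fields are made up for this example):

```python
import json

# Hypothetical flow file content: one flat JSON record.
record = json.loads('{"service_group": "billing", "host": "node-1"}')

# Build the kind of parameterized INSERT a ConvertJSONToSQL-style step emits.
columns = ", ".join(record)
placeholders = ", ".join("?" for _ in record)
sql = f"INSERT INTO services ({columns}) VALUES ({placeholders})"

print(sql)                    # INSERT INTO services (service_group, host) VALUES (?, ?)
print(list(record.values()))  # the parameters bound at execution time
```

PutDatabaseRecord skips this intermediate SQL step entirely and maps the record fields to table columns itself.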
01-13-2021 04:15 AM
To change the key names of your JSON you can transform it with JOLT. The processor is called JoltTransformJSON. In the "Jolt Specification" property you can insert the following spec, which renames the key ServiceGroup to service_group; afterwards you can send the flow file on to the SQL processor: [
{
"operation": "shift",
"spec": {
"ServiceGroup": "service_group",
"*": "&"
}
}
]
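For intuition, here is a minimal Python sketch of what that shift spec does: the ServiceGroup key is renamed and every other key (the `"*": "&"` rule) passes through unchanged. The sample record is made up:

```python
# Rough equivalent of the JOLT shift spec above, for illustration only.
def rename_service_group(record: dict) -> dict:
    out = {}
    for key, value in record.items():
        # "ServiceGroup" -> "service_group"; "*": "&" keeps everything else as-is
        out["service_group" if key == "ServiceGroup" else key] = value
    return out

flowfile = {"ServiceGroup": "billing", "Host": "node-1"}
print(rename_service_group(flowfile))
# {'service_group': 'billing', 'Host': 'node-1'}
```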
01-08-2021 04:08 AM
1 Kudo
Hi, I tried a solution for you:
1) GenerateFlowFile: stands in for your GetFile processor that reads the CSV file
2) ConvertRecord: convert with a CSVReader and a JsonRecordSetWriter
3) SplitJson: split each object (CSV row) with $.* as the JsonPath expression
4) EvaluateJsonPath: add a dynamic property named filename with the value $.ID to store the ID as a flow file attribute
5) UpdateAttribute: append the file extension to the filename attribute with ${filename:append('.csv')}
6) ConvertRecord: now the question is how you want to proceed. You can convert the JSON back to CSV, or work with Wait/Notify so that you can hand the filename attribute over to your split CSV flow file.
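The core of steps 2 to 5 can be sketched in plain Python: each CSV row becomes its own JSON "flow file", and the filename attribute is derived from the ID column. The column name ID matches the post; the sample data is invented:

```python
import csv
import io
import json

# Hypothetical CSV content; only the "ID" column name comes from the post.
csv_content = "ID,name\n42,alpha\n43,beta\n"

flowfiles = []
for row in csv.DictReader(io.StringIO(csv_content)):
    # EvaluateJsonPath ($.ID) + UpdateAttribute (${filename:append('.csv')})
    attributes = {"filename": row["ID"] + ".csv"}
    flowfiles.append({"attributes": attributes, "content": json.dumps(row)})

print([f["attributes"]["filename"] for f in flowfiles])  # ['42.csv', '43.csv']
```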
01-06-2021 12:47 PM
Hello, you could use the following regex: [^^]+ With that you should capture the whole content. Also check the "Maximum Capture Group Length" property of the ExtractText processor, so that it is not set too small.
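The reason [^^]+ grabs everything is that a negated character class matches any character except "^", including newlines (unlike "."), so a single match spans the whole multi-line content as long as it contains no "^". A quick Python check of that regex behavior:

```python
import re

content = "first line\nsecond line\nthird line"  # made-up multi-line content

# [^^]+ matches across newlines in one go...
match = re.fullmatch(r"[^^]+", content)
print(match.group(0) == content)  # True

# ...whereas ".+" (without DOTALL) stops at the first newline.
print(re.fullmatch(r".+", content))  # None
```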
11-12-2020 10:45 AM
The problem is that your value exceeds the maximum of an integer. Change "=toInteger" to "=toLong".
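For context on the limits involved (assuming Jolt's =toInteger maps to a 32-bit Java int and =toLong to a 64-bit long), a small numeric check:

```python
# 32-bit vs 64-bit signed integer limits.
INT_MAX = 2**31 - 1    # 2147483647
LONG_MAX = 2**63 - 1   # 9223372036854775807

value = 3_000_000_000  # made-up example, e.g. an epoch timestamp in milliseconds

print(value > INT_MAX)    # True  -> does not fit in an int, hence the error
print(value <= LONG_MAX)  # True  -> fits comfortably in a long
```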
09-03-2020 12:10 PM
1 Kudo
The ExecuteSQL processor has a property named "Max Rows Per Flow File". There you can set how many rows each flow file should contain, and afterwards you can merge them as you wanted, because the flow files get fragment attributes.
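A rough sketch of the mechanism: rows are chunked by the max-rows setting, each chunk carries fragment.index and fragment.count attributes, and a merge step can use them to reassemble the original order (the row data here is invented):

```python
# Hypothetical illustration of "Max Rows Per Flow File" + fragment attributes.
def split_rows(rows, max_rows):
    chunks = [rows[i:i + max_rows] for i in range(0, len(rows), max_rows)]
    return [
        {"content": chunk,
         "attributes": {"fragment.index": i, "fragment.count": len(chunks)}}
        for i, chunk in enumerate(chunks)
    ]

flowfiles = split_rows(list(range(10)), max_rows=4)

# A MergeContent-style step can reorder by fragment.index and concatenate:
merged = [row
          for f in sorted(flowfiles, key=lambda ff: ff["attributes"]["fragment.index"])
          for row in f["content"]]

print(len(flowfiles), merged == list(range(10)))  # 3 True
```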
09-03-2020 08:18 AM
I also tried it with an Avro schema and the type 'timestamp-millis', but I had the problem that the milliseconds were always saved as .000 instead of .123, .987, ... So another solution for you would be to use Jolt and add "createdAt": { "$date": "${dateAttr}" } to your JSON; that converts the value to a date type in MongoDB.
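To keep real millisecond precision, the attribute referenced by ${dateAttr} can be formatted with explicit millisecond digits before it is placed into the $date wrapper. A Python sketch of that formatting (the timestamp value is made up; dateAttr mirrors the post):

```python
from datetime import datetime, timezone

# Made-up timestamp with 123 ms, standing in for the flow file's date value.
dt = datetime(2020, 9, 3, 8, 18, 0, 123000, tzinfo=timezone.utc)

# Format with exactly three millisecond digits instead of letting them become .000.
date_attr = dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"

# Mongo extended-JSON form, as in the Jolt snippet above.
doc = {"createdAt": {"$date": date_attr}}
print(doc["createdAt"]["$date"])  # 2020-09-03T08:18:00.123Z
```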
04-30-2020 12:11 AM
You can clone the processor bundle you need from https://github.com/apache/nifi/tree/master/nifi-nar-bundles into your file archive. That would solve your problem, because after every update you still have the processors you need, and if they are reset to the defaults, the new code can be merged back into your repo. Greets
04-23-2020 12:26 AM
1 Kudo
Maybe you should set an alias for all fields, or at least for the t1/t2 fields, like: t1.CODE_CLOCK_ID as CODE_CLOCK_ID_s26,
t1.COMPANY_CD as COMPANY_CD_s26,
t1.GROUP_SEGMENT_L1 as GROUP_SEGMENT_L1_s26,
t1.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s26,
t2.CODE_CLOCK_ID as CODE_CLOCK_ID_s27,
t2.COMPANY_CD as COMPANY_CD_s27,
t2.GROUP_SEGMENT_L1 as GROUP_SEGMENT_L1_s27,
t2.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s27 Greets
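Since the alias pattern is mechanical (table prefix plus a per-table suffix), it can be generated rather than typed by hand. A hypothetical helper, using the column names from the snippet above:

```python
# Hypothetical generator for the aliased select list shown above, so that
# t1/t2 column names no longer collide in the result set.
def alias_columns(table, columns, suffix):
    return [f"{table}.{col} as {col}_{suffix}" for col in columns]

cols = ["CODE_CLOCK_ID", "COMPANY_CD", "GROUP_SEGMENT_L1", "GROUP_SEGMENT_CD"]
select_list = alias_columns("t1", cols, "s26") + alias_columns("t2", cols, "s27")

print(select_list[0])   # t1.CODE_CLOCK_ID as CODE_CLOCK_ID_s26
print(select_list[-1])  # t2.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s27
```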
03-25-2020 08:28 AM
Thanks @MattWho for your answer. My goal is to create an output with the merged attributes so that I can notify myself about which elements have been edited. My thought was to mail myself something like this: "Position 1,2,3,5,8,13 has been changed", where 1,2,3,5,8,13 are the attribute values of each flow file. I don't know how else I could solve this in one flow/notification.
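The intended message itself is just the attribute values of the merged flow files joined into one string. A minimal sketch, with the position values taken from the example in the post:

```python
# Values stand in for the per-flow-file attribute collected during the merge.
changed_positions = ["1", "2", "3", "5", "8", "13"]

message = "Position " + ",".join(changed_positions) + " has been changed"
print(message)  # Position 1,2,3,5,8,13 has been changed
```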