Member since: 11-16-2015
Posts: 892
Kudos Received: 650
Solutions: 245
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5664 | 02-22-2024 12:38 PM |
| | 1388 | 02-02-2023 07:07 AM |
| | 3085 | 12-07-2021 09:19 AM |
| | 4205 | 03-20-2020 12:34 PM |
| | 14158 | 01-27-2020 07:57 AM |
03-24-2021
03:23 PM
1 Kudo
What are the column names in your table? Assuming "carId" and "carType", you can use JoltTransformJSON or JoltTransformRecord with the following spec:

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "$": "carId",
        "@": "carType"
      }
    }
  },
  {
    "operation": "shift",
    "spec": {
      "carId": {
        "*": { "@": "[&0].carId" }
      },
      "carType": {
        "*": { "@": "[&0].carType" }
      }
    }
  }
]
```
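For reference, a sample input under this spec (the car values are invented for illustration):

```json
{ "1": "sedan", "2": "suv" }
```

would become:

```json
[
  { "carId": "1", "carType": "sedan" },
  { "carId": "2", "carType": "suv" }
]
```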
03-02-2021
08:55 AM
How would I perform the same replacement for only the very first occurrence of '[' and the last occurrence of ']'?
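One possible approach, assuming a ReplaceText processor is in play and the goal is to strip just the outermost pair of brackets: a greedy regex naturally spans from the first '[' to the last ']':

```
Replacement Strategy:  Regex Replace
Evaluation Mode:       Entire text
Search Value:          (?s)\[(.*)\]
Replacement Value:     $1
```

The (?s) flag lets the match cross line breaks; because (.*) is greedy, the match always ends at the last ']' in the content.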
02-14-2021
12:23 AM
Hi @mburgess, can you please elaborate on which property needs to be configured in the GrokReader controller service to use the kv filter? I'm trying to parse incoming key=value pairs. Input: key1=value1,key2=value2,key3=value3,key4=value4. Output: I need key1, key2, key3, key4 as attribute names and their respective values as the attribute values. I can use %{GREEDYDATA:msgbody} in the Grok Expression property, but I do not know where to provide kv { source = "msgbody" }. Your help is appreciated.
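As far as I know, GrokReader only evaluates the Grok Expression and has no equivalent of Logstash's kv filter, so one workaround is to do the key=value split in a script. A minimal sketch, assuming an ExecuteScript processor with Jython (the processor choice and script body are illustrative, not a GrokReader setting):

```python
# Illustrative ExecuteScript (Jython) body -- NOT a GrokReader property.
# Reads the flow file content (e.g. "key1=value1,key2=value2") and
# promotes each key=value pair to a flow file attribute.
from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import InputStreamCallback

class ReadText(InputStreamCallback):
    def __init__(self):
        self.text = ''
    def process(self, inputStream):
        self.text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)

flowFile = session.get()
if flowFile is not None:
    callback = ReadText()
    session.read(flowFile, callback)
    for pair in callback.text.split(','):
        key, _, value = pair.partition('=')
        if value:
            # putAttribute returns a new FlowFile reference; keep it
            flowFile = session.putAttribute(flowFile, key.strip(), value.strip())
    session.transfer(flowFile, REL_SUCCESS)
```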
01-29-2021
04:54 PM
Is there anything in the logs before/after the "already marked for transfer" entry? Trying to figure out how a flow file can get transferred and then have something go wrong (where we'd also try to send it to failure).
10-22-2020
04:18 PM
@mburgess Thanks for the helpful information. I am using NiFi 1.7.1, and in my case incremental fetching does not seem to work correctly: all records get fetched from the database, but some never make it all the way to the destination. The processors used are GenerateTableFetch, then ExecuteSQL, and the other corresponding processors down the data processing flow. The record ID is captured correctly in the GenerateTableFetch state and is up to date with the record ID at the source (the db). However, some records are still missed when the files are processed, leaving the record count at the destination out of sync with the source db. Am I missing something? Would scheduling the fetch times help, and how can I do that?
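For context, a minimal GenerateTableFetch setup for incremental fetching might look like the sketch below (table and column names are assumed for illustration); how often new fetches are generated is controlled by the Run Schedule on the processor's Scheduling tab:

```
Table Name:             employee      # assumed
Maximum-value Columns:  record_id     # assumed; the column must only ever increase
Partition Size:         10000
Run Schedule:           5 min         # Scheduling tab
```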
09-30-2020
10:53 AM
Hi @Ayaz, @mburgess! Please have a look at this spec as well:

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "BRANCH_CODE": "[&1].Fields.FLD0001",
        "CUST_NO": "[&1].Fields.FLD0002",
        "AC_DESC": "[&1].Fields.FLD0003",
        "CUST_AC_NO": "[&1].ExternalSystemIdentifier",
        "#1": "[&1].InstitutionId"
      }
    }
  }
]
```

Just FYI!
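For illustration, assuming an input record like this (field values invented):

```json
[
  {
    "BRANCH_CODE": "001",
    "CUST_NO": "12345",
    "AC_DESC": "Savings Account",
    "CUST_AC_NO": "ACC-789"
  }
]
```

the spec would produce:

```json
[
  {
    "Fields": {
      "FLD0001": "001",
      "FLD0002": "12345",
      "FLD0003": "Savings Account"
    },
    "ExternalSystemIdentifier": "ACC-789",
    "InstitutionId": "1"
  }
]
```

Note that the "#1" entry writes the literal value "1" into InstitutionId.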
09-22-2020
09:56 PM
1 Kudo
@mburgess @ljonnavi Thank you!
09-09-2020
03:00 PM
Hi, I am having exactly the same problem. I was wondering if it's possible to share the configuration you used to get that result. Thank you in advance!
08-20-2020
05:50 AM
@nkimani When I change the table name to "EMPLOYEE" it gives the error 'stream has been closed', and when I give the table name 'employee' it gives the error 'none of the fields are matching with record map with employee table'.
07-29-2020
04:00 PM
Did you get a solution to this? I am also getting a communication error. My NiFi instance and MySQL are on the same Linux server.