Member since: 08-18-2019
Posts: 33
Kudos Received: 5
Solutions: 7
My Accepted Solutions
Title | Views | Posted
---|---|---
| 117 | 01-13-2021 04:15 AM
| 105 | 01-08-2021 04:08 AM
| 102 | 01-06-2021 12:47 PM
| 158 | 11-12-2020 10:45 AM
| 570 | 04-23-2020 12:26 AM
01-19-2021
03:00 PM
Just asking: what happens if you change the "Destination" property of AttributesToJSON to flowfile-content? 😉 😄 😄
01-13-2021
06:08 AM
You can use the "PutSQL" processor to read the SQL statements from your flowfile content and execute them against your database. Or, instead of ConvertJsonToSQL, you can try the "PutDatabaseRecord" processor directly.
01-13-2021
04:47 AM
Have you already considered transforming your CSV to JSON and then restructuring it with JOLT?
01-13-2021
04:15 AM
To change the key names of your JSON you can transform it with JOLT. The processor is called JoltTransformJSON. In the "Jolt Specification" property you can insert the following spec; it renames the key ServiceGroup to service_group, and afterwards you can send the flowfile on to the SQL processor:
[
  {
    "operation": "shift",
    "spec": {
      "ServiceGroup": "service_group",
      "*": "&"
    }
  }
]
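For reference, the effect of this shift spec can be sketched in plain Python (outside NiFi), assuming the flowfile content is a flat JSON object:

```python
import json

def rename_key(payload: str) -> str:
    # Mimics the JOLT shift above: move "ServiceGroup" to "service_group",
    # pass every other key through unchanged ("*": "&").
    obj = json.loads(payload)
    out = {("service_group" if k == "ServiceGroup" else k): v
           for k, v in obj.items()}
    return json.dumps(out)

print(rename_key('{"ServiceGroup": "web", "Host": "node1"}'))
# → {"service_group": "web", "Host": "node1"}
```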
01-08-2021
04:08 AM
1 Kudo
Hi, I tried a solution for you:
1) GenerateFlowFile — stands in for your GetFile processor that fetches the CSV file.
2) ConvertRecord — convert with a CSVReader and a JsonRecordSetWriter.
3) SplitJson — split each object (CSV row) with $.* as the path.
4) EvaluateJsonPath — add a dynamic property named "filename" with the value $.ID, to get the ID into the flowfile's filename attribute.
5) UpdateAttribute — append the file type to the filename attribute: ${filename:append('.csv')}
6) ConvertRecord — now the question is how to proceed: you can convert the JSON back to CSV, or you work with Wait/Notify so that you can hand your "filename" attribute over to your split CSV flowfile.
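Outside NiFi, the core of these steps (one output per CSV row, named after its ID) can be sketched in Python — a rough illustration of the idea, not the processors themselves, with a made-up two-row CSV:

```python
import csv
import io
import json

def split_rows(csv_text: str) -> dict:
    # ConvertRecord + SplitJson: one JSON object per CSV row.
    # EvaluateJsonPath + UpdateAttribute: filename = <ID>.csv
    reader = csv.DictReader(io.StringIO(csv_text))
    return {f"{row['ID']}.csv": json.dumps(row) for row in reader}

files = split_rows("ID,name\n1,alice\n2,bob\n")
print(sorted(files))  # → ['1.csv', '2.csv']
```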
01-06-2021
12:47 PM
Hello, you could use the following regex: [^^]+ — with that you should get the whole content. Also take a look at the "Maximum Capture Group Length" property of the ExtractText processor, so that it is not set too short.
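The idea behind [^^]+: unlike a dot, the negated character class also matches newlines, so one capture group can span the entire content (assuming it contains no literal caret). A quick check in Python:

```python
import re

# Multi-line sample content with no literal "^" in it
text = "line one\nline two\nline three"

# "[^^]+" = one or more of anything except "^", newlines included,
# so a single greedy match covers the whole string.
match = re.search(r"[^^]+", text)
print(match.group(0) == text)  # → True
```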
01-03-2021
06:33 AM
Sorry, I can't visualize it here, but your flow could look like this:
QueryDatabaseTableRecord -> Wait -> PutS3Object
FetchFile -> EvaluateJsonPath -> Notify
You can hand the attribute from the JSON over to the waiting flowfile. If you need help with the configuration or anything else for this, just message me.
12-04-2020
07:04 AM
Try this:
"createdon": {
  "$date": "putInHereDate"
}
instead of:
"createdon": ISODate("2017-03-03")
You can also solve this via JOLT.
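As a sketch, the resulting document in MongoDB Extended JSON (using the date from the ISODate example above, with an assumed time portion) is just ordinary JSON that any tool can emit:

```python
import json

# MongoDB Extended JSON: wrap the value in {"$date": ...}
# instead of the shell-only ISODate(...) syntax.
doc = {"createdon": {"$date": "2017-03-03T00:00:00Z"}}
print(json.dumps(doc))
# → {"createdon": {"$date": "2017-03-03T00:00:00Z"}}
```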
11-19-2020
10:11 AM
Is the CSV the content of your flowfile? Use PutDatabaseRecord and create a CSVReader as the controller service. You don't need an Avro schema if the column names of both match.
11-12-2020
11:21 AM
How many flowfiles do you get after splitting? As an example for your processor properties: if you get around 10 flowfiles, set "minimum number of entries" high, e.g. 1000 (effectively a maximum), and set the "minimum bin age" to something like 10 sec. Then, when the first flowfile with attribute value ABC arrives, it is held in a bin and the 10-second timer starts; every further flowfile with that value arriving within the time goes into the same bin. The timer runs separately for each bin (in your case 3). The minimum number of entries is effectively ignored because you set the minimum bin age.
11-12-2020
10:52 AM
I don't know if it is the best solution, but if you know the keys of the JSON, you can use the "EvaluateJsonPath" processor to put the values into flowfile attributes. Then use the "AttributesToJSON" processor and set the JSON key and the value with Expression Language: ${attr:trim()}
11-12-2020
10:45 AM
The problem is that your value is higher than the maximum of an integer. Change "=toInteger" to "=toLong".
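The boundary in question is the 32-bit signed integer maximum, 2,147,483,647; anything above it needs a long. Illustrated with an assumed oversized value:

```python
INT_MAX = 2**31 - 1   # 2147483647, largest 32-bit signed integer
LONG_MAX = 2**63 - 1  # what a 64-bit long can hold

value = 3_000_000_000  # e.g. a large ID or timestamp in milliseconds
print(value > INT_MAX)    # → True: toInteger would overflow here
print(value <= LONG_MAX)  # → True: toLong handles it fine
```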
09-03-2020
12:10 PM
1 Kudo
The ExecuteSQL processor has a property named "Max Rows Per Flow File". There you can set how many rows each flowfile should contain, and later you can merge them as you wanted, because the flowfiles get fragment attributes.
09-03-2020
08:18 AM
I also tried it with an Avro schema and the type 'timestamp-millis', but I had the problem that the milliseconds were always saved as .000 instead of .123, .987, ... So another solution for you would be to use JOLT and add "createdAt": { "$date": "${dateAttr}" } to your JSON; that converts the type to a date in MongoDB.
05-04-2020
11:43 AM
It looks like one of your tables has two or more PRIMARY KEY columns. You can check it with: SHOW CREATE TABLE MYSQL_TABLE_NAME. The problem occurs when the same value is inserted into several primary key columns, so you have to remove some of them.
05-04-2020
11:17 AM
Hi, I think your .xlsx data is the content of your flowfile and you don't have to convert it first? Then you can simply add an UpdateAttribute processor with the dynamic property "filename". With this property you overwrite the filename that is already set (you can see there is an attribute called "filename" from the start). As the value you can build the name from an attribute and append the format, e.g. ${filename:append('.xlsx')}, or hard-code it like file123.xlsx. After that, add a PutFile (or FTP, or whatever) processor and send it to the path you set there.
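The expression ${filename:append('.xlsx')} is plain string concatenation; a trivial sketch of the equivalent:

```python
def append_ext(filename: str, ext: str = ".xlsx") -> str:
    # Equivalent of NiFi's ${filename:append('.xlsx')}:
    # glue the extension onto the existing filename attribute.
    return filename + ext

print(append_ext("file123"))  # → file123.xlsx
```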
04-30-2020
12:11 AM
You can clone the processor bundle you need from https://github.com/apache/nifi/tree/master/nifi-nar-bundles into your file archive. That would solve your problem, because after every update you still have the processors you need, and if they revert to the defaults, the new code will be merged in your repo. Greets
04-23-2020
12:26 AM
1 Kudo
Maybe you should set an alias for all fields, or just for the t1/t2 ones, like:
t1.CODE_CLOCK_ID as CODE_CLOCK_ID_s26,
t1.COMPANY_CD as COMPANY_CD_s26,
t1.GROUP_SEGMENT_L1 as GROUP_SEGMENT_L1_s26,
t1.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s26,
t2.CODE_CLOCK_ID as CODE_CLOCK_ID_s27,
t2.COMPANY_CD as COMPANY_CD_s27,
t2.GROUP_SEGMENT_L1 as GROUP_SEGMENT_L1_s27,
t2.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s27
Greets
04-20-2020
03:14 AM
Could you post clean code, without the confusing line numbers and stray brackets?
03-25-2020
08:28 AM
Thanks @MattWho for your answer. My goal is to create one output with the merged attributes, so that I can notify myself about which elements have been edited. My thought was to mail myself something like: "Positions 1,2,3,5,8,13 have been changed", where 1,2,3,5,8,13 are the attribute values of the individual flowfiles. I don't know how else to solve this in one flow/notification.
03-18-2020
02:16 PM
Hello together,
I split a flowfile for each value x.
With value x I fetch value y and set it as an attribute.
When I merge the flowfiles back together, is it possible to merge the attribute values as well, comma-separated?
03-10-2020
12:58 PM
Hello, I use the QueryDatabaseTable processor to fetch the delta from an MSSQL database. My input is a SQL query based on a SQL view. In the "Maximum-value Columns" property I set a date column. My problem is that the initial fetch works and the date appears under "View state", but if I change some dates to the current one, the new rows are not fetched. Could the problem be that it is based on a view? Locally it works. The mystery is that if I clear the "View state" and run the processor again, it fetches all the data, and the new date I set now appears in the "View state". Thanks for your help.
- Tags:
- NiFi
02-18-2020
07:23 AM
1 Kudo
Maybe the version you created the template with is lower than 1.9.0 and the other one is higher? Some bundles were removed from the default distribution. You can find them here: https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-kite-bundle
02-17-2020
05:23 AM
Go to the processor and select "Scheduling". As the scheduling strategy you can use "Cron driven", and in the run schedule you can set when the processor should check for a delta. Examples:
Every 2 hours: 0 0 0/2 1/1 * ? *
Daily at 1 a.m.: 0 0 1 1/1 * ? *
Weekly at 3 p.m. every Monday: 0 0 15 ? * MON *
01-27-2020
02:52 AM
Currently it works with two decoupled flows. But you have seen the problem with the scheduling: if NiFi goes down for maintenance and is started again, the manual scheduling is cancelled and the two flows start at the same time. It seems it is not possible to use Apache Oozie for this. I need a solution that handles it in one flow, but thanks!
01-22-2020
02:25 AM
Hello everybody, I start my flow with the "QueryDatabaseTable" processor and set the "Maximum-value Columns" property to fetch the delta of new rows after an initial import. Now I need to get the delta of updated rows too, in the same flow, but I can't use "QueryDatabaseTable" again because it can't take an incoming relationship. Is there any solution to get two delta checks in one single flow?
- Tags:
- NiFi
01-16-2020
05:08 AM
You can use "ConvertAvroToJSON" and then the "EvaluateJsonPath" processor: add a property named e.g. "workingHours" and as the value set your JSON path, like $.hours. With this, the value is added as an attribute of your flowfile.
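What the $.hours path extracts, sketched with plain Python on a hypothetical record (the field names here are made up for illustration):

```python
import json

# Hypothetical flowfile content after ConvertAvroToJSON
record = json.loads('{"name": "alice", "hours": 38}')

# EvaluateJsonPath with $.hours pulls the top-level "hours" value,
# which NiFi would then store in the "workingHours" attribute.
working_hours = record["hours"]
print(working_hours)  # → 38
```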
01-16-2020
03:30 AM
You can use "QueryDatabaseTable" as the input processor, but you can't use it twice in one flow (I'm searching for a solution to get a delta for two fields in one flow myself, by the way :D). You have to set the "Maximum-value Columns" property; it can be a date, an ID, or anything else with a maximum value. Under "View state" you can see the current value of the column you set. To set up the schedule, go to "Scheduling" on the processor and change the "Run Schedule". If you need more input, please let me know.
01-15-2020
11:19 AM
Hello,
I use QueryDatabaseTable as the starting processor to fetch new rows. The main problem is that I can't use the same processor again for another field in the same flow to also fetch changed rows.
Is there a solution for this?
Thanks for your help. If you need more information, please tell me.
- Tags:
- NiFi
08-27-2019
01:42 AM
1 Kudo
You don't need a particular exception. In the catch block of the onTrigger method, add something like: session.transfer(flowFile, REL_FAILURE);