Member since: 08-18-2019
Posts: 56
Kudos Received: 11
Solutions: 18
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 641 | 11-15-2023 05:38 AM |
| | 2326 | 07-12-2023 05:38 AM |
| | 695 | 01-10-2023 01:50 AM |
| | 983 | 12-06-2022 12:08 AM |
| | 3813 | 12-02-2022 12:55 AM |
11-19-2020
10:11 AM
Is the CSV the content of your flowfile? If so, use PutDatabaseRecord and create a CSVReader as a controller service. You don't need an Avro schema if the column names of the CSV and the table are the same.
11-12-2020
11:21 AM
How many flowfiles do you get after splitting? As an example for your processor properties: if you get around 10 flowfiles, set 'Minimum Number of Entries' to a high value like 1000 (effectively a maximum) and set 'Minimum Bin Age' to something like 10 sec. If the first flowfile arrives with attribute value ABC, it is held in a bin and the 10-second timer starts; any flowfile with the same attribute value that arrives within that time goes into the same bin. The timer runs separately for each bin (in your case 3). 'Minimum Number of Entries' is effectively ignored because the 'Minimum Bin Age' flushes the bin first.
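As a rough sketch, the relevant MergeContent properties could look like this (the correlation attribute name is a placeholder for whatever attribute you split on):

```
Merge Strategy             : Bin-Packing Algorithm
Correlation Attribute Name : yourAttribute    <- hypothetical; use your own attribute
Minimum Number of Entries  : 1000             <- high on purpose, acts as an upper bound
Minimum Bin Age            : 10 sec           <- flushes each bin after 10 seconds
Maximum Number of Bins     : 5                <- must cover the distinct attribute values
```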
11-12-2020
10:52 AM
I don't know if it is the best solution, but if you know the keys of the JSON, you can use the EvaluateJsonPath processor to put the values into flowfile attributes. Then use the AttributesToJSON processor and set the JSON key and the value with expression language, e.g. ${attr:trim()}.
11-12-2020
10:45 AM
The problem is that your value is larger than the maximum of a 32-bit integer. Change "=toInteger" to "=toLong".
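In a Jolt modify spec, that change could look like this (the field name "myValue" is a placeholder):

```json
[
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "myValue": "=toLong"
    }
  }
]
```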
09-03-2020
12:10 PM
1 Kudo
The ExecuteSQL processor has a property named "Max Rows Per Flow File". There you can set how many rows each flowfile should contain, and later you can merge them the way you wanted, because the flowfiles get fragment attributes.
09-03-2020
08:18 AM
I also tried it with an Avro schema and the type 'timestamp-millis', but I had the problem that the milliseconds always got saved as .000 instead of .123, .987, etc. Another solution for you would be to use Jolt and add "createdAt": { "$date": "${dateAttr}" } to your JSON; the $date wrapper makes MongoDB store the value as a date.
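As a sketch, a Jolt default spec adding that field could look like this (the attribute name dateAttr comes from the example above; the exact spec shape is an assumption for your flow):

```json
[
  {
    "operation": "default",
    "spec": {
      "createdAt": {
        "$date": "${dateAttr}"
      }
    }
  }
]
```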
05-04-2020
11:43 AM
It looks like one of the tables has two or more PRIMARY KEY columns. You can check it with: SHOW CREATE TABLE MYSQL_TABLE_NAME. The problem occurs if the same value is entered for several primary key columns, so you have to remove some of them.
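As a sketch in MySQL, the check and a possible fix could look like this (table and column names are placeholders, and which column to keep in the key depends on your data):

```sql
SHOW CREATE TABLE MYSQL_TABLE_NAME;
-- if the output shows a composite key, e.g.:
--   PRIMARY KEY (`col_a`, `col_b`)
-- you can reduce it to a single column:
ALTER TABLE MYSQL_TABLE_NAME
  DROP PRIMARY KEY,
  ADD PRIMARY KEY (`col_a`);
```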
05-04-2020
11:17 AM
Hi, I think your .xlsx data is already the content of your flowfile, so you don't have to convert it first? If so, you can simply add an UpdateAttribute processor with the dynamic property "filename". This property overwrites the filename that is already set (there is an attribute called "filename" on every flowfile from the beginning). As the value you can build the name with expression language and append the extension, e.g. ${filename:append('.xlsx')}, or hard-code it like file123.xlsx. After that, add a PutFile (or PutFTP, or whatever) processor and send the file to the path you configure there.
04-30-2020
12:11 AM
You can clone the processor bundle you need from https://github.com/apache/nifi/tree/master/nifi-nar-bundles into your own repository. That would solve your problem: after every update you still have the processors you need, and if they come back to default, the new code can be merged into your repo. Greets
04-23-2020
12:26 AM
1 Kudo
Maybe you should set an alias for all fields, or just the t1/t2 ones, like:

t1.CODE_CLOCK_ID as CODE_CLOCK_ID_s26,
t1.COMPANY_CD as COMPANY_CD_s26,
t1.GROUP_SEGMENT_L1 as GROUP_SEGMENT_L1_s26,
t1.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s26,
t2.CODE_CLOCK_ID as CODE_CLOCK_ID_s27,
t2.COMPANY_CD as COMPANY_CD_s27,
t2.GROUP_SEGMENT_L1 as GROUP_SEGMENT_L1_s27,
t2.GROUP_SEGMENT_CD as GROUP_SEGMENT_CD_s27

Greets