Member since: 07-13-2020
Posts: 58
Kudos Received: 2
Solutions: 10
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1951 | 09-04-2020 12:33 AM |
| | 10928 | 08-25-2020 12:39 AM |
| | 4091 | 08-24-2020 02:40 AM |
| | 2955 | 08-21-2020 01:06 AM |
| | 1809 | 08-20-2020 02:46 AM |
07-14-2021
12:36 AM
Is it possible to show exactly what the character is? With the logical type string, it should accept any character as long as it is not an invalid or garbage value.
07-05-2021
11:13 PM
Hi, if you know the exact number of files to be transferred, then yes. But if the count is variable, then since NiFi is a flow tool there is no concept of a "last file". You can send an email if you find any files in the failure relationship; this way you will know whether the transfer went well or not. If this seems too obvious, please share a sketch of what you wish to implement for a better solution.
07-05-2021
05:13 AM
It seems like DB2 is using a data type that is not recognizable to Avro. You can try setting Use Avro Logical Types to false in QueryDatabaseTable and then parse the data correctly within the flow. If you find the answer helpful, please accept this as a solution.
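As a minimal sketch (assuming a recent NiFi version, where the relevant QueryDatabaseTable property is named Use Avro Logical Types):

```
QueryDatabaseTable
  Use Avro Logical Types: false
  # Timestamps/decimals are then written as plain strings/numbers in the Avro output,
  # so they can be parsed explicitly by downstream processors instead of failing on
  # an unrecognized logical type.
```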
07-05-2021
04:52 AM
Hi. For a single email alert, use a MergeContent processor (on some condition) before PutEmail. If you have a definite number of files, then in MergeContent you can set Minimum Number of Entries to 10. If not, then the only way is time-bound. For the transfer check, you will have to provide more info here. Where is it transferred? Does NiFi handle the transfer? If NiFi handles the transfer, failed files will appear in the failure relationship, so that should tell you whether the transfer of all files was successful or not. If you find the answer helpful, please accept this as a solution.
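For reference, a minimal MergeContent configuration for the fixed-count case might look like this (the values are illustrative, and Max Bin Age is the time-bound fallback mentioned above):

```
MergeContent
  Merge Strategy: Bin-Packing Algorithm
  Minimum Number of Entries: 10
  Maximum Number of Entries: 10
  Max Bin Age: 5 min   # flush the bin anyway if fewer than 10 files arrive in time
```

The merged relationship then feeds PutEmail, so one email is sent per bin instead of one per file.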
07-05-2021
02:17 AM
It seems that the Avro logical types are not matching. Please double-check the data types. If you find the answer helpful, please accept this as a solution.
09-08-2020
07:34 AM
Then I'm afraid it's difficult to do so. I don't understand how you are feeding the queries to ExecuteSQL. Maybe it's better to feed ExecuteSQL with manageable queries. If you use GenerateTableFetch, it allows you to break a big query into smaller queries like you want and feed them to ExecuteSQL. Hope this helps. Please do post back on how you managed to move forward.
09-07-2020
04:33 AM
Why not use the second option I mentioned above? Use SplitContent or SplitRecord, and then merge later whenever you want.
09-04-2020
12:33 AM
Hi, you can create a single flow as long as you can distinguish the files, e.g. by the filename. You can use RouteOnAttribute and load each into a different table. Hope this helps. If the comment helps you to find a solution or move forward, please accept it as a solution for other community members.
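A minimal sketch of the routing step, assuming the filenames carry a distinguishing prefix (the property names and prefixes here are hypothetical):

```
RouteOnAttribute
  Routing Strategy: Route to Property name
  # dynamic properties, one per target table, using NiFi Expression Language:
  to_table_a: ${filename:startsWith('a_')}
  to_table_b: ${filename:startsWith('b_')}
```

Each dynamic property becomes a relationship, and each relationship can then feed its own insert path (e.g. a separate PutDatabaseRecord per table).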
09-04-2020
12:12 AM
You should use ListDatabaseTables and GenerateTableFetch to perform an incremental load. If you are joining the tables, you can do a ReplaceText after GenerateTableFetch to add the join query and then feed the flowfile to ExecuteSQL. You can also limit the amount of data per query in GenerateTableFetch. Alternatively, you can use SplitRecord/SplitContent to split the single Avro file into multiple smaller files and then use MergeContent to merge them back if required. Hope this helps. If the comment helps you to find a solution or move forward, please accept it as a solution for other community members.
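The ReplaceText step could be sketched like this, assuming a NiFi version where GenerateTableFetch writes the generatetablefetch.* attributes; the table and column names are hypothetical:

```
ReplaceText (after GenerateTableFetch)
  Evaluation Mode: Entire text
  Replacement Strategy: Always Replace
  Replacement Value:
    SELECT a.*, b.extra_col
    FROM ${generatetablefetch.tableName} a
    JOIN other_table b ON a.id = b.id
    WHERE ${generatetablefetch.whereClause}
```

This rebuilds the generated incremental query as a join while keeping the incremental WHERE clause produced by GenerateTableFetch, and the resulting flowfile content goes straight into ExecuteSQL.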
08-25-2020
06:33 AM
Unfortunately, I haven't used Parquet at all. I would assume 'not a data file' could mean either the file doesn't have the schema embedded, or the file is not in the correct format (the conversion didn't work).