Member since: 07-29-2020
Posts: 574
Kudos Received: 323
Solutions: 176

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2143 | 12-20-2024 05:49 AM |
| | 2438 | 12-19-2024 08:33 PM |
| | 2189 | 12-19-2024 06:48 AM |
| | 1456 | 12-17-2024 12:56 PM |
| | 2082 | 12-16-2024 04:38 AM |
01-31-2023
10:16 AM
If you are using NiFi 1.16 or higher, I would also refer you to the ForkEnrichment & JoinEnrichment processors, which can help with what you are trying to do. I think you can use those processors regardless of whether you read the lookup dataset directly from HTTP or after you have loaded it into the DistributedMapCache: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.12.1/org.apache.nifi.processors.standard.JoinEnrichment/additionalDetails.html Hope that helps.
01-31-2023
08:04 AM
Hi, Before trying to answer your question, I'm trying to understand how you are planning to populate the lookup dataset in "memory". Are you thinking of doing it manually, in case you have a limited number of lookups, as you would with "SimpleKeyValueLookupService", or do you just want to read it once from a file/db and populate it into some lookup service like "DistributedMapCacheLookupService"?
01-30-2023
02:34 PM
1 Kudo
Hi, There are multiple ways you can resolve this issue without having to use all the processors you mentioned above. Some options: 1- QueryRecord 2- UpdateRecord 3- JoltTransformJSON. If you elect to use QueryRecord, which is what has been suggested for such an anti-pattern, then you can create a new dynamic property on the QueryRecord to represent the new relationship, which will hold the result of the select statement with the new key field, as follows. The select query syntax for the "AddKeyRel" property: select *, airport||'/'||runway as key from FlowFile The || symbol is used for concatenation in this query and processor. The JsonTreeReader and the JsonRecordSetWriter are configured as follows: If that helps, please accept the solution. Thanks
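To illustrate what that query produces, here is a minimal plain-Python sketch of the same key-building logic. The sample records are invented; only the "airport" and "runway" field names come from the query above.

```python
# Invented sample records (not from the original question).
records = [
    {"airport": "JFK", "runway": "4L"},
    {"airport": "LAX", "runway": "25R"},
]

# Equivalent of: select *, airport||'/'||runway as key from FlowFile
enriched = [{**r, "key": f"{r['airport']}/{r['runway']}"} for r in records]

print(enriched[0]["key"])  # JFK/4L
```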
01-26-2023
10:10 AM
Hi, Not sure what the problem is with the current situation. Usually when the first process group finishes and sends the flowfile to the second process group, this flowfile will be dropped when executing the "GenerateTableFetch" in the second group, and new flowfiles will be generated based on that. Another option, if you don't want any relationships between the first and second process groups through output/input ports and want to make them completely independent, with no flowfiles transferred between them, is to use the NiFi REST API to trigger the second group's GenerateTableFetch from the first group using the InvokeHTTP processor. Hope that helps. Thanks
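As a sketch of the REST call InvokeHTTP would make, the snippet below builds a request against NiFi's processor run-status endpoint. The host and processor id are placeholders, and the revision version must match the processor's current revision; treat this as an illustration, not a drop-in configuration.

```python
import json

# Placeholders -- substitute your NiFi host and the id of the
# GenerateTableFetch processor in the second process group.
processor_id = "<generate-table-fetch-id>"
url = f"https://nifi-host:8443/nifi-api/processors/{processor_id}/run-status"

# Body to send with HTTP Method = PUT and Content-Type: application/json
# configured on InvokeHTTP.
payload = {
    "revision": {"version": 0},  # must match the processor's current revision
    "state": "RUNNING",
}
body = json.dumps(payload)

print(url)
print(body)
```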
01-26-2023
06:22 AM
Hi, You don't need to know a programming language to learn Expression Language. It can help, but it's not required.
01-25-2023
12:42 PM
Hi, I think the right syntax would be: ${field.value:replace('a', 'i'):replace('e', 'u'):replace(...)} Hope that helps. Thanks
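As a plain-Python sketch of what the chained replace() calls do (Expression Language applies them left to right, so earlier replacements feed into later ones); the input string is invented:

```python
# Invented input value; NiFi EL replace() chains behave like these
# Python string calls applied in sequence.
value = "banana cake"
result = value.replace("a", "i").replace("e", "u")

print(result)  # binini ciku
```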
01-24-2023
08:57 AM
Hi, Try the following: [
{
"operation": "shift",
"spec": {
"*": {
"*": "[#2].&",
"$": "[#2].product"
}
}
}
] If that helps, please accept the solution. Thanks
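To show what the spec above does, here is a hand-worked before/after using invented product names and fields (I have not run this through Jolt, so treat it as an illustration): each top-level key ends up as a "product" field on the corresponding array element.

```python
# Invented sample input; the product and field names are assumptions.
sample_input = {
    "Apple":  {"price": 1, "qty": 2},
    "Banana": {"price": 3, "qty": 4},
}

# Hand-worked result of the shift spec above: "[#2].&" copies each field
# onto an array element, and "$": "[#2].product" writes the top-level key
# into a "product" field on the same element.
expected_output = [
    {"price": 1, "qty": 2, "product": "Apple"},
    {"price": 3, "qty": 4, "product": "Banana"},
]

print(expected_output[0]["product"])  # Apple
```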
01-21-2023
08:22 AM
Hi, I don't think you can catch the SQL error, in the sense that PutSQL won't report the error. However, you can use PutDatabaseRecord instead and route the failure relationship to LogMessage, where you can access the error message using the "putdatabaserecord.error" attribute. A better way of capturing errors from any processor (a global catch) is to use the SiteToSiteBulletinReportingTask, as explained here: SiteToSiteBulletinReportingTask If that helps, please accept the solution. Thanks
01-18-2023
12:26 PM
Hi, I was able to obtain the required result using the following processors: 1- SplitText: this helps you split each JSON record into its own flowfile. 2- UpdateRecord: this is used to update the date fields and convert them to the required format using a JSON Record Reader/Writer. The value used to convert the time for each field: ${field.value:toDate("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"):format("yyyy-MM-dd HH:mm:ss.SSS")} More info on UpdateRecord: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.7.1/org.apache.nifi.processors.standard.UpdateRecord/additionalDetails.html Note: The only problem I noticed is that null values will be converted to "". Not sure if that will cause you a problem, but you can use ReplaceText or Jolt to convert the values back to null. If you need the records to be merged back together before inserting into Hive, you can use the MergeRecord processor. If that helps, please accept the solution. Thanks
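A plain-Python sketch of the same conversion the UpdateRecord expression performs, using an invented timestamp (the input and output patterns mirror the toDate/format call above):

```python
from datetime import datetime

raw = "2023-01-18T12:26:00.123Z"  # invented example value

# Parse the ISO-8601-style input ("%f" reads the .123 fractional part,
# the trailing Z is matched literally), then re-format with a space
# separator and millisecond precision.
dt = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%fZ")
out = dt.strftime("%Y-%m-%d %H:%M:%S") + f".{dt.microsecond // 1000:03d}"

print(out)  # 2023-01-18 12:26:00.123
```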
01-17-2023
11:16 AM
Hi, Not sure if this is exactly what you are looking for, but this should give you the expected output for the sample you provided: [
  // Capture the matched keys at each level as error, product,
  // and ErrorType, and the matched value as q.
{
"operation": "shift",
"spec": {
"content": {
"*": {
"*": {
"*": {
"$": "error",
"$1": "product",
"$2": "ErrorType",
"@": "q"
}
}
}
}
}
}
] If that helps, please accept the solution. Thanks