Member since: 09-21-2021
Posts: 8
Kudos Received: 0
Solutions: 0
01-12-2024
12:33 AM
I tried using this custom processor, but when I set tableName using ${db.table.name} like this, the result comes out like this. Can you help me? The same thing happens when using $(unknown): the result stays $(unknown).
08-07-2023
05:26 AM
Yes, you can try it like that, or like: format( /datetime, "yyyy/MM/dd/HH", "Asia/Jakarta" ). But you will need to switch the Replacement Value Strategy from Literal Value to Record Path Value.
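Outside NiFi, the effect of that RecordPath format() call (render a UTC timestamp as a yyyy/MM/dd/HH path in the Asia/Jakarta zone) can be sketched in Python. The helper name is hypothetical, and this assumes Python 3.9+ for zoneinfo:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

def to_partition_path(dt_utc: datetime) -> str:
    """Format a UTC datetime as a yyyy/MM/dd/HH path in Asia/Jakarta time (UTC+7)."""
    local = dt_utc.astimezone(ZoneInfo("Asia/Jakarta"))
    return local.strftime("%Y/%m/%d/%H")

dt = datetime(2023, 8, 7, 5, 26, tzinfo=timezone.utc)
print(to_partition_path(dt))  # 2023/08/07/12
```

Note the hour shifts by +7, which is why the timezone argument matters when building HDFS partition paths.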
01-18-2023
12:26 PM
Hi, I was able to obtain the required result using the following processors:
1- SplitText: splits each JSON record into its own flowfile.
2- UpdateRecord: updates the date fields and converts them to the required format, using a JSON Record Reader/Writer. The value used to convert the time for each field: ${field.value:toDate("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"):format("yyyy-MM-dd HH:mm:ss.SSS")}
More info on UpdateRecord: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.7.1/org.apache.nifi.processors.standard.UpdateRecord/additionalDetails.html
Note: the only problem I noticed is that null values will be converted to "". Not sure if that will cause you a problem, but you can use ReplaceText or a JOLT transform to convert the values back to null. If you need the records to be merged back together before inserting into Hive, you can use the MergeRecord processor. If that helps, please accept the solution. Thanks
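The toDate/format conversion above can be sketched outside NiFi as well. This is a minimal Python equivalent (hypothetical helper name), which also shows the null-handling point: keep None as None instead of letting it collapse to an empty string:

```python
from datetime import datetime

def reformat_timestamp(value):
    """Convert "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" to "yyyy-MM-dd HH:mm:ss.SSS"."""
    if value is None:
        return None  # preserve nulls rather than emitting ""
    parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%fZ")
    # strftime's %f is 6 digits (microseconds), so truncate to milliseconds by hand
    return parsed.strftime("%Y-%m-%d %H:%M:%S.") + f"{parsed.microsecond // 1000:03d}"

print(reformat_timestamp("2023-01-18T12:26:05.123Z"))  # 2023-01-18 12:26:05.123
```

In the NiFi flow itself the same logic runs per field inside UpdateRecord; the sketch is just to make the format strings concrete.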
10-10-2022
10:43 PM
Perhaps the connection you set between PutHDFS and UpdateHive3Table doesn't send the original flowfile? I am a bit confused by your flow in general: why convert to Avro? Where are you reading files from? And why use PutHDFS followed by UpdateHive3Table instead of just using PutHiveQL?
10-04-2021
09:39 PM
Hello @RyanCicak, I'm trying this flow but it doesn't work for me. This is my flow. What should I do? Thanks