I am facing an issue while extracting data from an Oracle DB through the QueryDatabaseTable processor. We have defined the batch_ts column as Timestamp in Hive, and we are trying to insert data from Oracle using the QueryDatabaseTable processor, but it inserts a NULL value for the batch_ts column even though we have defined a parameter for this column. Below is the design.
In the UpdateAttribute processor we can see the batch_ts value.
Let me know if you have any suggestions on this.
What processor are you using to put the data into Hive? Right now it looks like you have Avro in the flow file content (the columns coming out of the Oracle DB via QueryDatabaseTable) and your batch_ts value as an attribute (aka metadata) on the flow file. You'll need to be able to inject the batch_ts value into your record(s).
I don't believe your version of NiFi has UpdateRecord, but if it does, you can use that to add a "batch_ts" field to each of the Avro records in the flow file. If you don't have UpdateRecord, you can convert the Avro to JSON and use JoltTransformJSON to put the "batch_ts" field into each record.
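To make the two options above concrete: with UpdateRecord you would add a dynamic property whose name is the record path `/batch_ts` and whose value is `${batch_ts}` (with Replacement Value Strategy set to "Literal Value"), so the attribute value is written into every record. For the JoltTransformJSON route, a minimal spec sketch is below, assuming ConvertAvroToJSON has produced a JSON array of records and that your attribute is named `batch_ts` (JoltTransformJSON evaluates Expression Language in the spec):

```json
[
  {
    "operation": "default",
    "spec": {
      "*": {
        "batch_ts": "${batch_ts}"
      }
    }
  }
]
```

The `"*"` matches each element of the top-level array, and the `default` operation adds a `batch_ts` field to each record only if it isn't already present.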
Hi @Matt Burgess,
We are using the PutHDFS processor, and after that we use a ReplaceText processor to execute a LOAD DATA INPATH command. I can see that we do have the UpdateRecord processor, so per your suggestion, can I use this processor after the ConvertAvroToORC processor?
Below is the flow detail.