Created 09-27-2017 11:38 AM
I am ingesting data from Oracle to Hive using Sqoop. I want to know whether I can use NiFi to verify that the number of rows in the Oracle table is the same as the number of rows in the target table after ingestion.
Created on 09-27-2017 12:01 PM - edited 08-17-2019 11:52 PM
Created 09-27-2017 11:42 AM
Sure, you can do that with the MergeContent processor. If you are using only a source and a target, set the processor's Minimum Number of Entries and Maximum Number of Entries properties to 2, and also specify a Correlation Attribute Name to drive the merge.
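To make the idea concrete, here is a minimal sketch (plain Python, not NiFi code) of the pairing logic described above: flowfiles carrying a row count are grouped by a correlation attribute, and once a group reaches the configured number of entries (2 here), the pair is merged and the counts compared. The attribute names (`table.name`, `row.count`) are hypothetical.

```python
from collections import defaultdict

def pair_and_compare(flowfiles, correlation_attr="table.name", entries=2):
    """Group flowfiles by a correlation attribute; when a group reaches
    `entries` members, compare their 'row.count' attributes."""
    bins = defaultdict(list)
    results = {}
    for ff in flowfiles:
        key = ff[correlation_attr]
        bins[key].append(ff)
        if len(bins[key]) == entries:
            # one count from the source query, one from the target query
            counts = {int(f["row.count"]) for f in bins.pop(key)}
            results[key] = (len(counts) == 1)  # True if source == target
    return results

# hypothetical example: one Oracle count and one Hive count per table
flowfiles = [
    {"table.name": "EMP",  "source": "oracle", "row.count": "1000"},
    {"table.name": "EMP",  "source": "hive",   "row.count": "1000"},
    {"table.name": "DEPT", "source": "oracle", "row.count": "42"},
    {"table.name": "DEPT", "source": "hive",   "row.count": "41"},
]
print(pair_and_compare(flowfiles))  # {'EMP': True, 'DEPT': False}
```

In the real flow, the two counts would arrive as separate flowfiles from ExecuteSQL (Oracle) and a Hive query, and the correlation attribute would be something shared by both, such as the table name.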
Created 09-27-2017 11:46 AM
Can you help me with some example on how to do it
Created on 09-27-2017 12:01 PM - edited 08-17-2019 11:52 PM
Created 03-03-2020 11:39 PM
Can you please post the template? I am trying to solve the same problem, and it would be a great help.
Created 09-27-2017 04:34 PM
@Aneena Paul how much data is being moved as part of the Sqoop job? If the volume is not too high, why not simply use NiFi for moving the data from Oracle to Hive? NiFi can easily handle anything in the GB range for daily/hourly jobs. A simple flow would be GenerateTableFetch -> RPG -> ExecuteSQL -> PutHDFS.
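For intuition on the first processor in that flow: GenerateTableFetch emits one SQL statement per partition of the source table, sized by a configured page size, so downstream ExecuteSQL instances can fetch the table in parallel chunks. The sketch below (plain Python, table and column names hypothetical) shows the kind of paged statements it produces.

```python
import math

def generate_table_fetch(table, row_count, page_size, order_col="ID"):
    """Return paged SELECT statements covering `row_count` rows,
    analogous to the statements GenerateTableFetch emits."""
    pages = math.ceil(row_count / page_size)
    return [
        f"SELECT * FROM {table} ORDER BY {order_col} "
        f"OFFSET {i * page_size} ROWS FETCH NEXT {page_size} ROWS ONLY"
        for i in range(pages)
    ]

# 2500 rows with a page size of 1000 yields three paged queries
for stmt in generate_table_fetch("EMP", row_count=2500, page_size=1000):
    print(stmt)
```

Each generated statement would then be routed (via the Remote Process Group) to an ExecuteSQL processor, whose results land in HDFS via PutHDFS.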
Created 09-27-2017 04:35 PM
This will give you provenance in NiFi, which provides confirmation of how much data (in bytes) was extracted and sent to HDFS, so there is no need for this additional check.