I have data in an old database table emp (id, name, address, actn_ts) and I want to move it to a new AWS database table emp (id, name, address). The connection and net-change process are in place, and the initial data has already been migrated. Now I want to check whether any records are missing between the old emp table and the new one.
I have tried ExecuteSQL with DetectDuplicate. I want to compare the data over a few days to confirm it stays in sync, and if something is missing in the new DB, identify that id and load it into the new DB again.
If you only need to do this once, you can simply write a query that fetches all records later than a certain timestamp.
If you want to do this continuously, or there is no query that gives you the new records, you are looking for Change Data Capture (CDC). I have seen a simple CDC processor for MySQL, but in general you may want a dedicated CDC tool to help NiFi pick up the changes.
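The timestamp-based approach can be sketched as follows. This is a minimal illustration using an in-memory SQLite stand-in (the real source would be queried via ExecuteSQL); table and column names follow the question, and the cutoff value is a placeholder that would normally come from NiFi state, e.g. the maximum-value column tracked by QueryDatabaseTable:

```python
import sqlite3

# In-memory stand-in for the source emp table (names from the question;
# actn_ts is the change timestamp used to pick up new records).
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, address TEXT, actn_ts TEXT)"
)
con.executemany("INSERT INTO emp VALUES (?, ?, ?, ?)", [
    (1, "Ann", "12 Oak St", "2024-01-01 09:00:00"),
    (2, "Bob", "34 Elm St", "2024-01-02 09:00:00"),
    (3, "Cat", "56 Fir St", "2024-01-03 09:00:00"),
])

# Incremental pull: everything changed after the last successful run.
# The cutoff is hard-coded here for illustration only.
cutoff = "2024-01-01 12:00:00"
rows = con.execute(
    "SELECT id, name, address FROM emp WHERE actn_ts > ? ORDER BY id",
    (cutoff,),
).fetchall()
print(rows)  # → [(2, 'Bob', '34 Elm St'), (3, 'Cat', '56 Fir St')]
```

Only the rows changed after the cutoff are returned, so each hourly run moves just the delta rather than re-reading the whole table.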
This is a continuous activity: every hour I want to compare data between source and destination, and if data is missing, insert it through the flow.
1. ExecuteSQL flow for the source database, DB2: A (name, add, actn_by)
2. ExecuteSQL flow for the destination database, SQL: A (name, add)
3. While moving data from source to destination, we want to compare every hour what is in DB2.A but missing from SQL.A
Can you help with a solution and an example?
That would help me resolve issues and monitor the data.
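The hourly reconciliation step described above can be sketched like this. It is only a sketch: in-memory SQLite databases stand in for DB2 and the AWS target, the emp schema from the original question is assumed, and `missing_ids` is a hypothetical helper; in a real flow the two id lists would come from two ExecuteSQL results and the re-insert would go through PutSQL or a similar processor:

```python
import sqlite3

def missing_ids(src, dst):
    """Return ids present in the source emp table but absent from the destination."""
    src_ids = {r[0] for r in src.execute("SELECT id FROM emp")}
    dst_ids = {r[0] for r in dst.execute("SELECT id FROM emp")}
    return sorted(src_ids - dst_ids)

# Stand-in source database (assumption: SQLite in place of DB2).
src = sqlite3.connect(":memory:")
src.execute(
    "CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, address TEXT, actn_ts TEXT)"
)
src.executemany("INSERT INTO emp VALUES (?, ?, ?, ?)", [
    (1, "Ann", "12 Oak St", "2024-01-01 09:00:00"),
    (2, "Bob", "34 Elm St", "2024-01-02 09:00:00"),
    (3, "Cat", "56 Fir St", "2024-01-03 09:00:00"),
])

# Stand-in destination database; ids 2 and 3 never arrived.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, address TEXT)")
dst.execute("INSERT INTO emp VALUES (1, 'Ann', '12 Oak St')")

gap = missing_ids(src, dst)
print(gap)  # → [2, 3]

# Re-load only the missing rows into the destination.
for mid in gap:
    row = src.execute(
        "SELECT id, name, address FROM emp WHERE id = ?", (mid,)
    ).fetchone()
    dst.execute("INSERT INTO emp VALUES (?, ?, ?)", row)
dst.commit()
```

Running this every hour keeps the destination converging on the source: the set difference on id identifies the gap, and only those rows are re-inserted, so an already-synced hour is a no-op.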