Handling inserts, updates and deletes (CDC) in Hadoop

Hello,

I have a requirement to process and store changes from an Oracle database into a Hadoop data lake.

GoldenGate replicates the changes and drops files directly into HDFS (I don't see an option to write to a Hive table directly). Each file will have inserts, updates, and deletes.

How can I quickly process the changes into a Hive table? Is Spark an option for processing the files locally?

I need to capture the changes from individual files into Hive and also denormalize multiple transaction tables (e.g., Header, Detail) into one for faster querying.
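For illustration, this is the kind of flattened table I'm after; the order_header / order_detail names and columns are just hypothetical:

-- Hypothetical header/detail tables flattened into one wide table for querying.
CREATE TABLE orders_flat STORED AS ORC AS
SELECT h.order_id,
       h.customer_id,
       h.order_date,
       d.line_no,
       d.product_id,
       d.quantity,
       d.amount
FROM order_header h
JOIN order_detail d
  ON d.order_id = h.order_id;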

Has anyone implemented a good solution for this kind of problem in Hadoop? Please share suggestions.

Thanks again

Re: Handling inserts, updates and deletes (CDC) in Hadoop

@cskbhatt
  1. Land the change-only data coming from GoldenGate on HDFS and expose it as a temp/staging Hive table.
  2. Use Hive's MERGE functionality to apply the inserts/updates/deletes to the target table. This article has all the details on how to use MERGE, and a minimal sketch follows this list.
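Here is a minimal sketch of that pattern. The customer/customer_stage table names, their columns, the file delimiter, and the op_type flag ('I' = insert, 'U' = update, 'D' = delete) are all assumptions for illustration, not something GoldenGate mandates:

-- External staging table over the directory where GoldenGate drops files
-- (name, columns and delimiter are assumptions for illustration).
CREATE EXTERNAL TABLE customer_stage (
  op_type    STRING,     -- 'I' = insert, 'U' = update, 'D' = delete
  id         BIGINT,
  name       STRING,
  updated_at TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/goldengate/customer';

-- Target must be an ACID table: bucketed, stored as ORC, transactional=true.
CREATE TABLE customer (
  id         BIGINT,
  name       STRING,
  updated_at TIMESTAMP
)
CLUSTERED BY (id) INTO 8 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');

-- Apply the whole change set in one statement.
MERGE INTO customer AS t
USING customer_stage AS s
ON t.id = s.id
WHEN MATCHED AND s.op_type = 'D' THEN DELETE
WHEN MATCHED AND s.op_type = 'U'
  THEN UPDATE SET name = s.name, updated_at = s.updated_at
WHEN NOT MATCHED AND s.op_type = 'I'
  THEN INSERT VALUES (s.id, s.name, s.updated_at);

One caveat: MERGE allows at most one source row per matched target row, so if a single file can carry several operations for the same key, collapse the staging data to the latest change per key first (e.g., with ROW_NUMBER() over the key, ordered by a change timestamp).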

But there are a couple of gotchas for that!

  1. You need either HDP 2.6 or Hive 2.2 running on your cluster. The MERGE functionality ships in Hive 2.2, but Hortonworks has backported it to Hive 1.2 in HDP 2.6.
  2. Your target table(s) must be transactional; see the ACID settings sketched after this list.
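For reference, these are the commonly documented settings needed for Hive ACID tables (normally configured in hive-site.xml; shown here as session SET commands for brevity):

-- Transaction manager and locking, required for ACID tables
SET hive.support.concurrency=true;
SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
SET hive.exec.dynamic.partition.mode=nonstrict;
-- Hive 1.x only; bucketing is always enforced in Hive 2.x
SET hive.enforce.bucketing=true;
-- Compaction daemons, normally enabled on the metastore side
SET hive.compactor.initiator.on=true;
SET hive.compactor.worker.threads=1;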

Let me know if that helps!