Ingestion: How to ingest data from Oracle Database to Kafka?

The customer wants to ingest data from an Oracle database into Kafka. It appears that Sqoop2 supports ingesting data to Kafka, but since we don't have Sqoop2 support yet, the customer is looking at using Logstash to ingest the data. Are there any better options available?



Linking your cross-post in another space, where a discussion was already going.


Also keep in mind that the ExecuteSQL processor in NiFi converts the data to Avro, which is effectively Kafka's native format (minus a Schema Registry). Using NiFi as your primary extraction tool also means you are not locked into Sqoop2's static command-line format and single producer/consumer model.
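As a rough, stdlib-only sketch of the extract-and-publish loop such a flow performs (the table, topic name, and stubbed producer are hypothetical; in a real deployment NiFi's ExecuteSQL and PublishKafka processors do this work, and JSON stands in for Avro here just to keep the sketch dependency-free):

```python
import json
import sqlite3  # stands in for the Oracle connection in this sketch

def fetch_rows(conn, query, batch_size=10):
    """Run the query and yield result rows in batches of batch_size."""
    cur = conn.execute(query)
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        yield batch

def publish(producer, topic, rows):
    """Serialize each row and hand it to the producer callable."""
    for row in rows:
        producer(topic, json.dumps(row).encode("utf-8"))

# Demo with an in-memory table and a plain list acting as the "Kafka" sink.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 3.0)])

sent = []
producer = lambda topic, value: sent.append((topic, value))
for batch in fetch_rows(conn, "SELECT id, amount FROM orders"):
    publish(producer, "oracle.orders", batch)
```

The batching in `fetch_rows` mirrors the incremental-fetch behavior you would configure on ExecuteSQL, rather than pulling the whole table into memory at once.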

+1 on NiFi. If the customer does not want to go that route (which they should, because it is probably the most elegant solution), the other option would be to export the data from Oracle to the local file system using an export utility, then configure a Flume agent to listen on the export directory and use a Kafka sink to place the data on Kafka. Depending on the downstream processing, Flume can chunk the data into appropriate sizes (1 row, 10 rows, n rows).
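A minimal Flume agent along those lines might look like the following (the agent name, spool directory, broker address, and topic are all hypothetical; property names follow the Flume 1.6+ spooling-directory source and Kafka sink):

```properties
# Hypothetical agent "a1": watch an export directory, publish to Kafka.
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Spooling-directory source: picks up completed Oracle export files.
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /var/exports/oracle
a1.sources.r1.channels = c1

# In-memory channel buffering events between source and sink.
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Kafka sink; flumeBatchSize controls how many events go per batch.
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = broker1:9092
a1.sinks.k1.kafka.topic = oracle-data
a1.sinks.k1.flumeBatchSize = 10
a1.sinks.k1.channel = c1
```

Note that the spooling-directory source requires export files to be complete (and not rewritten) before they land in the spool directory, which fits the export-then-ingest pattern described above.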
