Ingestion: How to ingest data from Oracle Database to Kafka?

A customer wants to ingest data from an Oracle Database into Kafka. It appears that Sqoop2 supports ingesting data into Kafka, but we don't have Sqoop2 support yet, so the customer is looking at using Logstash to ingest the data. Are there any better options available?

3 REPLIES

Linking your cross-post in another space, where a discussion was already going: http://community.hortonworks.com/questions/953/can-nifi-be-used-to-pipe-the-data-from-oracle-data.ht...

Additionally, keep in mind that the ExecuteSQL processor in NiFi converts the data to Avro, which is effectively Kafka's native format (minus a Schema Registry). Using NiFi as your primary extraction tool also means you are not locked into Sqoop2's static command-line format and single producer/consumer model. A rough sketch of what that flow amounts to is below.
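
For context, here is a rough Java sketch (not NiFi itself, just an illustration of what an ExecuteSQL -> PublishKafka flow does): read rows over JDBC, encode each one as Avro, and publish the bytes to a topic. The connection URL, credentials, table, topic, and two-field schema are made-up placeholders, and the Oracle JDBC driver is assumed to be on the classpath.

```java
import java.io.ByteArrayOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OracleToKafkaSketch {
    // Hypothetical two-column schema, standing in for what ExecuteSQL
    // would derive automatically from the query's result set metadata.
    private static final String AVRO_SCHEMA =
        "{\"type\":\"record\",\"name\":\"Row\",\"fields\":["
      + "{\"name\":\"id\",\"type\":\"long\"},"
      + "{\"name\":\"name\",\"type\":\"string\"}]}";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(AVRO_SCHEMA);

        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@dbhost:1521/ORCL", "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, name FROM customers");
             KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(props)) {

            GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
            while (rs.next()) {
                // Map the JDBC row onto the Avro schema (ExecuteSQL does this for you)
                GenericRecord record = new GenericData.Record(schema);
                record.put("id", rs.getLong("id"));
                record.put("name", rs.getString("name"));

                // Serialize to Avro binary and publish one message per row
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
                writer.write(record, encoder);
                encoder.flush();
                producer.send(new ProducerRecord<>("oracle-ingest", out.toByteArray()));
            }
            producer.flush();
        }
    }
}
```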

+1 on NiFi. If the customer does not want to go that route (which they should, because it is probably the most elegant solution), the other option would be to export the data from Oracle to the local file system using an export utility, configure a Flume agent to watch the export directory, and use a Kafka sink to place the data on Kafka. Depending on the downstream processing, Flume can batch the data into appropriate sizes (1 row, 10 rows, n rows); a sample agent config is sketched below.
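
Hypothetically, a minimal agent config could look like the following. The agent name, spool directory, topic, and broker address are placeholders, and the sink properties follow the Flume 1.6-era Kafka sink:

```
# Watch a spool directory of Oracle export files and forward each line
# (one Flume event per line) to a Kafka topic via the Kafka sink.
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = snk1

# Spooling directory source: picks up completed export files dropped here
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /data/oracle-exports
agent1.sources.src1.channels = ch1

# In-memory channel buffering events between source and sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000
agent1.channels.ch1.transactionCapacity = 1000

# Kafka sink: batchSize controls how many events go out per producer batch
agent1.sinks.snk1.type = org.apache.flume.sink.kafka.KafkaSink
agent1.sinks.snk1.topic = oracle-ingest
agent1.sinks.snk1.brokerList = broker1:9092
agent1.sinks.snk1.batchSize = 100
agent1.sinks.snk1.channel = ch1
```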