Ingestion: How to ingest data from Oracle Database to Kafka?

Labels:
- Apache Kafka
- Apache Sqoop
Created 10-06-2015 09:47 AM
A customer wants to ingest data from an Oracle Database into Kafka. It appears that Sqoop2 supports ingesting data into Kafka, but since we don't have Sqoop2 support yet, the customer is looking at using Logstash to ingest the data. Are there any better options available?
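For reference, a minimal sketch of what that Logstash pipeline could look like, pairing the jdbc input plugin with the kafka output. The driver path, connection string, credentials, query, and topic are all placeholders, and the exact kafka output options depend on the plugin version:

```
input {
  jdbc {
    jdbc_driver_library    => "/path/to/ojdbc7.jar"
    jdbc_driver_class      => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//dbhost:1521/ORCL"
    jdbc_user              => "scott"
    jdbc_password          => "tiger"
    # Incremental pull: :sql_last_value tracks the previous run
    statement              => "SELECT * FROM orders WHERE updated_at > :sql_last_value"
    schedule               => "* * * * *"   # poll once a minute
  }
}
output {
  kafka {
    bootstrap_servers => "kafkahost:9092"
    topic_id          => "oracle-orders"
  }
}
```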
Created 10-08-2015 02:50 PM
Linking your cross-post in another space, where a discussion was already going: http://community.hortonworks.com/questions/953/can-nifi-be-used-to-pipe-the-data-from-oracle-data.ht...
Created 01-15-2016 02:01 PM
Additionally, keep in mind that the ExecuteSQL processor in NiFi converts the data to Avro, which is effectively Kafka's native format (minus a Schema Registry). Using NiFi as your primary extraction tool also means you are not locked into Sqoop2's static command-line format and single producer/consumer model.
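To illustrate the handoff, here is a minimal consumer-side sketch in Java, assuming the flow publishes each ExecuteSQL FlowFile to Kafka as a single message (e.g., via a PutKafka/PublishKafka processor); the broker address, topic, and group id are hypothetical. Because ExecuteSQL emits Avro container files with the schema embedded, they can be decoded without a Schema Registry:

```java
import java.io.ByteArrayInputStream;
import java.util.Collections;
import java.util.Properties;

import org.apache.avro.file.DataFileStream;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class NifiAvroConsumer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafkahost:9092");   // hypothetical broker
        props.put("group.id", "oracle-ingest-readers");     // hypothetical group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("oracle-orders")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, byte[]> records = consumer.poll(1000L);
                for (ConsumerRecord<String, byte[]> record : records) {
                    // Each message is an Avro container file; the schema travels with it,
                    // so a GenericDatumReader can decode it with no external registry.
                    try (DataFileStream<GenericRecord> rows = new DataFileStream<>(
                            new ByteArrayInputStream(record.value()),
                            new GenericDatumReader<GenericRecord>())) {
                        for (GenericRecord row : rows) {
                            System.out.println(row); // one row of the original result set
                        }
                    }
                }
            }
        }
    }
}
```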
Created 10-21-2015 02:27 PM
+1 on NiFi. If the customer does not want to go that route (which they should, because it is probably the most elegant solution), the other option would be to export the data from Oracle to the local file system using an export utility, configure a Flume agent to listen to the export directory, and use the Kafka sink to place the data on Kafka. Depending on the downstream processing, Flume can chunk the data into appropriate sizes (1 row, 10 rows, n rows); see the sketch below.
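A minimal sketch of such an agent, assuming Flume 1.6+ (which ships a Kafka sink); the agent name, spool directory, broker, and topic are hypothetical:

```properties
# Agent "a1": watch the Oracle export directory, publish completed files to Kafka.
a1.sources  = exportDir
a1.channels = mem
a1.sinks    = kafka

# Spooling Directory source picks up files dropped by the Oracle export job.
a1.sources.exportDir.type     = spooldir
a1.sources.exportDir.spoolDir = /data/oracle-exports
a1.sources.exportDir.channels = mem

a1.channels.mem.type     = memory
a1.channels.mem.capacity = 10000

# Kafka sink (org.apache.flume.sink.kafka.KafkaSink, Flume 1.6+).
a1.sinks.kafka.type       = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.kafka.channel    = mem
a1.sinks.kafka.topic      = oracle-exports
a1.sinks.kafka.brokerList = kafkahost:9092
a1.sinks.kafka.batchSize  = 100
```

The sink's batchSize is one knob for the row-chunking mentioned above.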
