Created 10-06-2015 09:47 AM
A customer wants to ingest data from an Oracle database into Kafka. It appears that Sqoop2 supports ingesting data into Kafka, but since we don't have Sqoop2 support yet, the customer is looking at using Logstash to ingest the data. Are there any better options available?
Created 10-08-2015 02:50 PM
Linking your cross-post in another space, where a discussion was already going: http://community.hortonworks.com/questions/953/can-nifi-be-used-to-pipe-the-data-from-oracle-data.ht...
Created 01-15-2016 02:01 PM
Additionally, keep in mind that the ExecuteSQL processor in NiFi converts result sets to Avro, the format most commonly paired with Kafka (minus a Schema Registry). Using NiFi as your primary extraction tool also means you are not locked into Sqoop2's static command-line format and single producer/consumer model. A consumer sketch for those Avro records is below.
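To make the Avro point concrete, here is a minimal consumer sketch (not from this thread) that reads messages published by an ExecuteSQL -> PutKafka flow and decodes them with plain Avro, no Schema Registry needed. The broker address, topic name, group id, and the assumption that each Kafka message carries one complete Avro data file (one FlowFile) are all hypothetical placeholders, so adjust them to the actual flow design.

    import java.io.ByteArrayInputStream;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.avro.file.DataFileStream;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class OracleAvroConsumer {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092");   // assumed broker address
            props.put("group.id", "oracle-ingest-demo");       // assumed consumer group
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.ByteArrayDeserializer");

            try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("oracle-export")); // assumed topic
                while (true) {
                    ConsumerRecords<byte[], byte[]> records = consumer.poll(1000);
                    for (ConsumerRecord<byte[], byte[]> record : records) {
                        // ExecuteSQL writes Avro data files with the schema embedded,
                        // so the records can be decoded without a Schema Registry.
                        try (DataFileStream<GenericRecord> reader = new DataFileStream<>(
                                new ByteArrayInputStream(record.value()),
                                new GenericDatumReader<GenericRecord>())) {
                            for (GenericRecord row : reader) {
                                System.out.println(row);  // one row of the Oracle result set
                            }
                        }
                    }
                }
            }
        }
    }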
Created 10-21-2015 02:27 PM
+1 on NiFi. If the customer does not want to go that route (which they should, because it is probably the most elegant solution), the other option would be to export the data from Oracle to the local file system using an export utility, then configure a Flume agent to watch the export directory and use a Kafka sink to place the data on Kafka. Depending on the downstream processing, Flume can chunk the data into appropriate sizes (1 row, 10 rows, n rows). A sample agent configuration is sketched below.
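A rough agent configuration for that setup might look like the following. The directory, broker, and topic names are placeholders, and the Kafka sink property names assume Flume 1.6 (later releases renamed them to kafka.bootstrap.servers / kafka.topic).

    # Hypothetical Flume agent: spooling-directory source -> memory channel -> Kafka sink
    agent.sources  = oracle-spool
    agent.channels = mem-ch
    agent.sinks    = kafka-sink

    # Watch the directory where the Oracle export files land
    agent.sources.oracle-spool.type     = spooldir
    agent.sources.oracle-spool.spoolDir = /data/oracle-export
    agent.sources.oracle-spool.channels = mem-ch

    # Simple in-memory channel; size it to match the expected batch volume
    agent.channels.mem-ch.type                = memory
    agent.channels.mem-ch.capacity            = 10000
    agent.channels.mem-ch.transactionCapacity = 1000

    # Kafka sink (Flume 1.6-style property names)
    agent.sinks.kafka-sink.type       = org.apache.flume.sink.kafka.KafkaSink
    agent.sinks.kafka-sink.channel    = mem-ch
    agent.sinks.kafka-sink.brokerList = broker1:9092
    agent.sinks.kafka-sink.topic      = oracle-export
    agent.sinks.kafka-sink.batchSize  = 100

The batchSize on the sink (and the deserializer settings on the spooldir source) are where you control how the exported rows get chunked into Kafka messages.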