Kafka De-duplicating Message
Created ‎05-08-2021 07:37 PM
Hi, our client is asking whether we can prevent Kafka from duplicating messages. Is that possible?
Our flow works like this:
CDC => Kafka => Ab Initio consumer
How can we make sure Kafka publishes exactly one message per CDC event and does not duplicate any event?
Thank you
Created ‎11-29-2021 05:20 AM
Hi @hbinduni
From Kafka 0.11, the KafkaProducer supports two additional modes: the idempotent producer and the transactional producer. The idempotent producer strengthens Kafka's delivery semantics from at-least-once to exactly-once delivery; in particular, producer retries will no longer introduce duplicates.
It's important to mention that if the producer is already configured with acks=all, there will be no difference in performance.
Additionally, the order of messages produced to each partition is guaranteed through all failure scenarios, even if max.in.flight.requests.per.connection is set to more than 1 (5 is the default, and also the highest value supported by the idempotent producer).
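As a minimal sketch, enabling the idempotent producer is a single configuration switch; the broker address and serializer classes below are placeholders you would replace with your own:

```java
import java.util.Properties;

public class IdempotentProducerConfig {
    // Builds a producer configuration with idempotence enabled (Kafka >= 0.11).
    // "localhost:9092" and the serializer classes are placeholder assumptions.
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // The switch that turns on the idempotent producer.
        props.put("enable.idempotence", "true");
        // Idempotence requires acks=all (it is implied, shown here for clarity)
        // and max.in.flight.requests.per.connection of at most 5.
        props.put("acks", "all");
        props.put("max.in.flight.requests.per.connection", "5");
        return props;
    }
}
```

You would pass this `Properties` object to `new KafkaProducer<>(props)`; with these settings, retries caused by transient broker failures are deduplicated by the broker using the producer ID and sequence numbers.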
More details in the document below:
https://kafka.apache.org/28/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html
