Kafka De-duplicating Message

New Contributor

Hi, our client is asking whether we can prevent Kafka from duplicating messages. Is that possible?

Our flow works like this:

CDC => Kafka => AB Initio consumer

How can we make sure Kafka publishes exactly one message for each CDC event and does not duplicate any of them?

 

Thank you


Expert Contributor

Hi @hbinduni 

 

Since Kafka 0.11, the KafkaProducer supports two additional modes: the idempotent producer and the transactional producer. The idempotent producer strengthens Kafka's delivery semantics from at-least-once to exactly-once delivery. In particular, producer retries will no longer introduce duplicates.

 

It's important to mention that if the producer is already configured with acks=all, there will be no difference in performance.

 

Additionally, the order of messages produced to each partition is guaranteed through all failure scenarios, even if max.in.flight.requests.per.connection is set to more than 1 (5 is the default, and also the highest value supported by the idempotent producer).
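In case it helps, here is a minimal sketch of enabling the idempotent producer with the Java client. The broker address, topic name ("cdc-events"), and record payload are placeholders for your environment:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class IdempotentProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; replace with your cluster's bootstrap servers.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Enable the idempotent producer so broker-side retries cannot
        // introduce duplicate records within a partition.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        // Idempotence requires acks=all (this is also the default when idempotence is enabled).
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Placeholder topic and payload representing a CDC change event.
            producer.send(new ProducerRecord<>("cdc-events", "key-1", "change event payload"));
        }
    }
}
```

Note that the idempotent producer only removes duplicates caused by producer retries; if the CDC source itself re-emits an event, the consumer side still needs its own de-duplication logic.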

 

More details in the document below:

https://kafka.apache.org/28/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html