MongoDB Kafka Connector

Hi, I’m using the MongoDB Kafka sink connector.

When we stop and start the connector while the producer is still producing messages, we notice that we get duplicate messages. We ran a test producing 50,000 messages, and while the data was being produced we stopped the connector and started it again. We ended up with 55,651 documents in the Mongo collection, i.e., 5,651 duplicate messages.

Please let us know: what is the expected behavior of the MongoDB sink connector? Is it normal to get duplicate messages on the consumer side after a restart?

thanks,
Suresh

This is expected. The MongoDB Kafka sink connector provides at-least-once delivery, so applications consuming data written by it are expected to handle at-least-once processing (i.e., tolerate duplicates).
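One common way to make redelivered records harmless is to write idempotently: derive the document `_id` deterministically from the message itself, so that a duplicate delivery replaces the existing document rather than inserting a second copy. Here is a minimal sink config sketch along those lines, assuming the message value carries a unique business key; the topic, database, collection, connection URI, and `orderId` field are placeholders, and the exact allowed values for the projection-type property depend on your connector version:

```
# Minimal MongoDB sink connector config sketch (standalone .properties form).
name=mongo-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=orders
connection.uri=mongodb://localhost:27017
database=test
collection=orders

# Build _id deterministically from a business key in the message value,
# so a redelivered record maps to the same _id as the first delivery.
document.id.strategy=com.mongodb.kafka.connect.sink.processor.id.strategy.PartialValueStrategy
document.id.strategy.partial.value.projection.type=AllowList
document.id.strategy.partial.value.projection.list=orderId

# Replace-by-_id with upsert: a duplicate delivery overwrites the existing
# document instead of creating a new one.
writemodel.strategy=com.mongodb.kafka.connect.sink.writemodel.strategy.ReplaceOneDefaultStrategy
```

With this setup, a restart that replays some records simply rewrites the same documents, so the collection count matches the number of distinct business keys rather than the number of deliveries.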

Hi Robin, thanks for your response!
