Hi, I’m using the MongoDB Kafka Sink Connector.
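Our setup looks roughly like the sink configuration below; the connection string, topic, and database/collection names here are placeholders rather than our real values:

```json
{
  "name": "mongo-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "connection.uri": "mongodb://mongo1:27017",
    "database": "test",
    "collection": "orders",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```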
When we stop and start the connector while the producer is still producing messages, we see duplicates. We ran a test: we produced 50k messages and, while the data was still being produced, stopped the connector and started it again. 55,651 documents ended up in the Mongo collection, which means 5,651 duplicate messages.
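A quick sanity check along these lines can confirm the figure, by comparing the total document count against the number of distinct message keys; the collection name and key field are hypothetical, matching the placeholder config above:

```javascript
// mongosh sanity check: duplicates = total documents - distinct keys.
// "orders" and "orderId" are placeholder names, not our real schema.
const total = db.orders.countDocuments();
const distinct = db.orders.distinct("orderId").length;
print(`total: ${total}, distinct: ${distinct}, duplicates: ${total - distinct}`);
```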
Please let us know: what is the expected behavior of the MongoDB Sink Connector here? Is it normal to get duplicate messages on the consumer side after a restart?
Thanks,
Suresh