MongoDB Kafka Connector not generating the message key with the Mongo document id

I am able to get data from Mongo to Kafka using the source connector, but the messages in the topic look something like this:

\"_id\": \"some alphanumeric data\", \"uuid\": \"some alphanumeric data\", \"fileName\": \"file location.

I don't want these backslashes. They are getting added to every field of the Mongo document when the data lands in the Kafka topic. Please help me, I am working on an urgent business need.
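Backslashes like these typically come from double JSON encoding: the document is serialized to a JSON string once, and that string is then serialized again, so every inner quote gets escaped. A minimal Python illustration (the document values are made up, standing in for the fields in the post):

```python
import json

# Hypothetical document standing in for the Mongo record in the post.
doc = {"_id": "some alphanumeric data", "fileName": "file location"}

# First serialization: a normal JSON document, no backslashes.
once = json.dumps(doc)

# Second serialization (string-in-JSON): every inner quote is now
# escaped, which is where the backslashes come from.
twice = json.dumps(once)

print(once)   # {"_id": "some alphanumeric data", ...}
print(twice)  # "{\"_id\": \"some alphanumeric data\", ...}"
```

This is one way a converter mismatch can surface: if the source already emits a JSON string and the converter JSON-encodes it again, the topic ends up with the escaped form.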

Can you provide the source connector configuration (I don't need the connection info, just the connector configuration parameters)? For the backslash issue, can you provide a sample of the output you expect vs. what you actually see? I am not clear on the backslash from your description. Are you trying to get data into the Kafka topic that does not contain the change stream metadata?

Thanks,
Rob

I think I have resolved my issue. The config I am using now is below.

name=mongo-source
topics=mongotokafka.mongotokafka
connector.class=com.mongodb.kafka.connect.MongoSourceConnector
tasks.max=1
key.ignore=true
#connection.uri=######
connection.uri=#####
#connection.user=###
#connection.password=###
database=###
collection=$$$
copy.existing=true
max.num.retries=3
retries.defer.timeout=5000
type.name=kafka-connect
schemas.enable=false
#key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
#value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
#key.converter=org.apache.kafka.connect.storage.StringConverter
#key.converter.schemas.enable=false
#value.converter=org.apache.kafka.connect.storage.StringConverter
#value.converter.schemas.enable=false
offset.flush.timeout.ms=50000
buffer.memory=100
poll.max.batch.size=1000
poll.await.time.ms=5000
publish.full.document.only=true
change.stream.full.document=updateLookup
output.format.key=schema
output.format.value=schema
output.schema.infer.value=true
#topic.prefix=23

What I have done to resolve the issue is below.
I replaced JsonConverter with StringConverter :slightly_smiling_face:

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false  

and also added the following :slightly_smiling_face:

publish.full.document.only=true
change.stream.full.document=updateLookup

I have done this in our stage environment. Now I am working on prod. In my project we are using MongoDB Enterprise.
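On the original question in the title (getting the message key to carry the Mongo document id): the source connector has key-projection settings for this. A hedged sketch, assuming a connector version that supports `output.schema.key` — the schema string below is modeled on the connector documentation, not taken from this thread, so adjust the field path and type to your documents:

```
# Emit the key with an explicit schema projecting only the document id.
output.format.key=schema
output.schema.key={"type": "record", "name": "keySchema", "fields": [{"name": "fullDocument._id", "type": "string"}]}
```

Note that with `key.converter` set to StringConverter as above, the key reaches the topic as a plain string; with a schema-aware converter it would carry the projected structure instead.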