MongoDB Kafka Connector error

Robert_Walters, in the MongoDB Kafka Connector docs I came across this:

For Kafka Connector select Java and Version 3.4 or later

My connector is Node.js - could this be the reason why my sink is not sending data to MongoDB Atlas?

That statement above about selecting the Java driver was only to get the connection string.

I think you should start with the tutorial example and tweak it slightly to get it to do what you want. There are options in your sink that aren't valid, and I'm not sure what you're trying to get done. Read through the Docker documentation and get familiar with Docker Compose files as well as Kafka Connect; a solid understanding of these will make this easier.
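For context: connector configurations like the JSON that follows are registered against the Kafka Connect REST API, which listens on port 8083 by default. A typical invocation looks like this (the host and the file name mongo-sink.json are placeholders):

```
# Submit a connector config to the Kafka Connect worker (default port 8083).
# mongo-sink.json is a placeholder for a file containing the JSON config below.
curl -X POST -H "Content-Type: application/json" \
  --data @mongo-sink.json \
  http://localhost:8083/connectors
```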

Thank you, I have used the tutorial example but it is still not sending data to MongoDB:

    "config": {
      "connector.class":"com.mongodb.kafka.connect.MongoSinkConnector",
      "tasks.max":"1",
      "topics": "ChatData",
      "connection.uri": " ",
      "database":"socketIo-MongoDb",
      "collection":"chatfeed",
      "key.converter":"org.apache.kafka.connect.json.JsonConverter",
      "key.converter.schemas.enable":false,
      "value.converter":"org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable":false,
      "publish.full.document.only": "true"
            }
}```
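Worth noting: publish.full.document.only is a source connector option, not a sink option, so it is likely one of the invalid settings mentioned above. Also, the source connector publishes to a topic named <topic.prefix>.<database>.<collection>, so a sink subscribed to plain "ChatData" would see nothing; the later config in this thread subscribes to ChatData.socketIo-MongoDb.chat instead. When a sink accepts its config but nothing lands in the database, the Kafka Connect status endpoint usually surfaces the underlying exception ("mongo-sink" and localhost:8083 are placeholders):

```
# Check the connector and its tasks; a FAILED task includes the stack trace.
curl http://localhost:8083/connectors/mongo-sink/status
```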

Thank you for your advice, I will look into the docs again.

Hi Robert_Walters, thank you for your advice. I have started the tutorials as instructed and have managed to get the connectors working.

I am just trying to get data from a MongoDB Atlas collection to Redpanda via the MongoDB Kafka source connector, then back to a different collection in MongoDB Atlas via the MongoDB Kafka sink connector.

I am managing to consume the data but am then unable to send it to MongoDB Atlas - I am getting an exception. I have spent all day looking at the docs and Stack Overflow with no success:

```
connect  | com.mongodb.MongoBulkWriteException: Bulk write operation error on server cluster0-shard-00-02.ebt7p.mongodb.net:27017. Write errors: [BulkWriteError{index=0, code=2, message='unknown operator: $oid', details={}}].
```
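This error most likely comes from the sink's default write model, which replaces documents by _id: a filter value of {"$oid": ...} is parsed by the server as a query operator, and no such operator exists. The same server error can be reproduced directly in mongosh (collection name from this thread, ObjectId taken from the consumed message below):

```
// A filter value starting with "$" is parsed as a query operator,
// so this raises the same "unknown operator: $oid" (code 2) error.
db.chatfeed.find({ _id: { "$oid": "62e17eca1f262feed5efb652" } })
```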

```{"name": "mongo-ts-sink",
    "config": {
      "connector.class":"com.mongodb.kafka.connect.MongoSinkConnector",
      "tasks.max":"1",
      "topics": "ChatData.socketIo-MongoDb.chat",
      "connection.uri":"",
      "database":"socketIo-MongoDb",
      "collection":"chatfeed",
      "key.converter":"org.apache.kafka.connect.json.JsonConverter",
      "key.converter.schemas.enable":false,
      "value.converter":"org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable":false,
      "publish.full.document.only": true
    } 
}```

Although when I run:

```
docker exec -ti redpanda rpk topic consume ChatData.socketIo-MongoDb.chat

{
  "topic": "ChatData.socketIo-MongoDb.chat",
  "key": "{\"_id\": {\"_data\": \"8262E17ECA000000092B022C0100296E5A1004FC2D06E10EF64CA2967AEFB29F6E510B46645F6964006462E17ECA1F262FEED5EFB6520004\"}}",
  "value": "{\"_id\": {\"$oid\": \"62e17eca1f262feed5efb652\"}, \"name\": \"John Johnson\", \"message\": \"Bonjour\"}",
  "timestamp": 1658945232072,
  "partition": 2,
  "offset": 0
}
```
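Note that the value here is a string of MongoDB Extended JSON: the _id is serialized as {"$oid": "..."}. The sink's JsonConverter has no notion of Extended JSON, so it deserializes that $oid wrapper as a literal field name rather than an ObjectId, which is what produces the exception above.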

and

```
Atlas atlas-7j1r6x-shard-0 [primary] socketIo-MongoDb> db.chatfeed.insertOne( { 'test' : 1 } )
```

works fine.

Thank you for your help.

For value.converter use String, not JSON, as that is what the value appears to be saved as:

```
"value.converter": "org.apache.kafka.connect.storage.StringConverter",
```
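With StringConverter the sink receives the raw string and parses the document itself, so the $oid wrapper is understood as an ObjectId again. An alternative, for anyone who wants to keep JsonConverter on the sink: recent versions of the source connector (1.4+) can emit plain JSON instead of Extended JSON, so the $oid wrappers never reach the topic. A sketch of the relevant source-connector settings (verify the class name against your connector version):

```
"output.format.value": "json",
"output.json.formatter": "com.mongodb.kafka.connect.source.json.formatter.SimplifiedJson"
```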

Thank you, it's now working perfectly. I really appreciate your advice.

Hi Onesmus,

I am trying to do the same as you did, but syncing between two databases - from Atlas on Azure to my local MongoDB. The sink connector is inserting the payload from the topic as-is rather than extracting the collection information from it. Can you please share both your source and sink connector configurations?
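In the meantime, based on the topic name earlier in the thread (ChatData.socketIo-MongoDb.chat follows the source connector's <topic.prefix>.<database>.<collection> naming convention) and the fact that the consumed values were bare documents rather than full change-stream events, the source side probably looked roughly like this sketch (the connector name is made up and the connection string is omitted):

```
{
  "name": "mongo-ts-source",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "",
    "database": "socketIo-MongoDb",
    "collection": "chat",
    "topic.prefix": "ChatData",
    "publish.full.document.only": "true"
  }
}
```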