How to connect the MongoDB connector and Apache Kafka?

Good evening. I would like to know how to connect the MongoDB connector and Apache Kafka locally without using Confluent. Everything I have found on Google only shows the setup with Confluent.
I have also gone through this link (https://www.mongodb.com/docs/kafka-connector/current/introduction/connect/), but I still don't get a clear idea of how to implement it. Could you please guide me? If you have any implementation document, please share it here.

Looking forward to a favourable reply,
Vijayalakshmi M

What do you mean by "without using Confluent"? Are you referring to using the Confluent Platform?

Here is a self-contained MongoDB, Kafka, and Kafka Connect deployment in Docker: https://www.mongodb.com/docs/kafka-connector/current/tutorials/tutorial-setup/#std-label-kafka-tutorials-docker-setup
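If the goal is to avoid the Confluent Platform entirely, the connector also runs on a plain Apache Kafka Connect worker. A rough sketch of that setup, assuming Kafka is unpacked locally and the mongo-kafka-connect "all" jar has been downloaded from Maven Central (the paths and jar name below are illustrative, not exact):

# 1. Copy the connector uber-jar (mongo-kafka-connect-<version>-all.jar from Maven Central)
#    into a dedicated plugin directory
mkdir -p /opt/kafka-connect/plugins/mongodb
cp ~/Downloads/mongo-kafka-connect-*-all.jar /opt/kafka-connect/plugins/mongodb/

# 2. Point the Connect worker at the plugin directory (plugin.path is a standard
#    Kafka Connect worker property)
echo "plugin.path=/opt/kafka-connect/plugins" >> config/connect-distributed.properties

# 3. Start a distributed Connect worker from the Kafka installation directory;
#    its REST API listens on port 8083 by default
bin/connect-distributed.sh config/connect-distributed.properties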

Thank you, sir, for your response. I have connected the MongoDB sink connector through a POST request to the Kafka Connect REST API, configured like below:
curl -X POST -H "Content-Type: application/json" -d '{"name":"test-sherlin",
"config":{"topics":"sunflower",
"connector.class":"com.mongodb.kafka.connect.MongoSinkConnector",
"tasks.max":"1",
"connection.uri":"mongodb://localhost:27017",
"database":"flower",
"collection":"sunflower-collection",
"key.converter":"org.apache.kafka.connect.storage.StringConverter",
"value.converter":"org.apache.kafka.connect.storage.StringConverter",
"key.converter.schemas.enable":"false",
"value.converter.schemas.enable":"false"}}' localhost:8083/connectors
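
For reference, the same REST API can be used to confirm that the connector was registered and is running, for example:

# List registered connectors, then check the status of this one
curl localhost:8083/connectors
curl localhost:8083/connectors/test-sherlin/status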

I would like to know how to handle the PUT and DELETE configuration. How can I proceed?

Are you asking how to delete the connector via a curl statement?
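If that is what you mean, the standard Kafka Connect REST API covers it; a sketch using the connector name from your POST above:

# Update (or create) the connector configuration: PUT only the "config" object,
# without the outer {"name": ..., "config": ...} wrapper used for POST
curl -X PUT -H "Content-Type: application/json" -d '{
  "topics": "sunflower",
  "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
  "tasks.max": "1",
  "connection.uri": "mongodb://localhost:27017",
  "database": "flower",
  "collection": "sunflower-collection",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "org.apache.kafka.connect.storage.StringConverter"
}' localhost:8083/connectors/test-sherlin/config

# Remove the connector itself (this does not delete documents already written to MongoDB)
curl -X DELETE localhost:8083/connectors/test-sherlin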

Yes, I am asking about both updating and deleting the data that is already saved in MongoDB via the connector, using curl.

It sounds like you are looking to replicate MongoDB data? What is your use case?
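If the requirement is only to update or delete documents that the sink connector has already written (rather than the connector itself), the usual pattern is to drive it from the Kafka records: a record with the same key replaces the existing document, and a tombstone record (non-null key, null value) deletes it once delete handling is enabled. The sketch below is not verified against your setup; updating an existing document also requires an id strategy that yields a stable _id (see document.id.strategy in the MongoDB sink connector documentation):

# Illustrative: enable tombstone-based deletes on the existing connector
curl -X PUT -H "Content-Type: application/json" -d '{
  "topics": "sunflower",
  "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
  "connection.uri": "mongodb://localhost:27017",
  "database": "flower",
  "collection": "sunflower-collection",
  "delete.on.null.values": "true"
}' localhost:8083/connectors/test-sherlin/config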

Actually, I have handled it based on my requirements. Thanks for your response.