Using the MongoDB source connector with time series collections

Hi,
I have a MongoDB time series collection that I need to publish to Kafka, so I used the MongoDB Kafka source connector. However, when the connector started, it logged the following:

(com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-source-mongo-condensed-cam-connector-0]
2023-04-24 11:48:23,103 INFO [source-mongo-condensed-cam-connector|task-0] Started MongoDB source task (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-source-mongo-condensed-cam-connector-0]
2023-04-24 11:48:23,104 INFO [source-mongo-condensed-cam-connector|task-0] WorkerSourceTask{id=source-mongo-condensed-cam-connector-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.AbstractWorkerSourceTask) [task-thread-source-mongo-condensed-cam-connector-0]
2023-04-24 11:48:23,108 INFO [source-mongo-condensed-cam-connector|task-0] Watching for collection changes on 'TestDB.TestCollection' (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-source-mongo-condensed-cam-connector-0]
2023-04-24 11:48:23,111 INFO [source-mongo-condensed-cam-connector|task-0] New change stream cursor created without offset. (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-source-mongo-condensed-cam-connector-0]
2023-04-24 11:48:23,964 WARN [source-mongo-condensed-cam-connector|task-0] Failed to resume change stream: Namespace TestDB.TestCollection is a timeseries collection 166
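For context, the connector was configured along these lines (the connection string, topic prefix, and names below are placeholders, not the actual values from my setup):

```json
{
  "name": "source-mongo-condensed-cam-connector",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "mongodb://localhost:27017",
    "database": "TestDB",
    "collection": "TestCollection",
    "topic.prefix": "mongo"
  }
}
```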

Upon researching, I discovered that time series collections do not support change streams, as outlined in the MongoDB documentation on time series limitations: https://www.mongodb.com/docs/manual/core/timeseries/timeseries-limitations/.

Is there a way to use the MongoDB connector with time series data? If not, what alternatives are available?

You are correct: time series collections do not currently support change streams. While this support may come in a future version of MongoDB, today you'll have to move the data to a regular collection. What do you do with the data once it is in Kafka? Is there processing you could do within the aggregation framework that might help?
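Until change stream support lands, one workaround is to periodically mirror new documents from the time series collection into a regular collection, and point the source connector at the mirror. A minimal sketch of such an aggregation with pymongo-style pipeline syntax (the mirror collection name and the `ts` time field are assumptions, not details from this thread):

```python
from datetime import datetime, timezone

def build_mirror_pipeline(since: datetime) -> list:
    """Aggregation that copies documents newer than `since` from the
    time series collection into a regular mirror collection."""
    return [
        # `ts` is assumed to be the collection's timeField.
        {"$match": {"ts": {"$gt": since}}},
        # $merge writes the matched documents into a regular collection,
        # which does support change streams and can be watched by the
        # Kafka source connector.
        {"$merge": {
            "into": "TestCollectionMirror",
            "whenMatched": "replace",
            "whenNotMatched": "insert",
        }},
    ]

pipeline = build_mirror_pipeline(datetime(2023, 4, 24, tzinfo=timezone.utc))
# Against a live deployment this would run as:
#   db["TestCollection"].aggregate(pipeline)
# scheduled on whatever interval matches your latency requirements.
```

Each run would advance `since` to the timestamp of the last copied document, so only new data is mirrored.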

Hi Robert,
The data will be sent to multiple devices via MQTT. A large volume of time series data continuously arrives in the time series database and needs to be published to the devices. However, I'm not sure the aggregation framework would help much in this situation.