Hello. I have a challenge with migrating time series data.
I need to migrate data from an old time series collection to a new time series collection. The new collection already has documents, and I need to keep them, because we actively use this collection and continue to store events in it.
I tried using mongodump/mongorestore and an aggregation pipeline with $out, but neither option works with a time series collection that already contains documents.
I also tried a Node.js script that ran 100 processes at once and used insertMany in batches of about 40,000 documents.
This approach worked and I was able to copy the old data into the new collection, but the size of the collection increased several times over.
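For reference, the batched-insert approach I used looks roughly like this (a simplified single-process sketch; the database name, collection names, and batch size are placeholders, and `migrate` assumes the official `mongodb` Node.js driver and a live connection):

```javascript
// Pure helper: group documents coming off a cursor (or any iterable)
// into arrays of at most `size` documents.
async function* batches(cursor, size) {
  let buf = [];
  for await (const doc of cursor) {
    buf.push(doc);
    if (buf.length === size) {
      yield buf;
      buf = [];
    }
  }
  if (buf.length > 0) yield buf; // flush the final partial batch
}

// Migration driver (not executed here; needs a connected MongoClient).
// Names like 'metrics', 'events_old', 'events_new' are placeholders.
async function migrate(client) {
  const db = client.db('metrics');
  const src = db.collection('events_old');
  const dst = db.collection('events_new');

  const cursor = src.find({});
  for await (const batch of batches(cursor, 40000)) {
    // ordered: false lets the server continue past individual
    // insert errors instead of aborting the whole batch.
    await dst.insertMany(batch, { ordered: false });
  }
}
```

In the real run I sharded the source query across 100 such processes, each handling its own range of documents.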
Therefore, I am looking for a way to transfer the old documents from the old collection into the new one without significantly increasing its size (if possible).
The old collection holds a large amount of data: more than 100,000,000 documents. Granularity: seconds.