Mongosync for a zero-downtime migration

Hi everyone, I’ve been testing mongosync and it works great.

I’m facing a problem: one of our databases needs to be connected to by passing a --sslCAFile.

But I couldn’t figure out how to pass this parameter to mongosync. Does anyone know if it’s possible?

When the database doesn’t require a certificate it works fine, and I use the command below:

```
./mongosync --verbosity=DEBUG \
  --cluster0 'mongodb://USER:PASS@HOST:27017/admin?tls=true' \
  --cluster1 'mongodb://USER:PASS@HOST2:27017/admin?tls=true'
```

Best Regards,

Philippe Oliveira

The script at the bottom will make this much easier.

The main issue with MongoSync is that it can’t pass --sslCAFile (at least it couldn’t in 2022), and in fact many attempts to do so caused extreme slowdowns or ground MongoDB 4.4, 5.0, and 6.0 to a halt.

No fix has been implemented; just an FYI to help save you from causing a potential outage in your prod environment.

You’re better off setting up a script to sync your DBs in batches via BSON/JSON pushes. It’s safer, you can script in anything and everything you want, and you can set it up to send the batches automatically, or just listen for changes and send the new files. A general template is in the script at the end; the middle script segment can be used to help as well. That’s what I’d do.
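As a minimal illustration of the batching idea (not the full script referenced above), documents can be grouped into fixed-size batches before being pushed to the target. The batch size and the push step are placeholders you’d replace with an `insert_many` call or a file write for mongoimport:

```python
from itertools import islice

def batched(docs, size):
    """Yield lists of up to `size` documents from an iterable."""
    it = iter(docs)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Example: 10 documents pushed in batches of 4 -> batch sizes 4, 4, 2.
docs = [{"_id": i} for i in range(10)]
sizes = [len(b) for b in batched(docs, 4)]
# Each batch would then be sent to the target cluster, e.g.
# target_collection.insert_many(batch), or dumped to a JSON file.
```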


Thanks for the answer o/

For example, I could configure it to sync everything first, and then the next day sync only the last 24 hours…

And 10 minutes before the migration, sync just the previous 30 minutes: a kind of incremental import with almost continuous synchronization!

I want to take a full backup, import it, and then just send the differential changes in my database.

Can I do this with mongoexport/mongoimport?

Best Regards,

Easily. You will have to modify the script provided at the end a bit, but you can implement replaceOne for what’s changed, add new documents, etc. The key part is just establishing a listener for when changes have been made, or setting it up to send timed batches; either works.
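The listener-plus-replaceOne idea can be sketched as below. This is an illustration, not the script from the thread: it assumes the change-stream events carry `documentKey` and `fullDocument`, which MongoDB change streams do for inserts and replaces (and for updates when the stream is opened with `fullDocument="updateLookup"`):

```python
def upsert_args(change):
    """Translate a change-stream event into (filter, replacement)
    arguments for a replaceOne upsert on the target cluster."""
    return change["documentKey"], change["fullDocument"]

# With a real driver (e.g. PyMongo) the listener loop would look like:
#
#   for change in source_collection.watch(full_document="updateLookup"):
#       if change["operationType"] in ("insert", "replace", "update"):
#           flt, doc = upsert_args(change)
#           target_collection.replace_one(flt, doc, upsert=True)

# Sample event, in the shape MongoDB change streams emit:
event = {
    "operationType": "insert",
    "documentKey": {"_id": 1},
    "fullDocument": {"_id": 1, "name": "alice"},
}
flt, doc = upsert_args(event)
```

Using `upsert=True` means the same loop handles both new documents and changes to existing ones.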

Also, in tests back in August 2022, this method showed less resource strain on MongoDB’s operation.

You can also filter by the time/date changes occurred, and you can create a network drive and send backup copies of the JSON documents or BSON files there, in addition to the MongoDB instances, all at the same time.
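For the time/date filtering, a sketch of building an extended-JSON query for "documents changed in the last N minutes" is below, which could be handed to mongoexport via --query. The `updatedAt` field name is a hypothetical last-modified timestamp your documents would need to carry; it is not something MongoDB maintains for you:

```python
import json
from datetime import datetime, timedelta, timezone

def window_query(minutes, field="updatedAt", now=None):
    """Build an extended-JSON query string matching documents whose
    `field` timestamp falls within the last `minutes` minutes.
    `field` is a hypothetical last-modified field on your documents."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(minutes=minutes)
    # {"$date": "<ISO-8601>"} is the extended-JSON form mongoexport accepts.
    return json.dumps({field: {"$gte": {"$date": cutoff.isoformat()}}})

# e.g. the "previous 30 minutes" window mentioned earlier in the thread,
# pinned to a fixed time so the example is reproducible:
q = window_query(30, now=datetime(2022, 8, 1, 12, 0, tzinfo=timezone.utc))
```

The resulting string would be used roughly as `mongoexport --query "$q" ...`, with a matching mongoimport on the target side.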

  • I’m from a DevOps background: 12 years in InfoSec/DevOps before working at MongoDB for two years. These are the methods that seem safest to me.

The other option is to build and install an Apollo GraphQL server and have it route data between both MongoDB instances; it will sync everything and manage all of it.

Apollo GraphQL Server is a huge plus for long-term scalability, though: you can stack and orchestrate data between numerous databases regardless of what they are.

Redis, MongoDB, MySQL, etc. All of them can be connected to Apollo and synced together so they all have the same data, whether as caches or data stores.

The single largest concern I have, though, is that in a lot of tests, using MongoSync to pass CAs either shut down connections to MongoDB, froze MongoDB, or broke MongoDB encrypted builds. In each use case and experiment, the results in test environments were catastrophic.

This is the basis of why I suggest not using MongoSync if you’re trying to pass an --sslCAFile, because it can quite literally bring your entire prod down.

Mongosync supports specifying a CA file via the connection URI: https://www.mongodb.com/docs/upcoming/reference/connection-string/#mongodb-urioption-urioption.tlsCAFile

For example, something like this should work:

```
mongodb://$USER:$PASS@$HOST:27017/?tls=true&tlsCAFile=tmp/my-ca-bundle.pem
```

I’ll try this out. I’ve always been unsuccessful with this so far; will see how it goes.