I searched on YouTube for how to back up MongoDB data, and a dev told me it isn't efficient to back up a large dataset, but he didn't know the exact size, so I was wondering what the “limit” would be. The reason I'm backing up data is that the VPS is unstable and I might lose data, but better safe than sorry.
Welcome to the MongoDB Community @Shaughn_De_Sousa !
A backup strategy (which includes testing the restore process) is essential if your data is important. You can use any of the MongoDB Backup Methods applicable to your deployment type and MongoDB server version.
Because `mongodump` and `mongorestore` operate by interacting with a running `mongod` instance, they can impact the performance of your running database. Not only do the tools create traffic for a running database instance, they also force the database to read all data through memory. When MongoDB reads infrequently used data, it can evict more frequently accessed data, causing a deterioration in performance for the database's regular workload.
There is no specific limit, but if your data is significantly larger than available memory there will be more overhead for reading data with a `mongodump` backup and more time to recreate your deployment with `mongorestore`. The timing and performance impact of backups will vary depending on your backup approach, deployment resources, workload, and backup frequency.
The MongoDB server documentation includes considerations and procedures for supported backup methods.
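As a starting point, a minimal `mongodump` run might look like the sketch below. The connection URI, backup location, and use of `--gzip` are assumptions for illustration, not defaults for your deployment:

```shell
#!/bin/sh
# Minimal mongodump sketch. The URI and paths below are assumptions; adjust
# them for your own deployment.
MONGO_URI="mongodb://localhost:27017"   # assumed connection string
BACKUP_ROOT="/var/backups/mongodb"      # assumed backup location
STAMP=$(date +%Y-%m-%d)
BACKUP_DIR="$BACKUP_ROOT/dump-$STAMP"   # date-stamped so daily runs don't collide

if command -v mongodump >/dev/null 2>&1; then
    # --gzip compresses the output; restore later with:
    #   mongorestore --uri="$MONGO_URI" --gzip "$BACKUP_DIR"
    mongodump --uri="$MONGO_URI" --gzip --out="$BACKUP_DIR" ||
        echo "mongodump failed (is mongod reachable?)" >&2
else
    echo "mongodump not found; install mongodb-database-tools" >&2
fi
```

Whatever approach you pick, also test restoring the dump into a scratch instance: an untested backup isn't really a backup.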
I see, thank you for the response. Since I'm quite new to backing up MongoDB data, I was hoping to back the data up via code itself on a regular interval, around once a day. I'll check the MongoDB backup methods themselves and see if I can understand them and use them to my advantage!
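Something like a cron entry plus a cleanup of old dumps is what I have in mind, sketched below. The script path, schedule, and 7-day retention are placeholders I made up, not anything from the docs:

```shell
#!/bin/sh
# Sketch of daily scheduling. The script path and schedule are placeholders;
# the cron line would be added with `crontab -e`:
#   30 2 * * * /usr/local/bin/mongo-backup.sh >> /var/log/mongo-backup.log 2>&1

# cleanup_old_dumps DIR DAYS: remove dump-* directories older than DAYS days,
# so old backups don't slowly fill the VPS disk.
cleanup_old_dumps() {
    root="$1"
    days="$2"
    [ -d "$root" ] || return 0
    find "$root" -maxdepth 1 -type d -name 'dump-*' -mtime +"$days" \
        -exec rm -rf {} +
}

cleanup_old_dumps /var/backups/mongodb 7   # assumed location and retention
```

Since the whole reason for this is that the VPS itself is unstable, I'd also copy each dump off the VPS (e.g. `rsync` or `scp` to another machine) rather than keeping backups only on the host that might fail.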
This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.