I have the following use case: there is a large amount of live data in a collection (~250 GB) with a set of indexes (these were created while the database was empty, and the data were inserted afterwards). After some analysis, we've decided to switch to a new set of indexes.
Question: what is the best way to deploy the new set of indexes and remove the old set, so that the existing data is covered as well?
- Backing up and reloading the data doesn't seem to work, because the snapshot would include the old indexes too.
- A no-brainer would be to just run `db.collection.dropIndexes()` followed by `db.collection.createIndexes()` (there is no `db.removeIndexes()` method), but given the large amount of existing data, rebuilding the indexes could be very time-consuming (I reckon).
I'm wondering if there's a better way, or, better yet, what the industry-standard approach to this problem is.
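For context, here is a sketch of the common "build new first, then drop old" ordering in mongosh, which keeps queries covered by some index throughout. The database name, collection name, index key patterns, and index names below are all placeholders; this assumes a reasonably recent MongoDB (4.2+), where index builds only briefly lock the collection and can run against live data.

```javascript
// mongosh sketch -- db/collection names and index specs are placeholders.
const coll = db.getSiblingDB("mydb").events;

// 1. Build the new indexes first, while the old ones still serve queries.
//    On MongoDB 4.2+ this runs against a live collection with only brief locks.
coll.createIndexes([
  { userId: 1, createdAt: -1 },   // placeholder new index specs
  { status: 1 }
]);

// 2. Verify the new indexes are present before removing anything.
printjson(coll.getIndexes());

// 3. Drop the old indexes individually, by name. Avoid the bare
//    dropIndexes() call here: it drops every index except _id_,
//    including the ones just built.
coll.dropIndex("old_index_name_1");
coll.dropIndex("old_index_name_2");
```

The build in step 1 still has to scan the full ~250 GB once, so it will take a while regardless; the point of this ordering is that reads never run unindexed while it happens.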