How to do continuous delivery to a mongo database

Hi guys,

We have a MongoDB Atlas database with a few collections, and we need to design a continuous delivery process for the indexes, triggers, and some base data we need to inject.

Until now we have built the indexes using the MongoDB Atlas web UI in our development environment, but our customer has policies that don't allow us to use the UI in the production environment, so we need to figure out how to deliver index changes, triggers, and base data.

For example, we use Azure DevOps pipelines for our CI/CD to SQL Server, but we have no idea how to build a delivery pipeline for MongoDB Atlas. Here are a few ideas we have:

  1. Using an agent (VM) to install the MongoDB Shell (mongosh) and run the scripts.
  2. Using the Atlas App Services Admin API.
  3. Building an Azure Function that runs the scripts against the DB, and calling it from the pipelines.
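To illustrate the first idea, here is a minimal sketch of what a script step might run on a pipeline agent. The connection string variable, database name, file paths, and index definitions are all hypothetical placeholders, not taken from your project:

```shell
#!/usr/bin/env sh
# Hypothetical deploy step on an Azure DevOps agent.
# Assumes mongosh and mongoimport are installed on the agent, and that
# ATLAS_URI is injected as a secret pipeline variable.

# Apply index changes. createIndex() is idempotent: if an identical
# index already exists it is a no-op, so this can run on every deploy.
mongosh "$ATLAS_URI" --eval '
  db = db.getSiblingDB("mydb");            // hypothetical database name
  db.orders.createIndex({ customerId: 1, createdAt: -1 });
  db.customers.createIndex({ email: 1 }, { unique: true });
'

# Seed base data from a JSON array file kept in source control.
# --mode upsert (matching on _id by default) makes re-runs safe.
mongoimport --uri "$ATLAS_URI" --db mydb --collection settings \
  --file base-data/settings.json --jsonArray --mode upsert
```

Keeping these scripts in the repository means the same pipeline can promote identical changes from development through to production.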

In your experience, which approach should we explore for this?

Thanks a lot for your help guys.


Hello @Jose_Alejandro_Benit ,

Welcome back to The MongoDB Community Forums! :wave:

I notice you haven’t had a response to this topic yet - were you able to find a solution?

I think any of the ideas you mentioned could be used for importing data and creating indexes; it really depends on what you are comfortable implementing. For triggers, you can use the Realm CLI or the Atlas App Services Admin API. You can combine API requests, mongosh scripts, etc. to create a fresh environment every time.
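As a hedged sketch of the trigger piece with the Realm CLI: the key variable names and the app ID below are assumptions, and you should check the current CLI documentation for your version, but a CI step could look roughly like this:

```shell
# Hypothetical CI step: deploy App Services config (including triggers)
# from source control. REALM_PUBLIC_KEY / REALM_PRIVATE_KEY are assumed
# to be Atlas programmatic API keys stored as secret pipeline variables.
realm-cli login \
  --api-key "$REALM_PUBLIC_KEY" \
  --private-api-key "$REALM_PRIVATE_KEY" \
  --yes

# Push the exported app directory (created once with `realm-cli pull`).
# Trigger definitions live as JSON files under triggers/ in that directory,
# so changes are reviewed and versioned like any other code.
realm-cli push --remote "myapp-abcde" --yes   # hypothetical app ID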

You can also refer to this article on “How to Build CI/CD Pipelines for MongoDB Realm Apps Using GitHub Actions” for more insight. Which method is ultimately best for your use case depends on your specific situation, and perhaps your existing tooling.
