DB scripts as part of deployment

We need to design a way to manage and automate deployment of MongoDB database scripts.

Today we do this ad hoc and manually, and someone needs to remember to do it every time we promote changes across environments. We need a mechanism that tracks the state of executed scripts so they don’t have to be rerun in specific environments, and that we can integrate with our CI/CD process.

We need some help and pointers on this. For SQL Server we would use something like DACPAC projects and the DACPAC tasks in Azure DevOps release pipelines, so what is the equivalent for MongoDB? For context, we are using Azure DevOps for CI/CD.


Is there a way we can leverage SaltStack to automate deployment of database changes? I heard about PyMongo some time ago; is it related to that? Could you provide more pointers and/or documentation on how SaltStack could be effectively harnessed to deploy MongoDB database changes across environments?

I am interested in this too. Can anyone share a good way they handle their Mongo scripts when releasing code, or do most people ask a DBA to run the Mongo scripts?

Hi Ambar, did you find a method for this?

Hi everyone!

I believe you are describing a desire to handle database migrations similar to this recent discussion: Merge customer database with upgraded version without data loss - #2 by Stennie

Managing schema migrations is outside the scope of the core MongoDB server, but there are quite a few libraries and tools that can help with this depending on your requirements and software preferences.

I expect you could choose (or create) a schema migration approach independent of the automation tooling that you use for deployment (SaltStack, Puppet, etc.) – automation tooling could just invoke schema migration at the relevant step in your deployment process.

I suggest creating separate discussion topics with more specific details of your individual requirements and use cases. The original question in this topic was focused on Azure DevOps, which is currently on the more niche side in terms of discussion and expertise in this forum.

Regards,
Stennie

From my perspective I don’t want to do a migration. What I am wondering is how best to handle release scripts. E.g. for a given release we add 2 new tables, 2 new indexes and have an insert script to populate them.
In Oracle the answer would be to create a master script that calls the DDL script first, then the DML, with a validate statement that ensures everything inserted as expected, otherwise the whole transaction is rolled back. All of this could be called from Jenkins.

Hi @Claire_Moore1,

What you are describing is the same concept as schema migration: you want to apply DDL changes to add new collections, indexes, and structural data as part of your deployment process so the end outcome is a consistent database schema.

Each changeset would have a unique identifier that can be used to determine whether those changes have already been applied, plus some convention to ensure changesets are applied in a predictable order. Changesets could be applied within transactions – MongoDB 4.4+ would be required to Create Collections and Indexes in a Transaction. Changesets can also be committed to version control and deployed as part of your release process or continuous integration.
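For illustration, here is a rough sketch of those ideas in Python with PyMongo. It assumes a MongoDB 4.4+ replica set (transactions require a replica set or sharded cluster), and the changeset name and `changelog` collection are made up for the example – a purpose-built migration tool would give you a more robust version of the same pattern:

```python
from pymongo import MongoClient, ASCENDING

def release_2024_01(db, session):
    # DDL: create the new collection and index. On MongoDB 4.4+ this is
    # allowed inside a transaction when the collection is created in the
    # same transaction.
    db.create_collection("orders", session=session)
    db.orders.create_index([("customer_id", ASCENDING)],
                           name="ix_customer_id", session=session)
    # DML: populate the new collection
    db.orders.insert_one({"customer_id": 1, "status": "new"}, session=session)

# Ordered list of (unique changeset id, function): list order defines the
# order in which changesets are applied.
CHANGESETS = [
    ("release_2024_01", release_2024_01),
]

def apply_changesets(client, db_name):
    db = client[db_name]
    applied = {doc["_id"] for doc in db.changelog.find({}, {"_id": 1})}
    for changeset_id, apply_fn in CHANGESETS:
        if changeset_id in applied:
            continue  # already applied in this environment, skip the rerun
        with client.start_session() as session:
            with session.start_transaction():
                apply_fn(db, session)
                # Record the changeset in the same transaction, so a failure
                # rolls back both the changes and the changelog entry.
                db.changelog.insert_one({"_id": changeset_id}, session=session)

if __name__ == "__main__":
    client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
    apply_changesets(client, "appdb")
```

Because the changelog record is written in the same transaction as the changes, a failed deployment leaves neither behind, so the changeset can simply be rerun.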

There is no strict requirement to use a schema migration tool with MongoDB, but one can certainly help with consistency of your deployments. From a data point of view there are also patterns like Schema Versioning that take advantage of MongoDB’s flexible schema to allow applications to be aware of multiple versions of document schemas co-existing in the same collection.
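As a quick illustration of the Schema Versioning pattern (the field names here are hypothetical), documents carry a schema version and the application normalises whichever shape it reads:

```python
def read_contact(doc):
    """Normalise a contact document regardless of its schema version."""
    if doc.get("schema_version", 1) >= 2:
        # v2 documents store a list of phone numbers
        phones = doc.get("phones", [])
    else:
        # v1 documents (no schema_version field) stored a single string
        phones = [doc["phone"]] if "phone" in doc else []
    return {"name": doc["name"], "phones": phones}
```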

DDL commands are ultimately sent to a MongoDB deployment via the MongoDB Driver API, but there are schema migration/management tools that provide higher-level abstractions like JSON, YAML, or XML. I personally prefer a tool that matches the implementation language for my app so the dev team doesn’t have to learn additional syntax and existing data models can be leveraged to update structural data.
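For example, the driver helper methods used in the sketch above ultimately issue database commands like these (again PyMongo; the collection and index names are made up):

```python
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["appdb"]

# The raw database commands that driver helpers and migration tools wrap
db.command({"create": "orders"})
db.command({
    "createIndexes": "orders",
    "indexes": [{"key": {"customer_id": 1}, "name": "ix_customer_id"}],
})
```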

If you have more specific requirements, it would be best to start a new discussion topic focused on your environment and use case. There is definitely more than one way to approach this.

Regards,
Stennie


Thank you for the reply. Yes, I am going to give transactions and schema versioning a shot for our release process.

Hi @Claire_Moore1 , were you able to implement your DDL scripts from Azure DevOps pipelines? I’m attempting the same thing and would appreciate any guidance if you were successful.
Regards,
Matt