One GitHub repo, Multiple Realm Apps, Production vs Dev, CI/CD implications

I have been trying to implement a CI/CD pipeline that deploys my code to either a development Realm app or a production Realm app, depending on which GitHub branch is merged.

I have been trying to figure out how to do it, and it seems pretty difficult due to all of the configuration that exists inside the repo for each Realm app. It’s almost like I need environment variables I can use throughout the functions, services, values, and triggers files, but I’m not even sure that would work fully.

I almost feel like I need three separate GitHub repos: one to hold my dev app, one to hold my production app, and another to hold my hosting files, which make up the front-end app. In this case, my CI/CD process could basically watch for PRs on the front-end repo and commit into the respective Realm app repo. Then each Realm app repo would be auto-deploy linked to its app.

I don’t really know if this is the best approach. Does anybody have experience with this issue? I just want the most logical setup for having multiple Realm apps that separate environments like prod vs dev.

Thanks!
Lukas

Hi @Lukas_deConantseszn1

What currently blocks you from deploying with the GitHub integration to three different applications: Dev, Stage, and Prod?

Each one can be linked to its own branch and therefore deployed independently.
https://docs.mongodb.com/realm/deploy/deploy-automatically-with-github/

The docs show exactly how to export an app.

Best
Pavel

It seems to me that some of the config files, like the services files, specify a specific database to use. What if I want to use a different database between dev and prod? If this is possible, it is not very obvious how to do so.
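
For example (the names here are just placeholders, not my actual config), an exported rule file under services/mongodb-atlas/rules/ hard-codes the database and collection it applies to, roughly like this:

{
  "database": "my-app-dev",
  "collection": "items",
  "roles": [{ "name": "default", "read": true, "write": true }],
  "schema": {}
}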

On a similar note, I am trying to make some changes to my GitHub automatic deployment link and it doesn’t seem to be working. I am trying to connect to a different repo, and it keeps pulling up the old repo.

You might find Lauren Schaefer’s series of articles on “DevOps + MongoDB Realm Serverless Functions = :heart_eyes:” an interesting read: https://www.mongodb.com/how-to/unit-test-realm-serverless-functions

And her video an interesting watch: “DevOps + MongoDB Serverless = Wow!” on YouTube.

I don’t know of other “official” guidelines on this, but I know that the team is currently revisiting the app export/import and configuration file format, in part to tackle this very problem. I think it would be valuable for them to hear what you think would be a good solution to the problem.


As a (temporary) workaround, I’ll share what I’ve personally done on one of my projects.

For function definitions

I’ve stored the name of my environment-specific database in a “value” (named “defaultDatabase”) and retrieve it from my functions like so:

context.services
    .get("mongodb-atlas")
    .db(context.values.get("defaultDatabase"))
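
For context, a complete function definition using this pattern could look something like the snippet below (the "movies" collection name and the query argument are just placeholders for illustration):

exports = async function (query) {
  // Resolve the database name at runtime from the "defaultDatabase" value,
  // so the same function code works in both staging and production.
  const db = context.services
    .get("mongodb-atlas")
    .db(context.values.get("defaultDatabase"));
  return db.collection("movies").find(query).toArray(); // "movies" is a placeholder
};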

But, that only solves the issue for function definitions.

For config files

For the configuration files I’ve written a small (TypeScript) script that takes the declaration of my “staging” app and patches in the values that are relevant for production:

scripts/search-replace.ts

import path from "path";
import fs from "fs-extra";
import glob from "glob";
import deepmerge from "deepmerge";

// Expect three runtime arguments: <mapping file> <app directory> <destination directory>
if (process.argv.length < 5) {
  throw new Error("Expected 3 runtime arguments: <mapping> <app path> <destination>");
}

// The last three arguments are the mapping file, the source app and the destination
const args = process.argv.slice(process.argv.length - 3);

const mappingPath = path.resolve(args[0]);
const appPath = path.resolve(args[1]);
const destinationPath = path.resolve(args[2]);

console.log(`Copying ${appPath} to ${destinationPath} (overwriting)`);
fs.removeSync(destinationPath);
fs.copySync(appPath, destinationPath, { overwrite: true });

const mapping = fs.readJSONSync(mappingPath);
// For each glob in the mapping, deep-merge the replacement object into every matching JSON file
for (const [fileGlob, replacement] of Object.entries(mapping)) {
  const files = glob.sync(fileGlob, { cwd: destinationPath });
  for (const relativeFilePath of files) {
    const filePath = path.resolve(destinationPath, relativeFilePath);
    const content = fs.readJSONSync(filePath);
    const mergedContent = deepmerge(content, replacement as any);
    fs.writeJSONSync(filePath, mergedContent, { spaces: 2 });
  }
}

realm-app/production.json

{
  "config.json": {
    "app_id": "my-app-prod-abcde",
    "hosting": {
      "custom_domain": "app.my-app.io",
      "app_default_domain": "my-app-prod-abcde.mongodbstitch.com"
    },
    "custom_user_data_config": {
      "database_name": "my-app-prod"
    }
  },
  "values/defaultDatabase.json": {
    "value": "my-app-prod"
  },
  "services/mongodb-atlas/rules/*.json": {
    "database": "my-app-prod"
  }
}

I execute this with ts-node like so:

ts-node --project scripts/tsconfig.json scripts/search-replace.ts realm-app/production.json realm-app production-realm-app

It basically copies the app config files from realm-app to production-realm-app, and for every key in production.json (a file glob) it finds the matching files in the copy and deep-merges the corresponding replacement values into them.
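
For instance (file contents are illustrative), the "values/defaultDatabase.json" entry above turns the staging value file into a production one in the copied output:

realm-app/values/defaultDatabase.json (staging input):

{ "name": "defaultDatabase", "value": "my-app-staging" }

production-realm-app/values/defaultDatabase.json (after the deep merge):

{ "name": "defaultDatabase", "value": "my-app-prod" }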


I hope that all makes sense, feel free to use the code above if you choose to go down the same path as me.

3 Likes

Hi @Pavel_Duchovny and @kraenhansen,

Thank you very much for the responses. That’s a lot of great info and I am very grateful. In this case, are you not using the GitHub-linked automatic deployments? What kind of CI/CD tool are you using? Are you running this find/replace with some sort of CI/CD process that checks out the code and then runs your script, or are you just doing it locally/manually each time? I might try this path. Your script looks very robust, so thanks!

How can I stay up to date on what the team decides to do with the export/import formats?

Thanks for the shoutout!

The talk you linked to specifically explains how I handled using GitHub autodeployments for deploying to dev/qa/prod. Here is the GitHub repo with my Travis CI file and an explanation of how it all works together: GitHub - mongodb-developer/SocialStats.

Note that when I wrote this earlier this year, autodeploys only worked from the master branch, which is why I have three GitHub repos. I haven’t had time to revisit and rework it so that dev/qa/prod all live in the same GitHub repo.

1 Like

@Lauren_Schaefer thanks so much for posting, and I did watch your video on Travis CI and your SocialStats app. It was very informative, and I loved the part about unit testing Realm functions.

For my setup, I really need to do the whole find/replace thing for different database names. I don’t remember you mentioning find/replace or different DB names in the video, so I’m wondering if each DB had the same name, just in different Atlas projects?

I think something Realm could really benefit from is the concept of environments that you can assign apps to. Each environment would have its own DB, domain, and other relevant values stored in some kind of variable that you could sprinkle throughout the Realm code, like realmEnvironment.DATABASE or something similar.
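
To illustrate the idea (this is purely hypothetical, not an existing Realm API), a function could then read something like:

// Hypothetical sketch: "realmEnvironment" does not exist in Realm today
const db = context.services
  .get("mongodb-atlas")
  .db(realmEnvironment.DATABASE);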

1 Like

The databases and collections had the same names; I used a different Atlas project for each stage (dev/qa/prod). I did app configurations in a few different places:

Yes, these are a bit hacky, but it’s working for me. The Realm team is aware of the need for environment variables and is looking into solutions (though I can’t say if/when they’ll be coming). FYI @Drew_DiPalma

1 Like

Update: I started working on a new project today. I decided to try deploying from branches other than master, and it’s working great. The config we set up is:

  • Prod: Prod Cluster, Prod Realm App, Master branch
  • QA: QA Cluster, QA Realm App, QA branch
  • Dev: Each person has their own Dev Cluster, Dev Realm App, and Dev branch
3 Likes

I am really liking the use of separate clusters, and I am going to implement this. It has a lot of benefits and solves a lot of the complexity of having different Realm apps.

The one thing I wish automatic deploys had is a way to run some CI before the deploy goes out, like an intermediary step. This is where I would like to run a React build on my application before it goes into Realm. Otherwise I would be using the automatic deploys, but because of that limitation I am using GitHub Actions to deploy instead.

2 Likes

This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.