MongoDB Realm


Sync or Stink: Why Some Apps Can't Handle Mobile Data

Mobile apps have transformed the way businesses work with data. By automating and digitizing what were previously manual and error-prone tasks, mobile apps have made frontline workers more productive and the operational data they generate more actionable. But that is only true when those workers have modern mobile apps that are fully integrated with back-end systems at the core of the network. Yet many mobile apps are slow to load and sync data unreliably, frequently returning stale data or simply crashing.

The reason is that data sync is hard. Many development teams take a DIY approach to building sync, but building a sync tool that works in an offline-first environment, like mobile, can take months of complex work and require thousands of lines of code. And when development teams build mobile sync themselves, they often oversimplify the solution — so app data may only sync a few times a day or won’t sync bidirectionally. Building sync the right way — keeping data up to date in real time whenever devices are connected — requires complicated networking and conflict resolution code. MongoDB Realm eliminates this complexity. We cover many of the technical difficulties of developing sync tools, including the pitfalls of DIY approaches, in our recent white paper, “Unlocking Value in Mobile Apps: 3 Best Practices for Mobile Data Sync.”

Data sync is absolutely essential for delivering good customer experiences, harnessing the value of mobile data for the business, and ultimately proving that investments in mobile technology are worth it. From getting real-time information to frontline workers to tracking fleets and getting timely inventory updates, mobile data has the potential to deliver true competitive advantage, but only if you’re working with a highly performant mobile platform.

When apps work right, users reward you by using the app and, hopefully, leaving positive feedback. Another good sign of a well-built app is that users ask for new features. Development teams that don’t have to worry about complicated conflict resolution code and an overly complex app architecture can focus on quickly developing new features and, in turn, creating better app experiences.

MongoDB Realm and Realm Sync include pre-built conflict resolution out of the box. The Realm Mobile Database — used alongside Realm Sync — is object-oriented, so it’s intuitive to mobile developers. Realm enables developers to focus on delivering competitive differentiation instead of worrying about building complicated data sync and conflict resolution tools. This speeds development and enables teams to turn around feature requests faster. Realm Sync works alongside the Realm Mobile Database to synchronize data bidirectionally between the Realm Mobile Database on the client and MongoDB Atlas on the back end.

Find out more by reading “Unlocking Value in Mobile Apps: 3 Best Practices for Mobile Data Sync.”
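For developers who want a sense of what this looks like in practice, here is a minimal sketch of opening a synced realm with the Realm JavaScript SDK. The app ID, schema, and partition value are illustrative placeholders, not code from the white paper.

    import Realm from "realm";

    // Illustrative object schema -- Realm models data as plain objects.
    const TaskSchema = {
      name: "Task",
      primaryKey: "_id",
      properties: {
        _id: "objectId",
        _partition: "string",
        name: "string",
        done: "bool",
      },
    };

    async function openSyncedRealm() {
      // "yourapp-abcde" is a placeholder Realm app ID.
      const app = new Realm.App({ id: "yourapp-abcde" });
      const user = await app.logIn(Realm.Credentials.anonymous());

      // Realm Sync handles the networking and conflict resolution;
      // the app just reads and writes local objects.
      return Realm.open({
        schema: [TaskSchema],
        sync: { user, partitionValue: "demo" },
      });
    }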

November 3, 2021

Take Advantage of Low-Latency Innovation with MongoDB Atlas, Realm, and AWS Wavelength

The emergence of 5G networking signals future growth for low-latency business opportunities. Whether it’s the ever-popular world of gaming, AR/VR, AI/ML, or the more critical areas of autonomous vehicles or remote surgery, there’s never been a better opportunity for companies to leverage low-latency application services and connectivity. This kind of instantaneous communication through the power of 5G is still largely in its nascent development, but customers are adapting to its benefits quickly. New end-user expectations mean back-end service providers must meet growing demand. At the same time, business customers expect to have the ability to seamlessly deploy the same cloud-based back-end services that they’re familiar with, close to their data sources or end users.

With MongoDB Realm and AWS Wavelength, you can now develop applications that take advantage of the low latency and higher throughput of 5G—and you can do it with the same tools you’re familiar with. The following blog post explores the benefits of AWS Wavelength, MongoDB Atlas, and Realm, as well as how to set up and use each service in order to build better web and mobile applications and evolve user experience. We’ll also walk through a real-world use case, featuring a smart factory as the example.

Introduction to MongoDB Atlas & Realm on AWS

MongoDB Atlas is a global cloud database service for modern applications. Atlas is the best way to run MongoDB on AWS because, as a fully managed database-as-a-service, it offloads the burden of operations, maintenance, and security to the world’s leading MongoDB experts while running on industry-leading and reliable AWS infrastructure. MongoDB Atlas enables you to build applications that are highly available, performant at a global scale, and compliant with the most demanding security and privacy standards. When you use MongoDB Atlas on AWS, you can focus on driving innovation and business value instead of managing infrastructure. Services like Atlas Search, Realm, Atlas Data Lake, and more are also offered, making MongoDB Atlas the most comprehensive data platform in the market. MongoDB Atlas seamlessly integrates with many AWS products. Click here to learn more about common integration patterns.

Why use AWS Wavelength?

AWS Wavelength is an AWS infrastructure offering that is optimized for mobile edge computing applications. Wavelength Zones are AWS infrastructure deployments that embed AWS compute and storage services within communications service providers’ (CSP) data centers. AWS Wavelength allows customers to use industry-leading and familiar AWS tools while moving user data closer to them in 13 cities in the US as well as London, UK; Tokyo and Osaka, Japan; and Daejeon, South Korea. Pairing Wavelength with MongoDB’s flexible data model and responsive Realm database for mobile and edge applications, customers get a familiar platform that can run anywhere and scale to meet changing demands.

Why use Realm?

Realm’s integrated application development services make it easy for developers to build industry-leading apps on mobile devices and the web. Realm comes with three key features:

- Cross-platform mobile and edge database
- Cross-platform mobile and edge sync solution
- Time-saving application development services

1. Mobile & edge database

Realm’s mobile database is an open source, developer-friendly alternative to CoreData and SQLite. With Realm’s open source database, mobile developers can build offline-first apps in a fraction of the time.
Supported languages include Swift, C#, Xamarin, JavaScript, Java, React Native, Kotlin, and Objective-C. Realm’s database was built with a flexible, object-oriented data model, so it’s simple to learn and mirrors the way developers already code. Because it was built for mobile, applications built on Realm are reliable, highly performant, and work across platforms.

2. Mobile and edge sync solution

Realm Sync is an out-of-the-box synchronization service that keeps data up-to-date between devices, end users, and your backend systems, all in real time. It eliminates the need to work with REST, simplifying your offline-first app architecture. Use Sync to back up user data, build collaborative features, and keep data up to date whenever devices are online—without worrying about conflict resolution or networking code.

Figure 2: High-level architecture of implementing Realm in a mobile application

Powered by the Realm mobile and edge database on the client side and MongoDB Atlas on the backend, Realm is optimized for offline use and capable of scaling with you. Building a first-rate app has never been easier.

3. Application development services

With Realm app development services, your team can spend less time integrating backend data for your web apps, and more time building the innovative features that push your business initiatives forward. Services include:

- GraphQL
- Functions
- Triggers
- Data access controls
- User authentication

Reference Architecture

High-level design

Terminology-wise, we will be discussing three main tiers for data persistence: Far Cloud, Edge, and Mobile/IoT.

The Far Cloud is the traditional cloud infrastructure business customers are used to. Here, the main parent AWS regions (such as US-EAST-1 in Virginia, US-WEST-2 in Oregon, etc.) are used for centralized retention of all data. While these regions are well known and trusted, the issue is that not many users or IoT devices are located in close proximity to these massive data centers, and internet-routed traffic is not optimized for low latency.

As a result, we use AWS Wavelength Zones as our Edge Zones. An Edge Zone will synchronize the relevant subset of data from the centralized Far Cloud to the Edge. Partitioning principles are used such that users’ data will be stored closer to them in one or a handful of these Edge Wavelength Zones, typically located in major metropolitan areas.

The last layer of data persistence is on the mobile or IoT devices themselves. If on modern 5G infrastructure, data can be synchronized to a nearby Edge Zone with low latency. For less latency-critical applications, or in areas where the parent AWS regions are closer than the nearest Wavelength Zone, data can also go directly to the Far Cloud.

Figure 3: High-level design of modern edge-aware apps using 5G, Wavelength, and MongoDB

Smart factory use case: Using Wavelength, MQTT, & Realm Sync

Transitioning from the theoretical, let’s dig one level deeper into a reference architecture. One common use case for 5G and low-latency applications is a smart factory. Here, IoT devices in a factory can connect to 5G networks for both telemetry and command/control. Typically signaling over MQTT, these sensors can send messages to a nearby Wavelength Edge Zone. Once there, machine learning and analysis can occur at the edge, and data can be replicated back to the Far Cloud parent AWS regions. This is critical because compute capabilities at the edge, while low-latency, are not always full-featured.
As a result, centralizing data from many factories together makes sense for many applications as it relates to long-term storage, analytics, and multi-region sync. Once data is in the Edge or the Far Cloud, consumers of this data (such as AR/VR headsets, mobile phones, and more) can access it with low latency for needs such as maintenance, alerting, and fault identification.

Figure 4: High-level three-tiered architecture of what we will be building through this blog post

Latency-sensitive applications cannot simply write to Atlas directly. Instead, Realm is powerful here because it can run on mobile devices as well as on servers (such as in the Wavelength Zone) and provide low-latency local reads and writes. It will seamlessly synchronize data in real time from its local partition to the Far Cloud, and from the Far Cloud back or on to other Edge Zones. Developers do not need to write complex sync logic; instead they can focus on driving business value by writing applications that provide high performance and low latency.

For highly available applications, AWS services such as Auto Scaling groups can be used to meet the availability and scalability requirements of the individual factory. Traditionally, this would be fronted by a load-balancing service from AWS or an open-source solution like HAProxy. Carrier gateways are deployed in each Wavelength Zone, and the carrier or client can handle nearest-Edge-Zone routing.

Setting up Wavelength

Deploying your application into Wavelength requires the following AWS resources:

- A Virtual Private Cloud (VPC) in your region
- Carrier Gateway — a service that allows inbound/outbound traffic to/from the carrier network
- Carrier IP — an address that you assign to a network interface that resides in a Wavelength Zone
- A public subnet
- An EC2 instance in the public subnet
- An EC2 instance in the Wavelength Zone with a Carrier IP address

We will be following the “Get started with AWS Wavelength” tutorial located here. At least one EC2 compute instance in a Wavelength Zone will be required for the subsequent Realm section below. The high-level steps to achieve that are:

- Enable Wavelength Zones for your AWS account
- Configure networking between your AWS VPC and the Wavelength Zone
- Launch an EC2 instance in your public subnet. This will serve as a bastion host for the subsequent steps.
- Launch the Wavelength application
- Test connectivity

Setting up Realm

The Realm components we listed above can be broken out into three independent steps:

- Set up a Far Cloud MongoDB Atlas cluster on AWS
- Configure the Realm serverless infrastructure (including enabling Sync)
- Write a reference application utilizing Realm

1. Deploying your Far Cloud with Atlas on AWS

For this first section, we will be using a very basic Atlas deployment. For demonstration purposes, even the MongoDB Atlas Free Tier (called an M0) suffices. You can leverage the AWS MongoDB Atlas Quickstart to launch the cluster, so we will not enumerate the steps in specific detail. However, the high-level instructions are:

- Sign up for a MongoDB Atlas account at cloud.mongodb.com and then sign in
- Click the Create button to display the Create New Database Deployment dialog
- Choose a “Shared” cluster, then choose the size of M0 (free)
- Be sure to choose AWS as the cloud; here we will be using US-EAST-1
- Deploy and wait for the cluster to complete deployment

2. Configuring Realm and Realm Sync

Once the Atlas cluster has completed deploying, the next step is to create a Realm application and enable Realm Sync.
Realm has a full user interface inside the MongoDB Cloud platform at cloud.mongodb.com; however, it also has a CLI and API that allow connectivity to CI/CD pipelines and processes, including integration with GitHub. The steps we are following will be a high-level overview of a reference application located here. Since Realm configurations can be exported, the configuration can be imported into your environment from that repository. The high-level steps to create this configuration are as follows:

- While viewing your cluster at cloud.mongodb.com, click the Realm tab at the top
- Click “Create a New App” and give it a name such as RealmAndWavelength
- Choose the target cluster for sync to be the cluster you deployed in the previous step

Now we have a Realm app deployed. Next, we need to configure the app to enable sync. Sync requires credentials for each sync application. You can learn more about authentication here. Our application will use API Key authentication. To turn that on:

- Click Authentication on the left
- On the Authentication Providers tab, find API Keys, and click Edit
- Turn on the provider and Save

If Realm has Drafts enabled, a blue bar will appear at the top where you need to confirm your changes. Confirm and deploy the change. You can now create an API key by pressing the “Create API Key” button and giving it a name. Be sure to copy this down for our application later, as it cannot be retrieved again for security reasons. Also, in the top left of the Realm UI there is a button to copy the Realm App ID. We will need this ID and API key when we write our application shortly.

Lastly, we can enable Sync. The Sync configuration relies on a schema of the data being written. This allows the objects (e.g., C# or Node.js objects) from the application we are writing in the next step to be translated into MongoDB documents. You can learn more about schemas here. We also need to identify a partition key. Partition keys are used to decide what subset of data should reside on each Edge node or each mobile device. For Wavelength deployments, this is typically a variation on the region name. A good partition key could be one that is unique per API key, or the name of the Wavelength region (e.g. “BOS” or “DFW”). For the latter example, it would mean that your Far Cloud retains data for all zones, but the Wavelength Zone in Boston will only have data tagged with “BOS” in the _pk field.

There are two ways to define a schema: write the JSON by hand, or let Realm generate it automatically. For the former, we would go to the Sync configuration, edit the Configuration tab, choose the cluster we deployed earlier, define a partition key (such as _pk as a string), then define the rules of what that user is allowed to read and write. Then you must write the schema on the Schema section of the Realm UI. However, it is often easier to let Realm auto-detect and write the schema for you. This can be done by putting Sync into “Development Mode.” While you still choose the cluster and partition key, you only need to specify which database you want to sync all of your data to. After that, the application written below is where you define classes, and upon connection to Realm Sync, the Sync engine will translate the class you defined in your application into the underlying JSON representing that schema automatically.
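For reference, here is roughly what a generated schema could look like for the smart-factory data point we build in the next step, with _pk as the partition key. This is an illustrative sketch only; the schema Realm generates for you will mirror your own classes.

    {
      "title": "IOTDataPoint",
      "bsonType": "object",
      "required": ["_id", "_pk"],
      "properties": {
        "_id": { "bsonType": "objectId" },
        "_pk": { "bsonType": "string" },
        "device": { "bsonType": "string" },
        "reading": { "bsonType": "int" }
      }
    }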
3. Writing an application using Realm Sync: MQTT broker for a Smart Factory

Now that the back-end data storage is configured, it is time to write the application. As a reminder, we will be writing an MQTT broker for a smart factory. IoT devices will write MQTT messages to this broker over 5G, and our application will take that packet of information and insert it into the Realm database. After that, because we completed the sync configuration above, our Edge-to-Far-Cloud synchronization will be automatic. It also works bidirectionally.

The reference application mentioned above is available in this GitHub repository. It is based on creating a C# console application with the documentation here. The code is relatively straightforward:

- Create a new C# console application in Visual Studio
- Like any other C# console application, have it take in the Realm App ID and API key as CLI arguments. These should be passed in via Docker environment variables later; their values are the ones you recorded in the previous Sync setup step
- Define the RealmObject, which is the data model to write to Realm
- Process incoming MQTT messages and write them to Realm

The data model for Realm objects can be as complex as makes sense for your application. To prove this all works, we will keep a basic model:

    public class IOTDataPoint : RealmObject
    {
        // Primary key, stored as _id in MongoDB
        [PrimaryKey]
        [MapTo("_id")]
        public ObjectId Id { get; set; } = ObjectId.GenerateNewId();

        // Partition key, e.g. the Wavelength Zone name ("BOS", "DFW", ...)
        [MapTo("_pk")]
        public string Partition { get; set; }

        [MapTo("device")]
        public string DeviceName { get; set; }

        [MapTo("reading")]
        public int Reading { get; set; }
    }

To sync an object, it must inherit from the RealmObject class. After that, just define getters and setters for each data point you want to sync.

The C# implementation will vary depending on which MQTT library you choose. Here we have used MQTTnet, so we simply create a new broker with MqttFactory().CreateMqttServer(), then start it with a specific MqttServerOptionsBuilder where we define anything unique to your setup, such as port, encryption, and other basic broker information. However, we need to hook the incoming messages with .WithApplicationMessageInterceptor() so that any time a new MQTT packet comes into the broker, we send it to a method that writes it to Realm.

The actual Realm code is also simple:

- Create an App with App.Create(), which takes the App ID we are passing in as a CLI argument
- Log in with app.LogInAsync(Credentials.ApiKey()); the API key argument is again passed in as a CLI argument from what we generated before
- To insert into the database, all writes for Realm need to be done in a transaction. The syntax is straightforward: instantiate an object based on the RealmObject class we defined previously, then do the write with realm.Write(() => realm.Add(dataPoint))

Finally, we need to wrap this up in a Docker container for easy distribution. Microsoft has a good tutorial on how to run this application inside a Docker container with auto-generated Dockerfiles. On top of the auto-generated Dockerfile, be sure to pass the Realm App ID and API key arguments to the application as we defined earlier.

Learning the inner workings of writing a Realm application is largely outside the scope of this blog post. However, there is an excellent tutorial within MongoDB University if you would like to learn more about the Realm SDK. Now that the application is running, and in Docker, we can deploy it in a Wavelength Edge Zone as we created above.

Bringing Realm and Wavelength together

In order to access the application server in the Wavelength Zone, we must go through the bastion host we created earlier.
Once we’ve gone through that jump box to get to the EC2 instance in the Wavelength Zone, we can install any prerequisites (such as Docker) and start the Docker container running the Realm edge database and MQTT application. Any new inbound messages received by this MQTT broker will first be written to the Edge and then seamlessly synced to Atlas in the Far Cloud. There is a sample MQTT random number generator container suitable for testing this environment in the GitHub repository mentioned earlier.

Our smart factory reference application is complete! At this point:

- Smart devices can write to a 5G Edge with low latency, courtesy of AWS Wavelength Zones
- MQTT messages written to the broker in the Wavelength Zone have low-latency writes and are available immediately for reads, since everything happens at the Edge through MongoDB Realm
- Those messages are automatically synchronized to the Far Cloud for permanent retention, analysis, or synchronization to other zones via MongoDB Realm Sync and Atlas

What's Next

Get started with MongoDB Realm on AWS for free.

- Create a MongoDB Realm account
- Deploy a MongoDB backend in the cloud with a few clicks
- Start building with Realm
- Deploy AWS Wavelength in your AWS account

October 14, 2021

Atlas as a Service

Many of our customers provide MongoDB as a service to their development teams, where developers can request a MongoDB database instance and receive a connection string and credentials in minutes. As those customers move to MongoDB Atlas, they are similarly interested in providing the same level of timely service to their developers. Atlas has a very powerful control plane for provisioning clusters. In large organizations that have thousands of developers, however, it is not always practical to give so many people direct access to that interface. The goal of this article is to show how the Atlas APIs can be used to provide MongoDB as a service when MongoDB is managed by Atlas.

Specifically, we’ll demonstrate how an interface could be created that offers developers a set of choices for a MongoDB database instance. For simplicity, this represents how to provide developers with a list of memory and storage options to configure their cluster. Other options like cloud provider and region are abstracted away. We also demonstrate how to add labels to the Atlas clusters, which is a feature that the Atlas UI doesn’t support. For example, we’ve added a label for cluster description.

Architecture

Although the Atlas APIs could be called directly from the client interface, we elected to use a three-tier architecture, the benefits being:

- We can limit the exposed functionality to what is needed.
- We can simplify the APIs exposed to the front-end developers.
- We can secure the API endpoints more granularly.
- We can take advantage of other backend features such as Triggers, Twilio integration, etc.

Naturally, we selected Realm to host the middle tier.

Implementation

Backend Atlas API

The Atlas APIs are wrapped in a set of Realm Functions. For the most part, they all call the Atlas API as follows (this is getOneCluster):

    /*
     * Gets information about the requested cluster. If clusterName is empty, all clusters will be fetched.
     * See https://docs.atlas.mongodb.com/reference/api/clusters-get-one
     */
    exports = async function(username, password, projectID, clusterName) {
      const arg = {
        scheme: 'https',
        host: 'cloud.mongodb.com',
        path: 'api/atlas/v1.0/groups/' + projectID + '/clusters/' + clusterName,
        username: username,
        password: password,
        headers: {'Content-Type': ['application/json'], 'Accept-Encoding': ['bzip, deflate']},
        digestAuth: true
      };

      // The response body is a BSON.Binary object. Parse it and return.
      const response = await context.http.get(arg);
      return EJSON.parse(response.body.text());
    };

You can see each function’s source on GitHub.

MiniAtlas API

The next step was to expose the functions as endpoints that a frontend could use. Alternatively, we could have called the functions using the Realm Web SDK, but we elected to stick with the more familiar REST protocol for our frontend developers. Using Realm Third-Party Services, we developed the following six endpoints:

    API                      Method Type   Endpoint
    Get Clusters             GET           /getClusters
    Create a Cluster         POST          /createCluster
    Get Cluster State        GET           /getClusterState?clusterName:cn
    Modify a Cluster         PATCH         /modifyCluster
    Pause or Resume Cluster  POST          /pauseCluster
    Delete a Cluster         DELETE        /deleteCluster?clusterName:cn
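The Create a Cluster endpoint, for example, can wrap the Atlas “Create One Cluster” API using the same pattern as getOneCluster, this time with an HTTP POST and a request body. The following is a minimal sketch rather than the repository’s exact code; the cluster options passed in are illustrative:

    /*
     * Creates a cluster in the given project (sketch).
     * See https://docs.atlas.mongodb.com/reference/api/clusters-create-one/
     */
    exports = async function(username, password, projectID, clusterSpec) {
      const arg = {
        scheme: 'https',
        host: 'cloud.mongodb.com',
        path: 'api/atlas/v1.0/groups/' + projectID + '/clusters',
        username: username,
        password: password,
        headers: {'Content-Type': ['application/json'], 'Accept-Encoding': ['bzip, deflate']},
        digestAuth: true,
        // e.g. { name: "demo", providerSettings: { providerName: "AWS", instanceSizeName: "M10", regionName: "US_EAST_1" } }
        body: clusterSpec,
        encodeBodyAsJSON: true
      };

      const response = await context.http.post(arg);
      return EJSON.parse(response.body.text());
    };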
Here’s the source for getClusters. Note it pulls the username and password from Values & Secrets:

    /*
     * GET getClusters
     *
     * Query Parameters: None
     *
     * Response - Currently all values documented at https://docs.atlas.mongodb.com/reference/api/clusters-get-all/
     */
    exports = async function(payload, response) {
      var results = [];

      const username = context.values.get("username");
      const password = context.values.get("apiKey");
      const projectID = context.values.get("projectID");

      // Sending an empty clusterName will return all clusters.
      var clusterName = '';

      response = await context.functions.execute("getOneCluster", username, password, projectID, clusterName);
      results = response.results;
      return results;
    };

You can see each webhook’s source on GitHub. When the webhook is saved, a Webhook URL is generated, which is the endpoint for the API.

API Endpoint Security

Only authenticated users can execute the API endpoints. The caller must include an Authorization header containing a valid user ID, which the endpoint passes through this script function:

    exports = function(payload) {
      const headers = context.request.requestHeaders;
      const { Authorization } = headers;
      const user_id = Authorization.toString().replace(/^Bearer/, '');
      return user_id;
    };

MongoDB Realm includes several built-in authentication providers, including anonymous users, email/password combinations, API keys, and OAuth 2.0 through Facebook, Google, and Apple ID. For this exercise, we elected to use Google OAuth, primarily because it’s already integrated with our SSO provider here at MongoDB. The choice of provider isn’t important. Whatever provider or providers are enabled will generate an associated user ID that can be used to authenticate access to the APIs.

Frontend

The frontend is implemented in jQuery and hosted on Realm.

Authentication

The client uses the MongoDB Stitch Browser SDK to prompt the user to log into Google (if not already logged in) and sets the user’s Google credentials in the StitchAppClient:

    let credential = new stitch.GoogleRedirectCredential();
    client.auth.loginWithRedirect(credential);

Then, the user ID that must be sent in the API call to the backend can be retrieved from the StitchAppClient as follows:

    let userId = client.auth.authInfo.userId;

And set in the header when calling the API. Here’s an example calling the createCluster API:

    export const createCluster = (uid, data) => {
      let url = `${baseURL}/createCluster`;
      const params = {
        method: "post",
        headers: {
          "Content-Type": "application/json;charset=utf-8",
          ...(uid && { Authorization: uid })
        },
        ...(data && { body: JSON.stringify(data) })
      };
      return fetch(url, params)
        .then(handleErrors)
        .then(response => response.json())
        .catch(error => console.log(error));
    };

You can see all the API calls in webhooks.js.

Tips

We had great success using Postman Team Workspaces to share and validate the backend APIs.

Conclusions

This prototype was created to demonstrate what’s possible, which by now you hopefully realize is anything! The guts of the solution are here - how you wish to extend it is up to you. Learn more about the MongoDB Atlas APIs.

March 2, 2021

MongoDB Realm Sync is GA

Every mobile developer wants to build an app that users will love - meaning you want to build apps that work for users regardless of signal strength, that react to changes in data in real time, and that won’t drain your users’ battery life or use excessive amounts of data.

In June, we released MongoDB Realm, a set of integrated application development services that makes it possible for anyone to build a great app - whether you’re a solo developer working to stand up your idea, or part of a larger team shipping your latest release. As part of this, we announced a public preview for MongoDB Realm Sync, which makes it easier for you to keep data in sync across users, devices, and your back end, even when devices aren’t always online.

We’re excited to share that as of today, Realm Sync is now Generally Available (GA). We believe Realm Sync offers a best-in-class solution for offline-first app developers who need to move data between a local client and the cloud. With the Realm Sync service, we’ve significantly reduced the code you need to write, while also reducing the complexity of your app architecture. Crucially, we’ve done it while making sure everything is built to optimize for battery power, CPU, and bandwidth.

As a developer, you no longer need to write (or maintain) thousands of lines of complex conflict resolution and networking code. Realm Sync handles that for you, making it simple to move data between the local Realm Mobile Database and MongoDB Atlas on the back end. You can build features faster, reduce bugs, deliver a better user experience – and do it all without having to worry about standing up or scaling servers.

Download the MongoDB Realm Whitepaper

Building for an Offline-First Environment

To many development teams, synchronizing data between the client and your back end sounds simple. But when connectivity isn’t guaranteed, it becomes time-consuming and complex to achieve. MongoDB Realm simplifies data sync. Synchronization works bi-directionally, moving data between the Realm Mobile Database on the client side and MongoDB Atlas on the back end. Automatic conflict resolution resolves any data conflicts that may emerge across multiple devices, users, and your back end, and ensures data is consistent whenever mobile devices come online. Because data is synced to Atlas, applications can easily scale infrastructure up or down as app usage changes.

MongoDB Realm Sync also:

- Speeds feature innovation. Realm’s Mobile Database - used to store data locally on device - and MongoDB Realm Sync both reduce the code developers need to write, and free up time to focus on building new features that provide unique business value.
- Works across platforms. Realm Mobile Database and MongoDB Realm Sync work on any platform, for any mobile device.
- Is secure and stable. MongoDB Realm lets you encrypt data in-flight or at-rest, both in the cloud and on-device.

MongoDB Realm Sync in Action

Fortune 500 businesses and cutting-edge start-ups are already using the Realm Mobile Database and MongoDB Realm Sync to build their apps today. Srikanth Gandra, Director of Digital Technology for 7-Eleven, built a mobile app on MongoDB Realm that’s been successfully rolled out for use across the United States and Canada. “What we’ve created is really innovative. Since rolling this out to all 8,500 stores in North America, we’ve been able to sync data across more than 20,000 devices on a nearly real-time basis," he said.
"[Managers] can start using devices immediately, rather than waiting 2-3 minutes to download the data on initial startup, like they used to. Data accuracy - especially around inventory when sales happen or shipments arrive - has really improved.” “We’re evaluating using Realm and Realm Sync to assist with inbound and outbound parcel shipping use cases,” said James Fairweather, Chief Innovation Officer, Pitney Bowes. “As an example, we are exploring building an app on Realm for our front-line workers to scan a package that would automatically sync the data back to MongoDB Atlas providing consistent reporting and up-to-date logistics throughout the shipping journey.” With MongoDB Realm Sync, mobile developers have the tools to make data sync simple, making sure they both build apps fast, while still making sure that even complex components like real-time data sync are built right. Try MongoDB Realm Sync, and get started building your offline-first app. Try MongoDB Realm Sync Today

February 2, 2021

Appy Pie & MongoDB’s Seamless, No-Code Business Solutions for Mobile & Web Apps

The tech industry’s ceaseless and exponential growth is no longer a surprise. As long as clients and end users remain interested in faster, more efficient services, tech companies will continue to improve business processes to meet the demand. Simultaneously, these improvements will reduce costs and maximize revenue. It’s a win-win--if, of course, it’s done correctly.

So, what’s behind most success stories? How do some companies launch and maintain applications at such rapid, expansive scale? Often, the key to success lies in fostering core business processes that are driven by automation. For many tech-based organizations, Appy Pie Connect has been the go-to seamless integration platform that helps them get started. And now, with MongoDB Realm, it’s about to get even easier. Users build best-in-class apps across Android, iOS, and web with MongoDB Realm’s mobile database, sync solution, and application development services.

Why use Appy Pie & MongoDB Realm?

Together, Appy Pie and MongoDB are driving seismic operational change. Originally, Appy Pie’s AppMakr product moved to MongoDB Realm for local storage. But after experiencing the immense ease and advantages offered by Realm--specifically, its offline-first database that supports cross-platform app development--we decided to extend its benefits to the customers of Appy Pie Connect.

As an automation platform, Appy Pie Connect helps businesses automate manual tasks through smart integrations, allowing for intuitive, instant sharing between apps that are less commonly connected, like MailChimp and LinkedIn or Stripe and Gmail. By integrating MongoDB and MongoDB Realm with Appy Pie Connect, customers can easily store or retrieve data across multiple database sources. This enables the storage of flexible schemas and maintains consistency and integration.

This unique “no code” technology allows organizations to extract and work with data from MongoDB and then apply that data to desired software through trigger-based actions. For example, users can set up a trigger for every event on their Google Calendar so that their Slack status corresponds and is updated at the start and end of each meeting. This way, data concurrency is maintained without any manual effort. Realm is a particularly great choice for the customers at Appy Pie Connect because of its effortless data syncing. View some of the common use cases below.

Example 1

In the Meter Billing example below, cost is calculated in real time based on usage (e.g. video viewing time), resulting in a transparent “pay as you go” model.

Example 2

With real-time data sync, any updates or changes to the application are immediately reflected without requiring users to update or reinstall.

Example 3

All API failure logs are conveniently displayed to the admin on the Appy Pie dashboard so that immediate troubleshooting actions can be taken.

Benefits of MongoDB Realm and MongoDB Atlas

Appy Pie Connect already uses MongoDB Atlas, so moving to Realm -- a MongoDB product offered through Atlas -- was a natural choice. Realm allows mobile users to sync data quickly and seamlessly between mobile devices and backend systems, even if they go offline (sync occurs when they reconnect) -- and Atlas enables it all.
Some benefits of using Atlas are as follows:

- Scalability
- Flexible data schema
- Document-oriented storage
- Ad hoc queries, indexing, and real-time aggregation
- Powerful tools for data analysis
- Serverless functions

Ultimately, Appy Pie Connect helps businesses convert MongoDB into a central data store by pulling in and replicating data from all its sources. This allows customers to create new MongoDB documents automatically from new Typeform entries, new files on Dropbox, new posts on WordPress, or other resources. Similarly, Appy Pie Connect can also send MongoDB data to other third-party apps, including WordPress, Salesforce, Slack, Mailchimp, Google Drive, and many more. This makes enterprise-wide communication and collaboration much more efficient.

When data is pulled from MongoDB through automation, it can help streamline other areas of the business. For example, when you pull MongoDB data into MailChimp, you can automatically add a new subscriber on Mailchimp. This ensures that your lists grow automatically, as fast as your business does.

Ex. Appy Pie Connect seamlessly sends MongoDB data to third-party apps

Use cases

- Send data from MongoDB to LinkedIn (without any code!) to quickly post accurate job-related content
- Extract retail product data into Google Sheets to record valuable data in an organized manner in one place
- Post enterprise-related content from MongoDB to Twitter to streamline social media presence

How it works

Appy Pie Connect employs a trigger-action based function that allows you, as a platform user, to choose the two apps you want to connect. The process is very straightforward. Once you choose the apps you wish to integrate, you will be presented with multiple options to connect them. Simply click on the “Connect” button. To integrate with the selected applications’ accounts, simply allow API access for Appy Pie Connect. Next, design the workflows by mapping all of your data synced from the applications you are connecting. Once complete, you are ready to test your brand new Connect with your Trigger and Action apps. And that’s it! It is time to experience the magic of Appy Pie Connect at work.

Let the automation workflows take over the mundane, repetitive tasks, and move on to more innovative, exciting work. As the efficiency of an organization improves through automation, one of the most direct advantages is a marked reduction in cost. These integrations help save hundreds of hours of manual effort, thereby freeing up talented resources to instead focus their energy and intellect on more critical, innovative issues. With Appy Pie Connect and MongoDB Realm, businesses can ensure that their workforce is not only optimized but also inspired, a key factor in employee satisfaction and overall company success.

Watch this demo to learn how to integrate MongoDB with Google Sheets using Appy Pie Connect to help you automate data exchange between MongoDB and Google Sheets with ease.

Click here to learn more about MongoDB Realm

February 2, 2021

Build Better Mobile Apps -- Running MongoDB Realm and Google Cloud

We’re partnering with Google Cloud to offer MongoDB Realm as part of the MongoDB Cloud stack with Google Cloud to serve users globally, whether you’re building a new mobile app or modernizing an existing one. Realm’s integrated application development services make it easy for developers to build industry-leading apps on mobile devices and the web. With MongoDB Atlas running as a service with Google Cloud, it’s easy to connect your mobile database to Google services.

Customers choose Google Cloud to:

- avoid vendor lock-in by running multi-cloud and hybrid cloud deployments
- take advantage of Google Cloud’s machine learning and advanced analytics abilities
- stay secure with the same protections Google Cloud itself uses to guard their data, applications, and infrastructure.

Why MongoDB Realm for Mobile?

Realm comes with 3 key features:

- Cross-platform mobile database
- Cross-platform mobile sync solution
- Time-saving application development services

Mobile Database

Realm’s mobile database is an open source, developer-friendly alternative to CoreData and SQLite. With Realm’s open source database, mobile developers can build offline-first apps in a fraction of the time. Supported languages include Swift, C#, Xamarin, JavaScript, Java, React Native, Kotlin, and Objective-C. Realm’s database was built with a flexible, object-oriented data model, so it’s simple to learn and mirrors the way developers already code. Because it was built for mobile, applications built on Realm are reliable, highly performant, and work across platforms.

Sync Solution

Realm Sync is an out-of-the-box synchronization service that keeps data up-to-date between devices, end users, and your backend systems, all in real time. It eliminates the need to work with REST, simplifying your offline-first app architecture. Use Sync to back up user data, build collaborative features, and keep data up to date whenever devices are online - without worrying about conflict resolution or networking code. Powered by the Realm Mobile Database on the client side and MongoDB Atlas on the backend, Realm is optimized for offline use and scales with you. Building a first-rate app has never been easier.

Application Development Services

With Realm app development services, your team can spend less time integrating backend data for your web apps, and more time building the innovative features that push your business initiatives forward. Services include:

- GraphQL
- Functions
- Triggers
- Data access controls
- User authentication

Use these products from Google to accelerate the development and deployment of backend services:

- Google Kubernetes Engine (GKE)
- Google Cloud Functions (FaaS)
- Google App Engine (PaaS)

Realm and MongoDB Atlas with Google Cloud and Android

As Realm is a MongoDB product offered through Atlas, and Atlas is used by Realm to sync data between the database and clients, Google Cloud and Atlas capabilities are key to the Realm user experience.

Figure 1: Screenshot of Realm offered through MongoDB Cloud UI

MongoDB Atlas and Google Cloud

MongoDB Atlas delivers a fully managed service on Google Cloud’s globally scalable and reliable infrastructure. Atlas allows users to manage their MongoDB databases easily through the UI or an API call. It’s simple to migrate to, and offers sophisticated features such as Global Clusters that offer low-latency read and write access anywhere across the globe.
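Regardless of cloud provider, applications connect to an Atlas cluster with a single connection string. Here is a minimal Node.js sketch; the connection string, database, and collection names below are placeholders, not values from this article:

    const { MongoClient } = require("mongodb");

    // Placeholder connection string -- copy the real one from the Atlas UI.
    const uri = "mongodb+srv://<user>:<password>@cluster0.example.mongodb.net";

    async function main() {
      const client = new MongoClient(uri);
      await client.connect();

      // Atlas manages the cluster; the driver just reads and writes.
      const events = client.db("appdata").collection("events");
      await events.insertOne({ type: "login", at: new Date() });
      console.log(await events.countDocuments());

      await client.close();
    }

    main().catch(console.error);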
3 Key Abilities with MongoDB Atlas and Google Cloud

Geographic Presence

All Google Cloud regions have at least 3 availability zones, providing higher availability, resiliency, and geographic availability. Other public clouds do not have the same reliability guarantees.

Network Offering — Cost and Customer Benefits

- Global VPC - global resources that reduce complexity in networking implementation
- Performance - the premium tier leverages the performance of the Google Cloud network, improving application performance and latency across tiers
- Price - better pricing ratio for network egress costs

Native Integrations

- Security -- Atlas offers native integrations to Google Auth through Realm, support for Google Cloud KMS for additional encryption at rest or MongoDB Client-Side Field Level Encryption, and OAuth flow based console integration
- Billing -- pay-as-you-go billing on Google Cloud Marketplace (Realm is purchased through Atlas credits, similarly on Marketplace)

Realm and Android

With Realm, you can create mobile applications for Android devices. Realm supports all versions of the Android API after level 9 (Android 2.3 Gingerbread). Below is a sample reference architecture which shows how to leverage MongoDB Atlas with Google Cloud as an Operational Data Layer (ODL) / Operational Data Store (ODS) and build mobile applications using MongoDB Mobile and Realm Sync.

Figure 2: Reference Architecture for ODL on MongoDB Atlas and Realm with Google Cloud

Realm Customer Story — A Leading New York Healthcare Payer

MongoDB has partnered with Exafluence to deliver a COVID employee self-assessment health checker app for a leading healthcare payer in New York. Since the onset of the pandemic, they’ve needed to quickly adapt to new operational standards as the situation with COVID evolves. MongoDB Atlas, Realm, Google Cloud, and Exafluence have all been a key part of allowing their onsite operations to continue.

The CDC and New York State require organizations to keep track of which of their employees are reporting to a physical office for work. As a result, the organization must monitor their New York-based employees who still come onsite in order to support their members. They needed an app that would capture their employees’ health and ask a series of questions to determine if the associate was able to enter the facility.

Exafluence -- a MongoDB Global Strategic Partner working with the healthcare payer’s HR team and business team -- was able to deliver a complete solution in only three weeks from start to go-live. This rapid deployment was made possible using MongoDB Atlas, Realm, and Google Cloud. The completed app includes:

- support for mobile devices
- a web portal to aggregate information
- use of QR scans to confirm access on iPads deployed in facility entrances
- integration with Active Directory and alerts to the fund’s email system

The organization and Exafluence chose Realm because its application development services make it easy to work with data across both web and mobile applications. Realm works with React.js, provides offline sync, and is Atlas cloud ready. MongoDB Atlas and Realm also make it easy to rapidly develop new features when the next stage of the pandemic changes app requirements. Exafluence will be able to quickly add app features tied to vaccination, like the ability for employees to disclose and share immunization certification via MongoDB’s FHIR API.
Prior to the COVID app, this healthcare payer chose to use Atlas on Google Cloud because the fully managed, global DBaaS accelerates development and allows them to manage both structured and unstructured data. They also needed a solution for analytics involving geocoding, machine learning, and dashboarding. With Atlas and Google Cloud, their teams get agility, with elastic scaling and on-demand resource provisioning.

Additional differentiators that drove the organization to select Google Cloud include:

- Maps API
- Airflow for scheduling
- Cloud Identity
- Kubernetes deployment and seamless integration with MongoDB and Realm for mobile development
- Scalable VM environments
- Meeting CISO requirements

They were able to automate and offload operational tasks while taking advantage of built-in security best practices, and this in turn reduced regulatory risk. With Atlas and Google Cloud, their teams can also elastically scale and provision on-demand resources to build more microservices, in line with their agile development requirements.

Click here to learn more about MongoDB Realm

February 2, 2021

Showingly Transforms Real Estate with MongoDB Atlas and MongoDB Realm

Buying or selling a house is difficult. There are more steps than you could imagine, and each one feels harder than it should be. The improvements that technology has made in the past decades seem to have passed the industry by. Showingly is trying to make the process less difficult, starting by making it easier to see houses you might want to buy. Buyers can browse listings and book showings with no fuss. Sellers and their agents can make listings available on the app, making it easier for buyers to find them. Agents and other real estate professionals can simplify their workflows. Showingly built its full stack on MongoDB, using MongoDB Atlas and MongoDB Realm. I caught up with Andrew Coca, Co-Founder and President, to learn more.

Tell us a little bit about your company.

Showingly is a real estate platform for buyers, sellers, and professionals. Of course, it does everything you would expect: sellers can have their house listed on the platform, buyers can browse listings and see all of the important details, agents can manage their listings, and so on. What sets Showingly apart is that consumers can actually drive the showing process, instead of merely searching for homes. On most real estate platforms, if you find a house you’re interested in, you might see a list of times and dates to schedule an appointment, but you’re not actually booking a showing. Instead, the platform sells that to an agent as a lead, and the agent has to follow up with you – they may or may not be able to show you the house at that time. Showingly is an actual showing platform: when you book a showing, we’re really scheduling it for you in our backend. Until now, the home showing process has been prioritizing agent convenience instead of consumer convenience. Now, for the first time, consumers can have the transparency of directly booking the showings they want.

We also have features built for agents and other professionals. For example, agents are able to delegate showings. If you’re too busy, you can find another agent to show one of your listings; on the other side of the coin, if you have spare time, you can turn that into money by picking up delegated showings.

We’ve been building Showingly for two years, and we launched publicly five months ago. We’re currently live in Alaska, Arizona, Colorado, Hawaii, Massachusetts, South Carolina, Tennessee, and Utah. We continue to integrate with more multiple listing services (MLSs) to launch into new markets.

I have to ask: What has your experience been launching a real estate application during a global pandemic?

Early on, it made raising funding a little harder, but we were able to find investors who wanted to invest during COVID. Surprisingly, it then made hiring easier: people who had lost their internships or jobs came to work for us. We grew from a couple engineers to a dozen, and we’ve probably built as much in the last two months as we did in the year before. It hasn’t had an enormous effect at a business level. As you might know, real estate markets have reacted in very different ways to COVID. In the Northeast, the market is cautious, but here in Colorado, it’s booming.

What was the genesis of Showingly?

We wanted to start Showingly after learning the ins and outs of the industry through real estate sales. I’ve been an agent, I’ve managed a team of agents, and I’ve experienced a lot of parts of the process. Real estate is such an archaic industry.
Technology has done so much in the last 10 or 20 years, and the real estate industry hasn’t materially improved. Sure, there have been some new applications, but they don’t fundamentally change the process – they just put the old process into a pretty website. We saw a big opportunity to actually transform the industry.

How is Showingly using MongoDB?

Showingly is built fully on MongoDB. We’re storing all of our data in MongoDB Atlas, running on AWS. We have one main production cluster, plus a few dev and test clusters. Our application backend is built entirely on MongoDB Realm, using Realm Functions. It’s really nice to have it all in one place. We use functions for data movement, like retrieving listings from an MLS, as well as application functionality. When you take any action in the app – displaying listings, displaying showings, creating a new showing, updating a showing, and so on – that’s calling a Realm function to access the data in MongoDB Atlas. For the frontend, we have a cross-platform mobile app built with React Native, as well as a web client for agents and other professionals. It’s easy to connect to Realm and Atlas from those different clients.

What made you decide to use MongoDB Atlas and Realm’s application development services?

We went to MongoDB World last year, and learned about the document model, MongoDB Atlas, MongoDB Realm, and all of your other products. We knew then that it was the right way to build a simple but powerful architecture. MongoDB has made it easy to get started, but it will also scale with our business: we could go nationwide or worldwide, and MongoDB makes that easy. There’s no reason we would ever need to change our backend. As I mentioned, my background is in real estate, not technology. But over the past couple of years, I’ve gotten quite good at working with MongoDB Realm and MongoDB Atlas. The simplicity of JavaScript on the frontend and MongoDB on the backend makes it an easy stack to work with. And getting expert help from a consulting engineer quickly taught us the best practices for developing with MongoDB. Working with MongoDB let us develop a great application quickly.

And how would you describe the benefit of MongoDB to your business?

Time to market is certainly part of it, as I mentioned. But I wouldn’t have picked MongoDB just for that. Building for the long term is more important to me than time to market. I wouldn’t take on technical debt up front just to be able to move more quickly. I want to build a structure that lasts, and I’m confident that what we’re building with MongoDB Realm and MongoDB Atlas is just that.

You mentioned that the ability to handle scaling in the future was key for you. What are your plans for scale?

We actually use auto-scaling in Atlas, so our production cluster automatically scales up and down depending on workload. Right now, it’s usually either an M10 or an M20, fluctuating between them as needed. If and when the workload of our application increases beyond that level, Atlas will continue to scale up to match. It’s so easy to set up auto-scaling, so why do it ourselves? And we know that if we need to move to a multi-region cluster or global cluster, that’s very easy to do. And of course, MongoDB Realm is serverless and we don’t need to worry about scale on that side at all. We just define functions, and they run when needed at any scale.
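(For readers who haven’t used Realm Functions: calling one from a JavaScript client looks roughly like the sketch below. The app ID and function name are hypothetical, not Showingly’s actual code.)

    import * as Realm from "realm-web";

    // Placeholder Realm app ID.
    const app = new Realm.App({ id: "showingly-demo-abcde" });

    async function bookShowing(listingId, time) {
      // Authenticate however the app requires; anonymous auth keeps the sketch short.
      const user = await app.logIn(Realm.Credentials.anonymous());

      // Realm Functions are invoked like local methods on user.functions.
      return user.functions.createShowing({ listingId, time });
    }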
You said you worked with a MongoDB consulting engineer. Can you describe that process?

We’ve had several Flex Consulting engagements in the Design & Develop and Optimize tracks. Flex Consulting has been the key to making the best use of MongoDB Atlas and MongoDB Realm for our application.

We actually covered a number of different points in our consulting engagements. First and foremost was getting the schema design right. For example, our consulting engineer helped us model the data structure of listings and showings (e.g., embedding vs. linking information), and how to represent data in a way that matches how the application uses it. Getting that design right the first time definitely helped avoid more work down the road.

Advice from the MongoDB engineer also helped us control data quality when multiple people and processes can update records. We fetch listings from MLSs, and if all we had to do was present listings, it would be simple. But of course we’re also dealing with showings tied to listings, we’re enriching those records with other fields for our specific use, and there are cases where an agent might be modifying some of that extra data. So when we refresh listings from the MLS or make updates from the application, we need to make sure that those updates aren’t clobbering other data. Our consulting engineer helped us design Realm functions that would have the correct upsert behavior in all cases.

These consulting engagements were almost like school for us. We spent time with our consulting engineer understanding the technology and making the right design decisions, which was super valuable. Flex Consulting not only gave us the expert help we needed, but pulled us along the path to being expert MongoDB users ourselves.

What’s next for Showingly?

In the short term, it’s about growing use in the markets where Showingly is live, and launching into new markets. On the product side, we’re adding some social elements so that agents can see what their peers are up to. We’re also very eager to do more analysis of our data, integrating machine learning to do things such as improving pricing for some of our agent features.

What advice would you give to someone considering MongoDB for their next project?

First, take advantage of consulting. MongoDB’s consulting is the best way to build your project not only quickly, but correctly for the long term. Second, you’ll get the biggest benefit from utilizing the whole stack of Realm and Atlas together. There’s an enormous amount of convenience from having everything in one place.

In the long term, we want Showingly to be the real estate platform. To date, real estate hasn’t had a good platform that facilitates the process. If you’ve ever bought or sold a home, you know how convoluted it is. You should be able to do everything in a single platform: finding potential homes, getting pre-approvals, scheduling and going on showings, writing an offer, signing contracts and other documents, even closing and getting insurance. We want Showingly to be that platform, to turn a 30-day process into a 3-day process.

December 22, 2020

Part 1: The Modernization Journey with Exafluence and MongoDB

Welcome to the first in a series of conversations between Exafluence and MongoDB about how our partnership can use open source tools and the application of data, artificial intelligence/machine learning, and natural language processing to power your business’s digital transformation. In this installment, MongoDB Senior Partner Solutions Architect Paresh Saraf and Director for WW Partner Presales Prasad Pillalamarri sit down with Exafluence CEO Ravikiran Dharmavaram and exf Insights Co-Founder Richard Robins to discuss how to start the journey to build resilient, agile, and quick-to-market applications.

From Prasad Pillalamarri: I first met Richard Robins, MD & Co-Founder of exf Insights at Exafluence, back in June 2016 at a MongoDB World event. Their approach to building data-driven applications was fascinating to me. Since then, Exafluence has grown by leaps and bounds in the system integration space, and MongoDB has outperformed its peers in the database market. So Paresh and I decided to interview Richard to deep-dive into their perspective on modernization with MongoDB.

Prasad & Paresh: We first met the Exafluence team in 2016. Since then, MongoDB has created the Atlas cloud data platform that now supports multi-cloud clusters, and Exafluence has executed multiple projects on mainframe and legacy modernization. Could you share your perspective on the growth aspects and synergies of both companies from a modernization point of view?

Richard Robins: Paresh and Prasad, I’m delighted to share our views with you. We’ve always focused on what happens after you successfully offload read traffic from mainframes and legacy RDBMS to the cloud. That’s digital transformation and legacy app modernization. Early on, Exafluence made a bet that if the development community embraces something, we should, too. That’s how we locked in on MongoDB when we formed our company.

Having earned our stripes in the legacy data world, we knew that getting clients to MongoDB would mean mining the often poorly documented IP contained in the legacy code. That code is often where long-retired subject matter expert (SME) knowledge resides. To capture it, we built tools to scan COBOL/DB2 and stored procedures to reverse engineer the current state. This helps us move clients to a modern cloud-native application, and it’s an effective way to merge, migrate, and retire the legacy data stores all of our clients contend with. Once we’d mined the IP with those tools, we needed to provide forward-engineered transformation rules to reach the new MongoDB Atlas endpoint. Using a metadata-driven approach, we built a rules catalog that includes a full audit trail and a REST API to keep data governance programs and catalogs up to date as an additional benefit of our modernization efforts.

We’ve curated these tools as exf Insights, and we bring them to each modernization project. Essentially, we applied NLP, ML, and AI to data transformation to improve modernization analysts’ efficiency, and added a low-to-no-code transformation rule builder, complete with version control and rollback capabilities. All this has resulted in our clients getting world-class, resilient capabilities at a lower cost in less time. We’re delighted to say that our modernization projects have been successful by following simple tenets — to embrace what the development community embraces and to offer as much help as possible — embodied in the accelerator tools we’ve built. That’s why we are so confident we'll continue our rapid growth.
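Before moving on, a concrete picture may help: below is a purely illustrative sketch of what one entry in such a versioned, auditable rules catalog could look like as a MongoDB document. The field names and structure are assumptions for illustration, not the actual exf Insights schema.

```javascript
// Hypothetical shape of one transformation rule in a rules catalog, with the
// version history and audit trail needed for rollback and data governance.
const exampleRule = {
  ruleId: "cust-name-standardize",
  source: { system: "DB2", table: "CUSTMAST", column: "CUST_NM" },   // invented names
  target: { database: "customers", collection: "profiles", field: "name" },
  transform: "TRIM(UPPER(CUST_NM))",   // captured business logic, expressed as text
  version: 3,
  status: "active",
  audit: [
    { version: 3, changedBy: "analyst.a", changedAt: "2020-11-02", note: "added TRIM" },
    { version: 2, changedBy: "analyst.b", changedAt: "2020-09-15", note: "initial review" },
  ],
};

// A governance catalog could read active rules over a REST endpoint, and a rollback
// would simply re-activate an earlier version recorded in the audit array.
console.log(JSON.stringify(exampleRule, null, 2));
```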
P&P: How do you think re-architecting legacy applications with MongoDB as the core data layer will add value to your business?

RR: We believe that MongoDB Atlas will continue to be the developers’ go-to document database, and that we’ll see our business grow 200-300% over the next three years. With MongoDB Atlas and Realm we can provide clients with resilient, agile applications that scale, are easily upgraded, and are able to run on any cloud as well as the popular mobile iOS and Android devices. Digital transformation is key to remaining competitive and being agile going forward. With MongoDB Atlas, we can give our clients the same capabilities we all take for granted on our mobile apps: they’re resilient, easy to upgrade, usually real-time, scale via Kubernetes clusters, and can be rolled back quickly if necessary. Most importantly, they save our clients money and can be automatically deployed.

P&P: At a high level, how will Exafluence help customers take this journey?

RR: We’re unusual as a services firm in that we spend 20% of gross revenue on R&D, so our platform and approach are proven. Thus, relatively small teams for our healthcare, financial services, and Industry 4.0 clients can leverage our approach, platform, and tools to deliver advanced analytical systems that combine structured and unstructured data across multiple domains. We built our exf Insights accelerator platform using MongoDB and designed it for interoperability, too. On projects we often encounter legacy ETL and messaging tools. To show how easy it is, we recently integrated exf Insights with SAP HANA and the SAP Data Intelligence platform. Further, we can publish JSON code blocks and provide Python code for integration into ETL platforms like Informatica and Talend.

Our approach is to reverse engineer by mining IP from legacy data estates and then forward engineer the target data estate, using these steps and tools:

Reverse Engineer:
Extract stored procedures, business logic, and technical data from the legacy estate and load it into our platform.
Use our AI/ML/NLP algorithms to analyze business transformation logic and metadata, with outliers identified for cleansing.
Provide DB scans to assess legacy data quality, cleanse and correct outliers, and provide tools to compare DB-level data reconciliations.

Forward Engineer:
To produce a clean set of metadata and business transformation logic, baselined with version control, we:
Extract, transform, and load metadata to the target state.
Score metadata via NLP and ML to recommend matches to the analyst, who accepts, rejects, or overrides recommendations. Analysts can then add additional transformations, which are catalogued.
Deploy and load cleansed data to the target-state platform so any transformations and gold copies may be built.
Automate data governance via REST API and code block generation (Python/JSON) to provide enterprise catalogs with the latest transforms.
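To make the scoring step concrete, here is a toy sketch of recommending matches between legacy column names and target field names. It uses a simple string-similarity heuristic purely as a stand-in for the NLP/ML scoring exf Insights actually performs, and all of the column and field names are invented.

```javascript
// Toy illustration (not exf Insights code): score candidate matches between legacy
// column names and target field names so an analyst can accept, reject, or override.
function similarity(a, b) {
  // Normalized Levenshtein similarity in [0, 1].
  a = a.toLowerCase();
  b = b.toLowerCase();
  const d = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      d[i][j] = Math.min(
        d[i - 1][j] + 1,                                   // deletion
        d[i][j - 1] + 1,                                   // insertion
        d[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)  // substitution
      );
    }
  }
  return 1 - d[a.length][b.length] / Math.max(a.length, b.length);
}

const legacyColumns = ["CUST_NM", "CUST_ADDR_1", "ACCT_OPEN_DT"];            // invented
const targetFields = ["customerName", "addressLine1", "accountOpenedDate"];  // invented

// Recommend the best-scoring target for each legacy column; low scores are
// flagged for analyst review instead of being applied automatically.
for (const col of legacyColumns) {
  const ranked = targetFields
    .map((field) => ({ field, score: similarity(col, field) }))
    .sort((x, y) => y.score - x.score);
  const best = ranked[0];
  console.log(col, "->", best.field, best.score.toFixed(2), best.score < 0.5 ? "(review)" : "");
}
```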
P&P: What are your keys to a successful transformation journey?

RR: Over the past several years we’ve identified these elements and observations:
Subject matter experts and technologists must work together to provide new solutions.
There’s a shortage of skilled technologists able to write, deploy, and securely manage next-generation solutions. Using accelerators and transferring skills are vital to mitigating the skills shortage.
Existing IP that’s buried in legacy applications must be understood and mined in order for a modernization program to succeed.
A data-driven approach that combines reverse and forward engineering speeds migration and also provides new data governance and data science catalog capabilities.
The building, care, and feeding of new, open source-enabled applications is markedly different from the way monolithic legacy applications were built.
The document model enables analytics and interoperability.
Cybersecurity and data consumption patterns must be articulated and be part of the process, not afterthoughts.
Even with aggressive transformation plans, new technology must co-exist with legacy applications for some time; progress works best if it’s not a big bang.
Success requires business and technology to learn new ways to provide, acquire, and build agile solutions.

P&P: Can you talk about the solutions you have that will accelerate the modernization journey for customers?

RR: exf Insights helps our clients visualize what’s possible with extensive, pre-built, modular solutions for health care, financial services, and Industry 4.0. They show the power of MongoDB Atlas and also the power of speed layers using Spark and Confluent Kafka. These solutions are readily adaptable to client requirements and reduce the risk and time required to provide secure, production-ready applications.
Source data loading. Analyze and integrate raw structured and unstructured data, including support for reference and transactional data.
Metadata scan. Match data using AI/NLP, scoring results and providing side-by-side comparison.
Source alignment. Use ML to check underlying data and score results for analysts, and leverage that learning to accelerate future changes.
Codeless transformation. Empower data SMEs to build the logic with a multiple-sources-to-target approach and transformation rules that support code value lookups and complex Boolean logic. Includes versioned gold copies of any data type (e.g., reference, transaction, client, product).
Deployment. Deploy for scheduled or event-driven repeatability and dynamically populate Snowflake or other repositories. Generates code blocks that are usable in your estate or via REST API.

We used the same five-step workflow data scientists use when we enabled business analysts to accelerate the retirement of internal data stores and to build and deploy the COVID-19 self-checking app in three weeks, including Active Directory integration and downloadable apps. We will be offering a Realm COVID-19 screening app on web, Android, and iOS to the entire MongoDB Atlas community in addition to our own clients. The accelerator integrates key data governance tools, including exf Insights repository management of all sources and targets with versioned lineage; as-built transformation rules for internal and client implementations; and a business glossary integrated into metadata repositories.

P&P: Usually, one of the key challenges for businesses is data being locked in silos.

RR: We couldn’t agree more. Our data modernization projects routinely integrate with source transactional systems that were never built to work together. We provide scanning tools to understand disparate data as well as ways to ingest, align, and stitch it together. Using health care as an example, exf Insights provides a comprehensive analytical capability, able to integrate data from hospitals, claims, pharmaceutical companies, patients, and providers. Some of this is non-SQL data, such as radiological images; for pharma companies we provide capabilities to support clinical research organizations (CROs) via a follow-the-molecule approach.
Of course, we also have to work with and subscribe to Centers for Medicare & Medicaid Services (CMS) guidelines. Our data migration focuses on collecting the IP behind the data and making the source, logic, and any transformation rules available to our clients. In financial services, it’s critical to understand sources and targets. No matter how data is accessed (federated or direct store), with Spark and Kafka we can talk to just about any data repository.

P&P: Once we discover the data to be migrated, we need to model the data according to MongoDB’s data model paradigm. That requires multiple transformations before data is loaded to MongoDB. Can you explain more about how your accelerators help here?

RR: By understanding data consumption and then looking at existing data structures, we seek to simplify and then apply the capabilities of MongoDB’s document model. It’s not unlike what a data architect would do in the relational world, but with MongoDB Atlas it’s easier. We ourselves use MongoDB for our exf Insights platform to align, transform, and make data ready for consumption in new applications. We’re able to provide full rules lineage and audit trail, and even support rollback. For the real-time speed layer we use Spark and Kafka as well. This data-driven modernization approach also turns data governance into an active consumer of the rules catalog, so exf Insights works well for regulated industries.

P&P: It’s great that we have data migrated now. Consider a scenario where it’s a mainframe application and we have lots of COBOL code in there. It has to be moved to a new programming language like Python, with a change in the data access layer to point to MongoDB. Do you have accelerators which can facilitate the application migration? If so, how?

RR: Yes, we do have accelerators that understand the COBOL syntax to create JSON and ultimately Java, which speeds modernization. We also found we had to reverse engineer stored procedures as part of our client engagements for Exadata migration.

P&P: Once we migrate the data from legacy databases to MongoDB, validation is the key step. As this is a heterogeneous migration it can be challenging. How can Exafluence add value here?

RR: We’ve built custom accelerators that migrate data from the RDBMS world to MongoDB, and offer data comparisons as clients go from development to testing to production, documenting all data transformations along the way.

P&P: Now that we’ve talked about all your tools which can help in the modernization journey, can you tell us about how you have already helped your customers achieve this?

RR: Certainly. We’ve already outlined how we’ve created solution starters for modernization, with sample solutions as accelerators. But that’s not enough; our key tenet for successful modernization projects is pairing SMEs and developers. That’s what enables our joint client and Exafluence teams to understand the business, key regulations, and technical standards. Our data-driven focus lets us understand the data regardless of industry vertical. We’ve successfully used exf Insights now in financial services, healthcare, and Industry 4.0. Whether it’s understanding the nuances of financial instruments and data sources for reference and transactional data, or medical device IoT sensors in healthcare, or shop floor IoT and PLC data for predictive analytics and digital twin modeling, a data-driven approach reduces modernization risks and costs.
Below are some of the possibilities this data-driven approach has delivered for our healthcare clients using MongoDB Atlas. By aggregating provider, membership, claims, pharma, and EHR clinical data, we offer robust reporting that:
Transforms health care data from its raw form into actionable insights that improve member care quality, health outcomes, and satisfaction
Provides FHIR support
Surfaces trends and patterns in claims, membership, and provider data
Lets users access, visualize, and analyze data from different sources
Tracks provider performance and identifies operational inefficiencies

P&P: Thank you, Richard! Keep an eye out for upcoming conversations in our series with Exafluence, where we'll be talking about agility in infrastructure and data as well as interoperability.

MongoDB and Modernization
To learn more about MongoDB's overall modernization strategy, read here.

December 9, 2020

Designing a CLI: 11 Principles from the MongoDB Realm Design Team

Don Norman saw this coming. Back in 2007, Norman, one of the giants of user-centered design, predicted that the next UI breakthrough in the design world would be the command-line interface (CLI). Noting the limitations of graphical user interfaces and the capabilities of command-line language in search functions, Norman made this bold claim: “Command line interfaces are back again, hiding under the name of search. Now you see them, now you don't. Now you see them again. And they will get better and better with time: Mark my words, that is my prediction for the future of interfaces.”

In 2020, the return of the CLI is, perhaps, debatable. In the world of software development, CLIs still dominate among software professionals, due to their wide availability, high capacity for automation, and cultural fit with the developer ethos. But in the design world, “user experience” is still largely associated with modern web and mobile interfaces — clean whites, curved borders, and gradient buttons. Design interventions for the CLI are often left by the wayside.

Here at MongoDB, our designers understand the importance of the CLI to our users. Through internal discussion and collaboration with our product and engineering teams, we are working hard to match the user experience of the CLI to our users’ needs. As such, the MongoDB Realm team has been working on a revamp of our Realm CLI to be released by the end of Q4. In order to improve the experience of using the CLI, our UX research and Realm design teams have conducted primary and secondary research, attempting to figure out how our CLI is used and how developers commonly interact with CLIs writ large.

Based on our findings, we created a list of 11 CLI UX principles for the Realm CLI. We call this our CLI Design Cheat Sheet. Modeled on Nielsen Norman’s Usability Heuristics, this set of principles has allowed us to inform and streamline our CLI design process, and to foster a user-centered approach across our product’s operations. It’s my hope that presenting these principles will help others design better CLIs, too. We’ll provide some CLI illustrations that are not representative of the new Realm CLI, but can help show these principles visually.

1. Allow users to create and clone Realm applications and assets via the CLI
Develop short and easy-to-understand commands for app creation and cloning. Consider dividing each command into an object and an action: for example, the command may read realm app clone, where “app” is the object and “clone” is the action.

2. Use accessible language to bridge the CLI and the real world
The system should speak the users’ language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms.
When creating CLI commands and prompts, use questions or phrases that resemble sentences.
Avoid positional arguments, where the order matters; these types of arguments can be confusing. Use flags instead of args. Although they require a little more typing, flags better prevent input errors (e.g., realm fork --from sourceapp --to destapp) compared to args (e.g., realm fork -sourceapp).
When collecting user information, use questions to make the CLI more conversational. For example, when initializing, the CLI can ask the user questions such as “What’s your project name?” or a question that determines the user’s framework.

3. Simplify CLI outputs to increase user control
Complicated and messy outputs reduce user control and diffuse action. CLI users expect high-level outputs following a command.
Make outputs simple so that users don’t need to scroll through multiple lines of text to find what they need. The MongoDB CLI provides a good example of human-friendly CLI output:
Human-friendly MongoDB CLI output.
Additionally, the CLI should allow for both human-friendly (plaintext) output and machine-friendly (JSON) output. A user should be able to define which output they’d like to see. For example, the MongoDB CLI has the following functionality that allows users to define their output as plaintext or JSON:
Illustration: Allowing users to choose between different output formats.

4. Make terminologies consistent
Try to use the same terminology consistently across your product’s system. In the case of Realm, the product’s CLI should use the same terminology as its GUI, as well as other CLIs across the MongoDB platform. Additionally, try to draw from the user’s context with terminology from the tools and CLIs that are already familiar to your users.

5. Prevent errors
As mentioned before, the use of flags, as opposed to args, is one way to accomplish this. Another consideration is to show clear warnings and require users to re-type input; these can better prevent destructive mistakes. For instance, instead of using a “Y/N” prompt, consider having the user re-type their input, prefaced with a warning that the action may result in drastic changes to their program. (A minimal sketch of this retype pattern appears at the end of this post.)
Illustration: Providing a warning and a retype command to prevent destructive errors.

6. Maximize user recognition by combining commands or prompting input
Make the CLI experience more efficient and easy to use by easing the user’s memory load. One way to do that is by giving users a single command to perform a task, which doesn’t require them to remember certain inputs. For MongoDB Realm, this principle can be applied to improve our authentication experience: a single command can be called to automatically generate CLI credentials from the Realm CLI for Atlas. If it isn’t possible or ideal to combine several commands, then consider a prompt showing users a choice of complicated options in the CLI. Rather than asking users to type and remember an input, the CLI can show them a series of options.
Illustration: A prompt suggesting complicated inputs that are hard to remember.

7. Make the CLI more flexible by allowing users to easily set and change configuration options
Understood generally, this principle is meant to highlight user freedom and customization capability. The CLI should provide straightforward commands that allow users to easily set and change things as they wish. In the case of Realm, we are looking to make MongoDB auth configuration file creation automatic upon Realm initialization, with the access token stored in the user’s home folder. This makes configuration more efficient. Additionally, we are considering allowing users to change their config file directly in the CLI to increase user flexibility.

8. Implement aesthetics and interactions
There are different visual considerations that can improve CLI interactions and experiences. Here are a few examples:
Adding better color support (e.g., yellow or green = good; red = wrong)
Adding visual hierarchy for tables (e.g., making headers stand out through highlighting)
Adding spinners, progress bars, and/or step counts to show long-running tasks
Illustrations: Different ways to show task runtime.
9. Help users recognize, diagnose, and recover from errors
Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution. They should be informative and should indicate next steps. Consider linking to a troubleshooting guide or relevant documentation as part of your error messages.

10. Help and documentation
For users who decide not to go through the CLI docs, the help command is vital. We recommend writing a small description of the CLI’s purpose in the --help command. Also, consider providing links to documentation when necessary (e.g., as part of an error message). An example of an effective --help output is seen in the MongoDB CLI:
The MongoDB CLI's --help output is both descriptive and concise.

11. Maintain user control and freedom by allowing users to opt out of data collection
Allow users to control data collection settings from their config in the CLI. This functionality can be provided with a command. In the case of Realm, this may look like: --realm telemetry-disable.

We hope these principles provide a solid foundation for designers and developers looking to create effective, user-centered CLIs. Be on the lookout for a new version of our Realm CLI by the end of Q4! Interested in trying MongoDB Realm? Start building your app today!
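As promised under principle 5, here is a minimal sketch, in Node.js, of the retype-to-confirm pattern for destructive commands. It is an illustration only, not Realm CLI code; the command behavior and app name are made up.

```javascript
// Toy illustration of principle 5: require the user to re-type the app name
// before a destructive action, instead of a quick "Y/N" prompt.
const readline = require("readline");

function confirmDestructiveAction(appName) {
  const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
  return new Promise((resolve) => {
    console.log(`Warning: this will permanently delete "${appName}" and all of its data.`);
    rl.question("Type the app name to confirm: ", (answer) => {
      rl.close();
      resolve(answer.trim() === appName);
    });
  });
}

// Hypothetical usage inside a "delete app" command handler.
confirmDestructiveAction("my-realm-app").then((confirmed) => {
  if (confirmed) {
    console.log("Deleting app...");   // the real work would happen here
  } else {
    console.log("App name did not match; nothing was deleted.");
  }
});
```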

October 21, 2020