
Take Advantage of Low-Latency Innovation with MongoDB Atlas, Realm, and AWS Wavelength

The emergence of 5G networking signals future growth for low-latency business opportunities. Whether it’s the ever-popular world of gaming, AR/VR, and AI/ML, or the more critical areas of autonomous vehicles and remote surgery, there’s never been a better opportunity for companies to leverage low-latency application services and connectivity. This kind of instantaneous communication through the power of 5G is still largely in its nascent development, but customers are adapting to its benefits quickly. New end-user expectations mean back-end service providers must meet growing demand. At the same time, business customers expect to be able to seamlessly deploy the same cloud-based back-end services that they’re familiar with, close to their data sources or end users. With MongoDB Realm and AWS Wavelength, you can now develop applications that take advantage of the low latency and higher throughput of 5G—and you can do it with the same tools you’re familiar with. The following blog post explores the benefits of AWS Wavelength, MongoDB Atlas, and Realm, as well as how to set up and use each service in order to build better web and mobile applications and evolve the user experience. We’ll also walk through a real-world use case, featuring a smart factory as the example.

Introduction to MongoDB Atlas & Realm on AWS

MongoDB Atlas is a global cloud database service for modern applications. Atlas is the best way to run MongoDB on AWS because, as a fully managed database-as-a-service, it offloads the burden of operations, maintenance, and security to the world’s leading MongoDB experts while running on industry-leading and reliable AWS infrastructure. MongoDB Atlas enables you to build applications that are highly available, performant at global scale, and compliant with the most demanding security and privacy standards. When you use MongoDB Atlas on AWS, you can focus on driving innovation and business value instead of managing infrastructure.
Services like Atlas Search, Realm, Atlas Data Lake, and more are also offered, making MongoDB Atlas the most comprehensive data platform in the market. MongoDB Atlas seamlessly integrates with many AWS products. Click here to learn more about common integration patterns.

Why use AWS Wavelength?

AWS Wavelength is an AWS infrastructure offering that is optimized for mobile edge computing applications. Wavelength Zones are AWS infrastructure deployments that embed AWS compute and storage services within communications service providers’ (CSP) data centers. AWS Wavelength allows customers to use industry-leading and familiar AWS tools while moving user data closer to them in 13 cities in the US, as well as London, UK; Tokyo and Osaka, Japan; and Daejeon, South Korea. Pairing Wavelength with MongoDB’s flexible data model and responsive Realm database for mobile and edge applications, customers get a familiar platform that can run anywhere and scale to meet changing demands.

Why use Realm?

Realm’s integrated application development services make it easy for developers to build industry-leading apps on mobile devices and the web. Realm comes with three key features:

- Cross-platform mobile and edge database
- Cross-platform mobile and edge sync solution
- Time-saving application development services

1. Mobile & edge database

Realm’s mobile database is an open source, developer-friendly alternative to CoreData and SQLite. With Realm’s open source database, mobile developers can build offline-first apps in a fraction of the time. Supported languages include Swift, C#, Xamarin, JavaScript, Java, React Native, Kotlin, and Objective-C. Realm’s database was built with a flexible, object-oriented data model, so it’s simple to learn and mirrors the way developers already code. Because it was built for mobile, applications built on Realm are reliable, highly performant, and work across platforms.

2. Mobile and edge sync solution

Realm Sync is an out-of-the-box synchronization service that keeps data up to date between devices, end users, and your backend systems, all in real time. It eliminates the need to work with REST, simplifying your offline-first app architecture. Use Sync to back up user data, build collaborative features, and keep data up to date whenever devices are online—without worrying about conflict resolution or networking code.

Figure 2: High-level architecture of implementing Realm in a mobile application

Powered by the Realm mobile and edge database on the client side and MongoDB Atlas on the backend, Realm is optimized for offline use and capable of scaling with you. Building a first-rate app has never been easier.

3. Application development services

With Realm app development services, your team can spend less time integrating backend data for your web apps, and more time building the innovative features that push your business initiatives forward. Services include:

- GraphQL
- Functions
- Triggers
- Data access controls
- User authentication

Reference Architecture

High-level design

Terminology-wise, we will be discussing three main tiers for data persistence: Far Cloud, Edge, and Mobile/IoT. The Far Cloud is the traditional cloud infrastructure business customers are used to. Here, the main parent AWS Regions (such as US-EAST-1 in Virginia, US-WEST-2 in Oregon, etc.) are used for centralized retention of all data. While these regions are well known and trusted, not many users or IoT devices are located in close proximity to these massive data centers, and internet-routed traffic is not optimized for low latency. As a result, we use AWS Wavelength Zones as our Edge Zones. An Edge Zone will synchronize the relevant subset of data from the centralized Far Cloud to the Edge.
Partitioning principles are used such that users’ data will be stored closer to them in one or a handful of these Edge Wavelength Zones, typically located in major metropolitan areas. The last layer of data persistence is on the mobile or IoT devices themselves. On modern 5G infrastructure, data can be synchronized to a nearby Edge Zone with low latency. For less latency-critical applications, or in areas where the parent AWS Regions are closer than the nearest Wavelength Zone, data can also go directly to the Far Cloud.

Figure 3: High Level Design of modern edge-aware apps using 5G, Wavelength, and MongoDB

Smart factory use case: Using Wavelength, MQTT, & Realm Sync

Transitioning from the theoretical, let’s dig one level deeper into a reference architecture. One common use case for 5G and low-latency applications is a smart factory. Here, IoT devices in a factory can connect to 5G networks for both telemetry and command/control. Typically signaling over MQTT, these sensors can send messages to a nearby Wavelength Edge Zone. Once there, machine learning and analysis can occur at the edge, and data can be replicated back to the Far Cloud parent AWS Regions. This is critical because compute capabilities at the edge, while low latency, are not always full featured. As a result, centralizing many factories together makes sense for needs such as long-term storage, analytics, and multi-region sync. Once data is in the Edge or the Far Cloud, consumers of this data (such as AR/VR headsets, mobile phones, and more) can access it with low latency for needs such as maintenance, alerting, and fault identification.

Figure 4: High-level three-tiered architecture of what we will be building through this blog post

Latency-sensitive applications cannot simply write to Atlas directly. Realm is powerful here because it can run on mobile devices as well as on servers (such as in the Wavelength Zone) and provide low-latency local reads and writes.
It will seamlessly synchronize data in real time from its local partition to the Far Cloud, and from the Far Cloud back or to other Edge Zones. Developers do not need to write complex sync logic; instead, they can focus on driving business value by writing applications that provide high performance and low latency. For highly available applications, AWS services such as Auto Scaling groups can be used to meet the availability and scalability requirements of the individual factory. Traditionally, this would be fronted by a load-balancing service from AWS or an open-source solution like HAProxy. Carrier gateways are deployed in each Wavelength Zone, and the carrier or client can handle routing to the nearest Edge Zone.

Setting up Wavelength

Deploying your application into Wavelength requires the following AWS resources:

- A Virtual Private Cloud (VPC) in your region
- A carrier gateway — a service that allows inbound/outbound traffic to/from the carrier network
- A Carrier IP — an address that you assign to a network interface that resides in a Wavelength Zone
- A public subnet
- An EC2 instance in the public subnet
- An EC2 instance in the Wavelength Zone with a Carrier IP address

We will be following the “Get started with AWS Wavelength” tutorial located here. At least one EC2 compute instance in a Wavelength Zone will be required for the subsequent Realm section below. The high-level steps to achieve that are:

1. Enable Wavelength Zones for your AWS account
2. Configure networking between your AWS VPC and the Wavelength Zone
3. Launch an EC2 instance in your public subnet. This will serve as a bastion host for the subsequent steps.
4. Launch the Wavelength application
5. Test connectivity

Setting up Realm

The Realm components we listed above can be broken out into three independent steps:

1. Set up a Far Cloud MongoDB Atlas cluster on AWS
2. Configure the Realm serverless infrastructure (including enabling sync)
3. Write a reference application utilizing Realm

1. Deploying your Far Cloud with Atlas on AWS

For this first section, we will be using a very basic Atlas deployment. For demonstration purposes, even the MongoDB Atlas free tier (called an M0) suffices. You can leverage the AWS MongoDB Atlas Quick Start to launch the cluster, so we will not enumerate the steps in specific detail. However, the high-level instructions are:

1. Sign up for a MongoDB Atlas account and then sign in
2. Click the Create button to display the Create New Database Deployment dialog
3. Choose a “Shared” cluster, then choose the size of M0 (free)
4. Be sure to choose AWS as the cloud; here we will be using US-EAST-1
5. Deploy and wait for the cluster to complete deployment

2. Configuring Realm and Realm Sync

Once the Atlas cluster has finished deploying, the next step is to create a Realm application and enable Realm Sync. Realm has a full user interface inside of the MongoDB Cloud Platform; it also has a CLI and API that allow connectivity to CI/CD pipelines and processes, including integration with GitHub. The steps we are following are a high-level overview of a reference application located here. Since Realm configurations can be exported, the configuration can be imported into your environment from that repository. The high-level steps to create this configuration are as follows:

1. While viewing your cluster, click the Realm tab at the top
2. Click “Create a New App” and give it a name such as RealmAndWavelength
3. Choose the target cluster for sync to be the cluster you deployed in the previous step

Now we have a Realm app deployed. Next, we need to configure the app to enable sync. Sync requires credentials for each sync application. You can learn more about authentication here. Our application will use API key authentication. To turn that on:

1. Click Authentication on the left
2. On the Authentication Providers tab, find API Keys, and click Edit
3. Turn on the provider and Save

If Realm has drafts enabled, a blue bar will appear at the top where you need to confirm your changes. Confirm and deploy the change. You can now create an API key by pressing the “Create API Key” button and giving it a name. Be sure to copy it down for our application later, as it cannot be retrieved again for security reasons. Also, in the top left of the Realm UI there is a button to copy the Realm App ID. We will need this ID and API key when we write our application shortly. Lastly, we can enable Sync. The Sync configuration relies on a schema of the data being written. This allows the objects (i.e., C# or Node.js objects) from the application we are writing in the next step to be translated to MongoDB documents. You can learn more about schemas here. We also need to identify a partition key. Partition keys are used to decide what subset of data should reside on each Edge node or each mobile device. For Wavelength deployments, this is typically a variation on the region name. A good partition key could be a unique one per API key or the name of the Wavelength Zone (e.g., “BOS” or “DFW”). For this latter example, it would mean that your Far Cloud retains data for all zones, but the Wavelength Zone in Boston will only have data tagged with “BOS” in the _pk field. The two ways to define a schema are to write the JSON by hand or to generate it automatically. For the former, we would go to the Sync configuration, edit the Configuration tab, choose the cluster we deployed earlier, define a partition key (such as _pk as a string), and then define the rules of what that user is allowed to read and write. Then you must write the schema in the Schema section of the Realm UI. However, it is often easier to let Realm auto-detect and write the schema for you.
This can be done by putting Sync into “Development Mode.” While you still choose the cluster and partition key, you only need to specify which database you want to sync your data to. After that, in the application written below you can define classes, and upon connection to Realm Sync, the Sync engine will automatically translate each class you define in your application into the underlying JSON representing that schema.

3. Writing an application using Realm Sync: MQTT broker for a smart factory

Now that the back-end data storage is configured, it is time to write the application. As a reminder, we will be writing an MQTT broker for a smart factory. IoT devices will write MQTT messages to this broker over 5G, and our application will take that packet of information and insert it into the Realm database. After that, because we completed the sync configuration above, Edge-to-Far-Cloud synchronization is automatic, and it works bidirectionally. The reference application mentioned above is available in this GitHub repository. It is based on creating a C# console application with the documentation here. The code is relatively straightforward:

1. Create a new C# console application in Visual Studio
2. Like any other C# console application, have it take the Realm App ID and API key as CLI arguments. These should be passed in via Docker environment variables later, and their values are the ones you recorded in the previous Sync setup step
3. Define the RealmObject that is the data model to write to Realm
4. Process incoming MQTT messages and write them to Realm

The data model for Realm objects can be as complex as makes sense for your application.
To prove this all works, we will keep a basic model:

    public class IOTDataPoint : RealmObject
    {
        [PrimaryKey]
        [MapTo("_id")]
        public ObjectId Id { get; set; } = ObjectId.GenerateNewId();

        [MapTo("_pk")]
        public string Partition { get; set; }

        [MapTo("device")]
        public string DeviceName { get; set; }

        [MapTo("reading")]
        public int Reading { get; set; }
    }

To sync an object, it must inherit from the RealmObject class. After that, just define getters and setters for each data point you want to sync. The C# implementation will vary depending on which MQTT library you choose. Here we used MQTTnet, so we simply create a new broker with MqttFactory().CreateMqttServer(), then start it with a specific MqttServerOptionsBuilder, where we define anything unique to your setup, such as port, encryption, and other basic broker information. We also hook incoming messages with .WithApplicationMessageInterceptor() so that any time a new MQTT packet arrives at the broker, we send it to a method that writes it to Realm. The actual Realm code is also simple:

1. Create an App with App.Create(), which takes the App ID we are passing in as a CLI argument
2. Log in with app.LogInAsync(Credentials.ApiKey()), with the API key again passed in as a CLI argument from what we generated before
3. To insert into the database, all writes in Realm must be done inside a transaction. The syntax is straightforward: instantiate an object based on the RealmObject class we defined previously, then perform the write with realm.Write(() => realm.Add(message))

Finally, we wrap this up in a Docker container for easy distribution. Microsoft has a good tutorial on how to run this application inside of a Docker container with auto-generated Dockerfiles. On top of the auto-generated Dockerfile, be sure to pass the Realm App ID and API key arguments to the application as we defined earlier.
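For illustration only, the broker’s write path can be sketched in Python (the reference app itself is C#): every intercepted MQTT message is tagged with the zone’s partition key and written to the local edge store, which Realm Sync would then replicate to the Far Cloud. Names such as EdgeStore and handle_message are hypothetical, not part of the Realm SDK or MQTTnet.

```python
from dataclasses import dataclass, field
from typing import List

PARTITION = "BOS"  # this edge zone's partition key, stored in _pk

@dataclass
class IOTDataPoint:
    partition: str   # maps to "_pk"
    device: str      # maps to "device"
    reading: int     # maps to "reading"

@dataclass
class EdgeStore:
    """Stands in for the local Realm database running at the edge."""
    docs: List[IOTDataPoint] = field(default_factory=list)

    def write(self, point: IOTDataPoint) -> None:
        # Realm requires writes inside a transaction; a plain append
        # stands in for realm.Write(() => realm.Add(point)).
        self.docs.append(point)

def handle_message(store: EdgeStore, device: str, reading: int) -> IOTDataPoint:
    """Called once per intercepted MQTT message."""
    point = IOTDataPoint(partition=PARTITION, device=device, reading=reading)
    store.write(point)
    return point

store = EdgeStore()
handle_message(store, "sensor-7", 42)
handle_message(store, "sensor-9", 17)

# Only documents tagged with this zone's key live at the edge;
# the Far Cloud retains data for all zones.
bos_docs = [d for d in store.docs if d.partition == PARTITION]
```

The point of the sketch is the partitioning semantics: the edge node only ever holds documents tagged with its own key, while sync fans everything out to the central cluster.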
Learning the inner workings of writing a Realm application is largely outside the scope of this blog post. However, there is an excellent tutorial within MongoDB University if you would like to learn more about the Realm SDK. Now that the application is running, and in Docker, we can deploy it in a Wavelength Edge Zone as created above.

Bringing Realm and Wavelength together

In order to access the application server in the Wavelength Zone, we must go through the bastion host we created earlier. Once we’ve gone through that jump box to get to the EC2 instance in the Wavelength Zone, we can install any prerequisites (such as Docker) and start the Docker container running the Realm edge database and MQTT application. Any new inbound messages received by this MQTT broker will first be written to the Edge and seamlessly synced to Atlas in the Far Cloud. There is a sample MQTT random-number-generator container suitable for testing this environment in the GitHub repository mentioned earlier. Our smart factory reference application is complete! At this point:

- Smart devices can write to a 5G Edge with low latency, courtesy of AWS Wavelength Zones
- MQTT messages written to the broker in the Wavelength Zone have low-latency writes and are available immediately for reads, since everything happens at the Edge through MongoDB Realm
- Those messages are automatically synchronized to the Far Cloud for permanent retention, analysis, or synchronization to other zones via MongoDB Realm Sync and Atlas

What's Next

Get started with MongoDB Realm on AWS for free:

1. Create a MongoDB Realm account
2. Deploy a MongoDB backend in the cloud with a few clicks
3. Start building with Realm
4. Deploy AWS Wavelength in your AWS account

October 14, 2021

Build a Single View of Your Customers with MongoDB Atlas and Cogniflare's Customer 360

The key to successful, long-lasting commerce is knowing your customers. If you truly know your customers, then you understand their needs and wants and can identify the right product to deliver to them—at the right time and in the right way. However, for most B2C enterprises, building a single view of the customer poses a major hurdle due to copious amounts of fragmented data. Businesses gather data from their customers in multiple locations, such as ecommerce platforms, CRM, ERP, loyalty programs, payment portals, web apps, mobile apps, and more. Each data set can be structured, semi-structured, or unstructured, delivered as a stream or requiring batch processing, which makes compiling already fragmented customer data even more complex. This has led some organizations to build bespoke solutions, which still only provide a partial view of the customer. Siloed data sets make running operations like customer service, targeted marketing, and advanced analytics—such as churn prediction and recommendations—highly challenging. Only with a 360-degree view of the customer can an organization deeply understand their needs, wants, and requirements, as well as how to satisfy them. A single view of that 360 data is therefore vital for a lasting relationship. In this blog, we’ll walk through how to build a single view of the customer using MongoDB’s database and Cogniflare’s Calleido Customer 360 tool. We’ll also explore a real-world use case focused on sentiment analysis.

Building a single view with Calleido's Customer 360

With a Customer 360 database, organizations can access and analyze various individual interactions and touchpoints to build a holistic view of the customer. This is achieved by acquiring data from a number of disparate sources. However, routing and transforming this data is a complex and time-consuming process, and many existing big data tools aren’t compatible with cloud environments. These challenges inspired Cogniflare to create Calleido.
Figure 1: Calleido Customer 360 Use Case Architecture

Calleido is a data processing platform built on top of battle-tested open source tools such as Apache NiFi. Calleido comes with over 300 processors to move structured and unstructured data from and to anywhere. It facilitates batch and real-time updates and handles simple data transformations. Critically, Calleido seamlessly integrates with Google Cloud and offers one-click deployment. It uses Google Kubernetes Engine to scale up and down based on demand, and provides an intuitive and slick low-code development environment.

Figure 2: Calleido Data Pipeline to Copy Customers From PostgreSQL to MongoDB

A real-world use case: Sentiment analysis of customer emails

To demonstrate the power of Cogniflare’s Calleido, MongoDB Atlas, and the Customer 360 view, consider the use case of conducting sentiment analysis on customer emails. To streamline the build of a Customer 360 database, the team at Cogniflare created flow templates for implementing data pipelines in seconds. In the upcoming sections, we’ll walk through some of the most common data movement patterns for this Customer 360 use case and showcase a sample dashboard.

Figure 3: Sample Customer Dashboard

The flow commences with a processor pulling IMAP messages from an email server (ConsumeIMAP). Each new email that arrives in the chosen inbox (e.g., customer service) triggers an event. Next, the process extracts email headers to determine topline details about the email content (ExtractEmailHeaders). Using the sender’s email, Calleido identifies the customer (UpdateAttribute) and extracts the full email body by executing a script (ExecuteScript). Now, with all the data collected, a message payload is prepared and published through Google Cloud Platform (GCP) Pub/Sub (Kafka can also be used) for consumption by downstream flows and other services.
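To make the flow above concrete, here is an illustrative Python sketch (not Calleido itself) of what those processors do: parse a raw email, pull the topline headers, identify the customer by sender address, and assemble the payload that would be published to Pub/Sub. The field names and sample email are made up for illustration.

```python
import email
import json
from email import policy

# A hypothetical raw customer-service email.
RAW_EMAIL = (
    "From: jane.doe@example.com\r\n"
    "To: support@example.com\r\n"
    "Subject: Broken zipper on order #1042\r\n"
    "\r\n"
    "Hi, the zipper broke after two days. Very disappointed.\r\n"
)

def build_payload(raw: str) -> dict:
    """Mimics ExtractEmailHeaders + ExecuteScript: headers and body to a payload."""
    msg = email.message_from_string(raw, policy=policy.default)
    return {
        "customer_email": str(msg["From"]),   # used to look up the customer
        "subject": str(msg["Subject"]),       # topline detail from the headers
        "body": msg.get_content().strip(),    # full email body for analysis
    }

payload = build_payload(RAW_EMAIL)
message = json.dumps(payload)  # what would be published to GCP Pub/Sub
```

Downstream consumers then only need to parse the JSON payload rather than re-handle raw MIME messages.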
Figure 4: Translating Emails to Cloud PubSub Messages

The GCP Pub/Sub messages from the previous flow are then consumed (ConsumeGCPPubSub). This is where the power of the MongoDB Atlas integration comes in, as we verify each sender in the MongoDB database (GetMongo). If a customer exists in our system, we pass the email data to the next flow. Other emails are ignored.

Figure 5: Validating Customer Email with MongoDB and Calleido

Analysis of the email body copy is then conducted. For this flow, we use a processor to prepare a request body, which is then sent to Google Cloud Natural Language AI to assess the tone and sentiment of the message. The results from the language processing API then go straight to MongoDB Atlas so they can be pulled through into the dashboard.

Figure 6: Making Cloud AutoML Call with Calleido

End result in the dashboard: The Customer 360 database can be used in internal back-office systems to supplement and inform customer support. With a single view, it’s quicker and more effective to troubleshoot issues, handle returns, and resolve complaints. Leveraging information from previous client conversations ensures each customer is given the most appropriate and effective response. These data sets can then be fed into analytics systems to generate learnings and optimizations, such as associating negative sentiment with churn rate.

How MongoDB's document database helps

In the example above, Calleido takes care of copying and routing data from the business source system into MongoDB Atlas, the operational data store (ODS). Thanks to MongoDB’s flexible data structure, we can transfer data in its original format and subsequently implement necessary schema transformations in an iterative manner. There is no need to run complex schema migrations. This allows for the quick delivery of a single view database.
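The iterative schema transformation described above can be sketched in plain Python: flat rows from a relational join (as the SQL step would produce) are reshaped into a single nested customer document for MongoDB. The row and field names here are hypothetical, purely for illustration.

```python
# Flat rows, as a SQL join of customers and orders would return them.
flat_rows = [
    {"customer_id": 1, "name": "Jane Doe", "order_id": 501, "total": 99.0},
    {"customer_id": 1, "name": "Jane Doe", "order_id": 502, "total": 15.5},
]

def to_customer_doc(rows: list) -> dict:
    """Collapse joined rows for one customer into one nested document."""
    first = rows[0]
    return {
        "_id": first["customer_id"],
        "name": first["name"],
        # Related data embedded as nested documents: no join at read time.
        "orders": [
            {"order_id": r["order_id"], "total": r["total"]} for r in rows
        ],
    }

doc = to_customer_doc(flat_rows)
```

Because the document keeps its source shape, later phases can standardize the schema incrementally instead of running an up-front migration.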
Figures 7 & 8: Calleido Data Pipelines to Copy Products and Orders From PostgreSQL to MongoDB Atlas

Calleido allows us to make this transition in just a few simple steps. The tool runs a custom SQL query (ExecuteSQL) that joins all the required data from the outer tables and compiles the results in order to parallelize the processing. The data arrives in Avro format; Calleido then converts it into JSON (ConvertAvroToJSON) and transforms it to the schema designed for MongoDB (JoltTransformJSON). End result in the Customer 360 dashboard: MongoDB Atlas is the market-leading choice for the Customer 360 database. Here are the core reasons for its world-class standard:

- MongoDB can efficiently handle non-standardized schemas coming from legacy systems and efficiently store any custom attributes.
- Data models can include all the related data as nested documents. Unlike SQL databases, MongoDB avoids complicated join queries, which are difficult to write and not performant.
- MongoDB is rapid. The current view of a customer can be served in milliseconds without the need to introduce a caching layer.
- The MongoDB flexible schema model enables agility with an iterative approach. In the initial extraction, the data can be copied nearly exactly in its original shape. This drastically reduces latency. In subsequent phases, the schema can be standardized and the quality of the data can be improved without complex SQL migrations.
- MongoDB can store dozens of terabytes of data across multiple data centers and easily scale horizontally. Data can be shared across multiple regions to help navigate compliance requirements. Separate analytics nodes can be set up to avoid impacting performance of production systems.
- MongoDB has a proven record of acting as a single view database, with legacy and large organizations up and running with prototypes in two weeks and into production within a business quarter.
- MongoDB Atlas can autoscale out of the box, reducing costs and handling traffic peaks.
- Data can be encrypted both in transit and at rest, helping to accomplish compliance with security and privacy standards, including GDPR, HIPAA, PCI DSS, and FERPA.

Upselling the customer: Product recommendations

Upselling customers is a key part of modern business, but the secret to doing it successfully is that it’s less about selling and more about educating. It’s about using data to identify where the customer is in the customer journey, what they may need, and which product or service can meet that need. Using a customer’s purchase history, Calleido can help prepare product recommendations by routing data to the appropriate tools, such as BigQuery ML. These recommendations can then be promoted through the call center and marketing teams for both online and mobile app recommendations. There are two flows to achieve this: preparing training data and generating recommendations.

Preparing training data

First, data is transferred from PostgreSQL to BigQuery using the ExecuteSQL processor. The data pipeline is scheduled to execute periodically. In the next step, the appropriate data is fetched from PostgreSQL and divided into 1K-row chunks with the ExecuteSQLRecord processor. These files are then passed to the next processor, which uses load balancing to utilize all available nodes. All that data then gets inserted into a BigQuery table using the PutBigQueryStreaming processor.

Figure 9: Copying Data from PostgreSQL to BigQuery with Calleido

Generating product recommendations

Next, we move on to generating product recommendations. First, you must purchase BigQuery capacity slots, which offer the most affordable way to take advantage of BigQuery ML features. Here, Calleido invokes an SQL procedure with the ExecuteSQL processor, then ensures that the requested BigQuery capacity is ready to use.
The next processor (ExecuteSQL) executes an SQL query responsible for creating and training the matrix factorization ML model using the data copied by the first flow. Next in the queue, Calleido uses the ExecuteSQL processor to query our trained model to acquire all the predictions and store them in a dedicated BigQuery table. Finally, the Wait processor waits for both capacity slots to be removed, as they are no longer required.

Figures 10 & 11: Generating Product Recommendations with Calleido

Then, we remove old recommendations with two processors. First, the ReplaceText processor updates the content of incoming flow files, setting the query body. This is then used by the DeleteMongo processor to perform the removal action.

Figure 12: Remove Old Recommendations

The whole flow ends with copying recommendations to MongoDB. The ExecuteSQL processor fetches and aggregates the top 10 recommendations per user, all in chunks of 1K rows. Then, the following two processors (ConvertAvroToJSON and ExecuteScript) prepare the data to be inserted into the MongoDB collection by the PutMongoRecord processor.

Figure 13: Copy Recommendations to MongoDB

End result in the Customer 360 dashboard (the data used in this example is autogenerated):

Benefits of Calleido's 360 customer database on MongoDB Atlas

Once the data is available in a centralized operational data store like MongoDB, Calleido can be used to sync it with an analytics data store such as Google BigQuery.
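Before moving on, the “top 10 recommendations per user” selection from the recommendation flow above can be sketched in Python: group the model’s predictions by user, sort by predicted score, and keep the top N per user before they are written to MongoDB. The data and field names here are invented for illustration; the real flow does this in SQL.

```python
from collections import defaultdict

TOP_N = 10

# Hypothetical model output: one prediction per (user, product) pair.
predictions = [
    {"user_id": "u1", "product_id": f"p{i}", "score": i / 100}
    for i in range(25)
]

def top_recommendations(preds: list, n: int = TOP_N) -> dict:
    """Keep the n highest-scoring predictions for each user."""
    by_user = defaultdict(list)
    for p in preds:
        by_user[p["user_id"]].append(p)
    return {
        user: sorted(items, key=lambda p: p["score"], reverse=True)[:n]
        for user, items in by_user.items()
    }

recs = top_recommendations(predictions)
```

Capping the list per user keeps the MongoDB collection small and read-optimized: the dashboard only ever needs the handful of best candidates per customer.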
Thanks to the Customer 360 database, internal stakeholders can then use the data to:

- Improve customer satisfaction through segmentation and targeted marketing
- Accurately and easily access compliance audits
- Build demand planning forecasts and analyses of market trends
- Reward customer loyalty and reduce churn

Ultimately, a single view of the customer enables organizations to deliver the right message to prospective buyers, funnel those at the brand awareness stage into the conversion stage, and ensure retention and post-sales mechanics are working effectively. Historically, building a 360 view of the customer was a complex and fragmented process, but with Cogniflare’s Calleido and MongoDB Atlas, a Customer 360 database has become the most powerful and cost-efficient data management stack that an organization can harness.

October 14, 2021

MongoDB Employees Share Their Coming Out Stories: (Inter)National Coming Out Day 2021

National Coming Out Day is celebrated annually on October 11 and is widely recognized in the United States. MongoDB proudly supports and embraces the LGBTQIA+ community across the globe, so we’ve reimagined this celebration as (Inter)National Coming Out Day. In our yearly tradition of honoring (Inter)National Coming Out Day, we asked employees who are members of the LGBTQIA+ community to share their coming out experiences. These are their stories. Jamie Ivanov , Escalation Manager For as long as I can remember, I always wanted to play with dolls and felt closer to my female cousins. This was rather difficult for someone who is a male at birth being brought up in a fairly conservative family. At a young age, I knew that I was different but lacked a way to describe it. I certainly didn't have the support I needed, so I was brought up as a male. My father went out of his way to “make a man out of me” and toughen me up in ways that weren't exactly the most productive. Going through school, I still knew that I was different because I kept feeling attracted to both genders, but I was too afraid to admit to it. I found a youth group for LGBT teenagers that gave me a safe place to be myself and admit to people who I really was. Outside of that group was still pretty scary; I knew that I had to be straight or I would risk being beaten up or harassed, so I tried to push my queerness aside. In my 30s, after going through the Army and having three children, I realized that I couldn't keep pretending anymore -- who I was wasn't the true me. I started telling people that I was bisexual and hoping that they wouldn't see me as less of a person. Most of the responses I received were "yeah, we kinda figured.” Having that weight off of my shoulders was immensely relieving but something still wasn't quite right; while admitting that helped explain who I was interested in, it still didn't explain who I was. 
Through a series of fortunate unfortunate events, a lot of the facade I had built up for so many years came down, and I realized that who I was didn't match the body that I was given. It was terrifying to talk to anyone about how I was feeling or who I was, but I finally told people that I am a transgender woman. It was one of the scariest things that I have ever done. Some people didn't understand, and I did lose some family over it, but most people accepted me for who I am with open arms! Since being true to myself, more weight has been lifted off of me, and my only regret is not having the resources and courage to admit who I really was years and years ago. Since I've come out as bi/pansexual and a transgender woman, I've built stronger relationships and felt much more comfortable with myself, even to the point of liking photos of myself (something I've always hated, and I realized that was because it wasn't the real me). When a MongoDB recruiter reached out to me, I asked him the same question I asked other recruiters: “How LGBT-friendly is MongoDB (with an emphasis on the transgender part)?” The response I got back from my technical recruiter Bryan Spears was the best response I had received from ANY recruiter or company and was the deciding factor in why I chose to work at MongoDB. Here’s what he said: “MongoDB is a company that truly does its best to follow our values like embracing the power of differences; we have many employees who identify as LGBTQ+ or are allies of the LGBTQ+ community. We also have two ERGs, MongoDB Queeries and UGT (Underrepresented Genders in Tech), which both aim to create and maintain a safe environment for those identifying as LGBTQ+ or questioning. From a benefits standpoint, we have expanded the amount of WPATH Standards of Care services available for people who identify as Transgender, Gender Nonconforming, or Transsexual through Cigna.
While I know none of the information I have shared tells you what life is like at MongoDB, I hope that it shows we are doing our best to make sure that everyone feels respected and welcome here.” I didn't always have the support I needed to be myself at some previous jobs, but MongoDB has raised the bar to a level that is hard to compete with. I'm happy to finally find a place that truly accepts me for who I am.

Ryan Francis, VP of Global Demand Generation & Field Marketing

Growing up in the 90s in what I used to call “the buckle of the Bible Belt,” I did not believe coming out was in the cards. In fact, I would sit up at night to devise my grand escape to New York City after being disowned (how I planned on paying for said escape remains unknown). I was, however, out to my best friend, Maha. During the summer between my Sophomore and Junior years of high school, I spent time with her family in Egypt. On the return trip, I bought a copy of The Advocate to learn about the big gay life that awaited me after my great escape. Later that month, my mother stumbled upon that magazine when she was cleaning the house. She waited six months to bring it up, but one day in January sat me down in the living room and asked, “Are you gay?” I paused for a moment and said… “yup.” She started crying and thanked me for being honest with her. A month later, she picked up a rainbow coffee mug at a yard sale and has been Mrs. PFLAG ever since, organizing pride rallies in our little Indiana hometown and sitting on the Episcopal church vestry this year in order to push through our parish’s blessing of same-sex marriage. Needless to say, I didn’t have to escape. My father was also unequivocally accepting. This is a good thing because my sister Lindsay is a lesbian, so they sure would have had a tough time given 100% of their kids turned out gay.
Lindsay is the real hero here who stayed in our homeland to raise her children with her wife, changing minds every day so that, hopefully, there will be fewer and fewer kids who actually have to make that great escape.

Angie Byron, Principal Community Manager

Growing up in the Midwest in the 80s and 90s, I was always a “tomboy”; as a young kid, I gravitated to toys like Transformers and He-Man and refused to wear pink or dresses. Since we tended to have a lot in common, most of my best friends growing up were boys; I tended to feel awkward and shy around girls and didn’t really understand why at the time. I was also raised both Catholic and Bahá’í, which led to a very interesting mix of perspectives. While both religions have vastly different belief and value systems, the one thing they could agree on was that homosexuality was wrong (“intrinsically immoral and contrary to the natural law” in the case of Catholicism, and “an affliction that should be overcome” in the case of Bahá’í). Additionally, being “out” as queer at that time in that part of the United States would generally get you made fun of, if not the everlasting crap kicked out of you, so finding other queer people felt nearly impossible. As a result, I was in strong denial about who I was for most of my childhood and made several valiant but ultimately failed attempts at the whole “trying to date guys” thing as a teenager (I liked guys just fine as friends, but when it came to kissing and stuff it was just, er… no.). In the end, I came to the reluctant realization that I must be a lesbian. I knew no other queer people in my life, and so was grappling with this reality alone, feeling very isolated and depressed. So, I threw myself into music and started to find progressively more and more feminist/queer punk bands whose songs resonated with my experiences and what I was feeling: Bikini Kill, Team Dresch, The Need, Sleater-Kinney, and so on.
I came out to my parents toward the end of junior high, quite by accident. Even though I had no concrete plan for doing so, I always figured Mom would be the more accepting one, given that she was Bahá’í (a religion whose basic premise is the unity of religions and equality of humanity), and I’d have to work on Dad for a bit, since he was raised Catholic and came from a family with more conservative values from an even smaller town in the Midwest. Imagine my surprise when one day, Mom and I were watching Ricki Lake or Sally Jessy Raphael or one of those daytime talk shows. The topic was something like “HELP! I think my son might be gay!” My mom said something off-handed like “Wow, I don’t know what I would do if one of you came out to me as gay...” And, in true 15-year-old angsty fashion, I said, “Oh YEAH? Well you better FIGURE IT OUT because I AM!” and ran into my room and slammed the door. I remember Mom being devastated, wondering what she did wrong as a parent, and so on. I told her, truly, nothing. My parents were both great parents; home was my sanctuary from bullying at school, and my siblings and I were otherwise accepted exactly as we were, tomboys or otherwise. After we’d finished talking, she told me that I had better go tell my father, so I begrudgingly went downstairs. “Dad… I’m gay.” Instead of a lecture or expressing disdain, he just said, “Oh really? I run a gay support group at your Junior High!” and I was totally mind blown. Bizarro world. He was the social worker at my school, so this makes sense, but it was the exact opposite reaction that I was expecting. An important life lesson in not prejudging people. When I moved on to high school, we got… drumroll… the Internet. Here things take a much happier turn. Through my music, I was able to find a small community of fellow queers (known as Chainsaw), including a ton of us from various places in the Midwest.
I was able to learn that I was NOT a freak, I was NOT alone, there were SO many other folks who felt the exact same way, and they were all super rad! We would have long talks into the night, support each other through hardships, and more than a few of us met each other in person and hung out in “real life.” Finding that community truly saved my life, and the lives of so many others. (Side note: This is also how I got into tech, because the chat room was essentially one gaping XSS vulnerability, and I taught myself HTML by typing various tags in and seeing how they rendered.) I never explicitly came out to anyone in my hometown. I was too scared to lose important relationships (it turns out I chose my friends well, and they were all completely fine with it, but the prospect of further isolating myself as a teenager was too terrifying at the time). Because of that, when I moved to a whole new country (Canada) and went to college, the very first thing I did on my first day was introduce myself as “Hi, I’m Angie. I’ve been building websites for fun for a couple of years. Also, I’m queer, so if you’re gonna have a problem with that, it’s probably best we get it out of the way now so we don’t waste each other’s time.” Flash forward to today: my Mom is my biggest supporter, has rainbow stickers all over her car, and has gone to dozens of Pride events. Hacking together HTML snippets in a chat room led to a full-blown career in tech. I gleaned a bit more specificity around my identity and now identify as a homoromantic asexual. Many of those folks I met online as a teenager have become life-long friends. And, I work for a company that embraces people for who they are and celebrates our differences. Life is good.

Learn more about Diversity & Inclusion at MongoDB. Interested in joining MongoDB? We have several open roles on our teams across the globe and would love for you to transform your career with us!

October 11, 2021

Get Started in No-Code App Development with Unqork and MongoDB

Classic application development can be a long-haul process. Even using Agile methodologies, teams of developers still work to build, test, and launch applications in a series of sprints, using many of the same programming languages and approaches as they have for years. That was until the concept of low- and no-code app development arrived. Aiming to cut development cycles and empower non-developers, no-code in particular is gaining traction as a go-to-market strategy for modern apps. No-code is an alternative to traditional application and software development processes, allowing apps to be built through a simplified, interactive UI rather than by writing code from scratch. While the concept of no-code tools has been around for a while (from Visual Basic to WYSIWYG web editors like Wix and “super spreadsheets” like Airtable), newer no-code-based application development platforms take the concept a step further, dramatically simplifying the application development process itself. In the booming world of no-code, one company stands above the rest: Unqork. Unqork invented the first completely visual, no-code application platform to help enterprises build custom software faster, with higher quality and lower costs than traditional approaches, all without a single line of code. And with Unqork freeing organizations from the traditional constraints of developing with code, it was only natural that the pioneer of no-code application development partnered with MongoDB to also free them from the constraints of database design and management.

Syncing Unqork with MongoDB Atlas

Building out a new no-code application with Unqork is efficient and simple. From the database side of things, MongoDB has everything that a business needs, with functionality to store, manage, and analyze data. MongoDB and Unqork are each functional and powerful in their own capacity, but when used together, an organization can truly improve and innovate its application development experience.
Fig 1: Unqork syncs seamlessly with MongoDB Atlas

Use Case Example: Mortgage Approval Application

Large financial institutions have many separate functions and applications in use. A typical large bank will employ many different applications and microservices, all of which rely on many different data sources. This can quickly lead to complexity for their software infrastructure. The bank must make intelligent choices for their app development to stay lean and agile and not add unnecessary complexity. In this example, a bank decides to build out a new mortgage approval application as part of their loan origination infrastructure, an otherwise complex process that requires a variety of data, from bank information to customer background information, and data enrichment. The bank wants to build a portal where customers can log on, request a certain amount of money, and the application can quickly decide whether or not to approve the mortgage. In theory, the application objective is simple; however, it needs to pull from the greater customer database for data collection and analysis. Let's break down the simple steps of this integration process.

Step 1: Create an Unqork application

Since Unqork is a no-code platform, it’s simple to drag and drop objects into the desired format. For this example, text fields were used for users to input their data into the application. This app is a mortgage approval application, so we wanted to look at the customer’s information as well as how much money they are looking to borrow.

Fig 2: Creating an Unqork application

The creation of the application is a single screen that allows users to define the overall concept and functions of the mortgage approval application, and manage the influx and outflow of data.
Fig 3: Screenshot of our Unqork application for mortgage approval

In Figure 3, the available elements and functions appear on the left side of the screen; they can be dragged into the user screen and configured for execution.

Step 2: Configuring MongoDB Atlas and Realm

While MongoDB is the underlying application data platform for Unqork, it is unlikely that all customer data in a large, multi-department organization like a bank would always reside in Unqork. Therefore, the bank's “Client 360 data layer” (held in a separate MongoDB Atlas instance) needs to be connected to Unqork. Given this, the next step is to get your MongoDB cluster ready to connect to the new Unqork-built mortgage approval application. MongoDB Atlas enables this connection via MongoDB Realm’s HTTP Endpoints (previously named Webhooks). MongoDB Realm is a set of application development services, such as HTTP endpoints, that makes it simple to build best-in-class apps across mobile and the web. Other services include edge-to-cloud data sync, instant GraphQL API, triggers, and functions. To work with MongoDB Realm’s HTTP endpoints, you must first create a Realm app, the central backend instance for your mobile or web application. You can easily create an app using MongoDB’s web GUI, as shown in the figure below.

Fig 4: Creating an application in MongoDB Realm

Step 3: Connect MongoDB Atlas to the Unqork application

Finally, in order to realize the benefits of both MongoDB Atlas and Unqork, we need to connect them to each other. As previously stated, this is achieved through MongoDB Realm’s HTTP endpoints. Both MongoDB Realm and Unqork have easy-to-use interfaces for sending requests over HTTPS, and all we need to do is write the logic for the endpoint itself. By using MongoDB Realm functions, we can access MongoDB Atlas data easily and connect to third-party applications with minimal code.
Fig 5: MongoDB Atlas and Realm integrate with Unqork through an HTTP endpoint

To connect the Unqork application to MongoDB Atlas, we’ll first need to set up our endpoints in MongoDB Realm. This provides us with an endpoint URL that we can use from Unqork, and it is also where we define the HTTP endpoint serverless function that sends and receives data with Unqork. Next, from within Unqork, we navigate to “Administrator Access” and create a connection using the endpoint URL from Realm. Going back to our Unqork application, we can then create a plugin and use the integration that we created before. We have to fill in the correct information pertaining to the webhook, but once done, we’ll have our connection. To learn more about migrating to MongoDB Atlas and what that means for accelerated development, connect with the MongoDB team here. To request a demo of Unqork or learn more about development acceleration with their no-code approach, connect here.

MongoDB & Unqork: No-Code Together

While MongoDB and Unqork individually are powerful platforms, when deployed in tandem they provide a powerful solution to accelerate development. “MongoDB Atlas gives us the ability to run our database on multiple clouds through the same service,” said Unqork Founder and CEO, Gary Hoberman. “With Atlas, we have the freedom from cloud vendor lock-in—each client can choose where they are the most comfortable hosting their data.” Together, development is automated and optimized, allowing for more time on innovation and less time on maintenance and upkeep. For businesses, the no-code movement has many benefits, including freeing them to focus on differentiating projects, like developing apps that customers actually want, rather than maintaining lines of code and waiting for teams of developers to build the traditional way. Together, the two platforms bring:

Accelerated time-to-market: With Unqork, enterprise applications can be developed three times faster.
Teams are empowered to deploy quickly without the traditional code-based approach. Meanwhile, MongoDB, by design, is developer- and DevOps-friendly with its flexible use and automation. Together, time-to-market is decreased as resources can be spread more evenly across teams.

Reduced TCO: With Unqork, the development process requires fewer resources, delivering three times the cost savings in Total Cost of Ownership (TCO) compared with a code- or low-code-based approach. Meanwhile, with MongoDB, the cost of managing and running the database is already taken care of, and the database remains highly available even throughout system updates.

Engaged employees: Unqork provides a collaborative platform that frees technology teams from high-volume development and management tasks so a team can do the best work of their careers. MongoDB allows developers to work in their desired programming language, with idiomatic drivers, flexible data models, and intuitive building.

Unlocked innovation: Because projects can get ramped up quickly with lower costs by using Unqork, the cost of innovation and experimentation is greatly mitigated. Similarly, with MongoDB, creating and developing a database can be done at speeds that drastically beat traditional relational databases.

Both Unqork and MongoDB work with large organizations across some of the world’s most regulated and complex sectors, making security yet another key reason the two became partners. “One of the big things that drew us to MongoDB Atlas over the other DBaaS providers was the security features,” Hoberman added. “When we partner up with a client, we take over and support their most critical processes and data assets.” Unqork keeps all data encrypted in transit and at rest throughout the entire platform, and Atlas allows each Unqork customer to have their own MongoDB instance with their own development and production environments.
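To make the hand-off in Step 3 concrete: the connection ultimately amounts to Unqork POSTing the form fields to the Realm endpoint URL as JSON and reading the decision from the response. Here is a minimal sketch of that exchange from any HTTPS client; the URL and field names are placeholder assumptions, not real values.

```javascript
// Placeholder endpoint URL; the real one comes from the Realm HTTP endpoint setup.
const ENDPOINT_URL = "https://example.mongodb-realm-endpoint.invalid/mortgage";

// Build the request options that a client (or, conceptually, Unqork's plugin)
// would send to the endpoint. Field names are illustrative.
function buildMortgageRequest(fields) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(fields),
  };
}

const options = buildMortgageRequest({
  name: "Jane Doe",
  annualIncome: 90000,
  requestedAmount: 300000,
});

// The actual call would then be:
//   const res = await fetch(ENDPOINT_URL, options);
//   const { approved } = await res.json();
```

On the Unqork side, this request/response shape is what the plugin configuration encodes; no client code has to be written by hand.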

October 7, 2021

Honoring Hispanic Heritage Month

We’re honoring Hispanic Heritage Month (September 15 to October 15) in a few ways here at MongoDB! First, hear from three MongoDB employees about their own experiences and what this month means to them. Then, keep scrolling for a Spotify playlist, reading list, and movie list curated by members of our affinity group the Underrepresented People of Color Network (TUPOC).

Alicia Raymond, Director, HR Business Partner (Core & Cloud), New York City

At 18 years old, and without knowing a word of English, my mother left behind her entire family in Chile to come to the United States. This was in 1973, shortly before the dictator Augusto Pinochet came into power. The following years in Chile were tumultuous, and my mother, who was now married to a U.S. military member, relocated frequently. Over time, she lost contact with her family in Chile. Years later, I was a college student at the University of North Carolina at Chapel Hill on a Morehead-Cain scholarship. The scholarship allowed me to take part in various summer activities, including a summer of studying abroad. Chile was on the list of countries where I could study, so I jumped at the opportunity to go there and find my family. As soon as the plane touched down, I began searching for traces of my family members. This was before the prevalence of social media, so I spent a lot of time sifting through phone books. Finally, I was able to locate a phone number for my mother’s younger sister, Esther, but I didn’t call her right away. I was anxious about how I would fit in with my Chilean relatives. My identity as Latina had always felt a bit nebulous — a common feeling among multiracial, multicultural people and second-generation immigrants. I was Spanglish-speaking and white-passing, and I had not grown up among a Latinx community in the U.S.
At the time, I struggled to feel like part of the Latinx community, but I also felt a deep obligation not to abandon the complex mix of identities I inherited from my mother — a mix we are still learning about today. Until recently, she didn’t know she was almost half Indigenous American — a detail her parents hid to improve their chances of integrating into the middle class of Chilean society.

Alicia with her mother and aunts from Chile in New York City

Eventually, I worked up the courage to make the call. After a few rings of the phone, someone picked up on the other end. I confirmed that it was Esther and then, in broken Spanish, I explained who I was and that I was in Chile. Esther’s excitement melted away all of my concerns. We scheduled a time to meet in person that week, and we have remained in contact ever since. After re-establishing and maintaining contact with my Chilean family, my bonds with my Chilean heritage strengthened. Although my cultural identity still feels complicated, within that complexity lies an incredible blessing. It has given me the opportunity to navigate multiple worlds and be shaped by varied perspectives and communities. That’s not to imply that those identities always meshed in a frictionless way — my father’s parents almost disowned him for marrying my Latina mother — but even that friction helped expand my view of the world. In a career context, this has allowed me to be highly adaptable to new circumstances, adept at perspective-taking, and flexible enough in my own beliefs to understand others’ viewpoints. Those skills are essential for my role as an HR Business Partner, where the issues I face often involve multiple stakeholders, rarely have one right answer, and require a big dollop of creative problem-solving. I am eternally grateful for the multifaceted lens my cultural background has provided me.
Alicia's mother as a child, outside the house she grew up in

Gustavo Chavez, Senior Solutions Architect, Austin

Hispanic Heritage Month is not just a month; it’s a lifestyle! I’m originally from a small town in Mexico and was raised all over the state of Chihuahua. Growing up, I was always fascinated by airplanes and technology, and when I reached high school I had the opportunity to start learning computer programming. My friend’s father owned a payroll-processing company, and he started teaching RPG and COBOL on an IBM System 34 (yeah, I know, I’m dating myself) during the afternoons, so I would go there two or three times a week. This is where my passion for computers and technology really grew and led me to pursue a degree in computer science. After graduating, I began working at a local startup doing offshore work for a mainframe application performance-monitoring company located in Santa Monica, California. The company, Candle Corp, then offered me the opportunity to work for them in the U.S., so my wife and I packed our things in a U-Haul and drove 900 miles west to Los Angeles! IBM acquired Candle Corp in the mid-2000s, which led me to Austin, Texas. After a few years, I had the opportunity to join MongoDB. Diversity is celebrated here, and we all work together toward a common goal while having fun along the way. In my role as a Senior Solutions Architect, I support the LATAM Corporate Sales organization and help align MongoDB technology with customer needs and business goals. My children were born in Los Angeles, where, as an immigrant, I started thinking about my role as a parent in preserving Hispanic language and culture for the next generation. Luckily, it wasn’t too difficult given our location. The shared history between Mexico and the U.S. provides the perfect canvas to paint a picture of blended colors and influences from other places. This is apparent all across Texas and the southwest of our country.
The food, architecture, names, battles, and social struggle through the years help build the foundation of what it means to be of Hispanic descent in the United States. We are embedded in the fabric of the region and country, and that is what we aim to share with everybody — our common bonds instead of our differences. Today, as the proud father of two young adults attending university, I can honestly say the job is not done. We still have other generations to share our culture and heritage with. I hope we can ensure that future generations are proud of being Hispanic and proud of the contributions made by members of the Hispanic community to the United States.

Gustavo and his family

Camilo Velez-Gordon, Field Marketing Specialist, New York City

In 2003, my mom and I hopped on a one-way flight from Colombia to Newark International Airport with four suitcases and a lot of unknowns. As a 7-year-old with minimal knowledge of the English language, I had no idea what it meant for me or my future, and I was terrified. My family and I quickly settled in northern New Jersey, and I learned English in less than a year thanks to cartoons and shows such as Rocket Power and Drake and Josh. Throughout my upbringing, I learned that two things will always be true: Family is and always will be an important part of my life, and in the United States you are in control of your destiny, which may not be the case elsewhere. The older I get, the more significance Hispanic Heritage Month has in my life. This may be due to a deeper understanding of the importance of culture and my background. The month is a great opportunity to reflect on my journey to where I am today, and also a good time to educate the people around me about what it is like to be Latino in today’s America. The tech industry has always been fascinating to me, but, while in school, a career in tech always seemed like a far-fetched goal.
Through my network, I was fortunate enough to secure a marketing internship for an ad-tech firm while finishing my senior year as a business student at Montclair State University. Once I got my foot in the door, I was determined to take full advantage of the opportunity. To this day, my main takeaway from the process of getting into tech is that mastering the skill of networking will open many doors in your career. As I approach my two-year anniversary at MongoDB, I frequently look back on my journey to where I am today, and I can’t help but smile. The terrified 7-year-old from 17 years ago came a long way. At MongoDB, I continue to grow, evolve, and learn. During my tenure, I have met incredible people, achieved many milestones, and launched multiple global programs that have had a positive impact on the business. I am so proud of how far my family and I have come, and I could not be more excited for what is to come for MongoDB.

Camilo and his family

Celebrate the Hispanic and Latinx community's contributions to music, literature, and film

Spotify playlist

Reading list:
The House on Mango Street by Sandra Cisneros
I Am Not Your Perfect Mexican Daughter by Erika L. Sanchez
The Brief Wondrous Life of Oscar Wao by Junot Diaz
Dominicana: A Novel by Angie Cruz
War Against All Puerto Ricans: Revolution and Terror in America's Colony by Nelson A. Denis
Latinx Superheroes in Mainstream Comics by Frederick Luis Aldama
Empire's Workshop: Latin America, the United States, and the Rise of the New Imperialism by Greg Grandin
Borderlands/La Frontera: The New Mestiza by Gloria Anzaldúa
The Borders of Dominicanidad by Lorgia Garcia-Peña
The Battle for Paradise: Puerto Rico Takes on the Disaster Capitalists by Naomi Klein
The Arawak: The History and Legacy of the Indigenous Natives in South America and the Caribbean by Charles River Editors
The Indian Chronicles by José Barreiro
Eva Luna by Isabel Allende
The Bronx by Evelyn Gonzalez
Barrio Dreams: Puerto Ricans, Latinos, and the Neoliberal City by Arlene Dávila
Bodega Dreams by Ernesto Quiñonez
The Eagle's Throne by Carlos Fuentes
The Poet X by Elizabeth Acevedo
When I Was Puerto Rican: A Memoir

October 4, 2021

MongoDB is a Crain's Best Place to Work in NYC for the Fifth Year in a Row

We’re thrilled to announce that MongoDB has made Crain’s 2021 Best Places to Work in New York City list. This is the fifth year in a row that we’ve ranked among Crain’s top 100 companies in New York City, coming in at #29 for 2021. Among large companies specifically, MongoDB ranks #14 out of 47. At MongoDB, we are passionate about our mission of freeing the genius within everyone by making data stunningly easy to work with. This means enabling each individual to pursue their vision, whether they are a developer using our products or an employee. At MongoDB, if you have an idea, you get the trust from leadership and autonomy to run with it while excelling in your role. Every employee can see the direct impact they have on the business and product, as well as the inclusive culture we are building. To drive the personal growth and business impact of our employees, we have committed to developing an open, supportive, and enriching environment for everyone. From meditation sessions and yoga classes to fertility assistance and a generous parental leave policy — the opportunity to make an impact at MongoDB is real and we want to support all of our employees in that journey. It’s important for us to embody our company values, especially when it comes to “Embracing the Power of Differences.” One way we promote this is through our affinity groups, which support our larger commitment to an inclusive community. Our affinity groups provide a collaborative space for employees to mentor and connect with one another through a common interest or identity. In collaboration with our affinity groups, MongoDB supported organizations fighting for racial justice and equal opportunity through a fundraising campaign in 2020. MongoDB pledged $250,000, and through combined efforts with employees and outside contributors, we donated over $330,000 to organizations fighting for justice.
While employees have worked from home during COVID-19, we’ve provided telehealth options, mental health support, emergency care leave, company-wide days off, and initiatives to increase social connectivity in a virtual environment. As employees begin to return to our offices, employee health and wellbeing, happiness, and success are of utmost importance to us. We are always striving to make sure that MongoDB is a great place to work for everyone.

Hear from some of our New York City employees

Marissa Jasso, Product Marketing Manager

“As a Latina and Native American in the tech industry, it’s not often I come across a company that makes a consistent effort to ensure all members feel included. To me, that’s a real unicorn company. MongoDB is a deep breath. It’s the relief of knowing that every day, I can bring my whole identity to work.”

Paige Jornlin, Manager, Customer Success

“MongoDB has an incredible culture. Not only do we have an amazing team that makes me excited to come to work each day, but there are countless growth opportunities and our leaders show so much care for their people. It's truly special. MongoDB is also deeply committed to embracing differences. Without having such a diverse team, we wouldn’t be able to innovate, challenge the norm, or think about different ways of doing things as much as we do.”

Blake Deakin, Area VP, Technical Services

“We have the opportunity to solve really big, really interesting problems for our customers. There’s a good chance you’ll work on something, see it in the news, and then say, ‘Hey! I helped make that happen.’ For me, that’s one of the most gratifying things about working here.”

Interested in joining MongoDB? We have several open roles on our teams across the globe and would love for you to transform your career with us!

September 30, 2021

Three Tips for Writing Your Customer Success Resume

So you’re interested in joining our Customer Success team. We’re thrilled you’ve identified MongoDB as a potential next step in your career! Before applying to one of our many open positions globally, take some time to read through these top three tips for what our Customer Success recruiters look for when reviewing applications.

Less is more

I’m a big fan of keeping resumes simple. Fun graphics, fonts, colors, and styles may be eye-catching, but at times they can also be distracting. Resumes that are easy to follow get noticed the most and showcase your expertise the best. If a resume’s format is hard to follow or set in a hard-to-read font, a recruiter or hiring manager may miss some key points about your experience that you wanted to highlight. Consider how you will organize the information you are presenting and which accomplishments you will highlight. This can often be an indicator of how you would present information to a customer! It’s a good idea to stick to a format where company names are highlighted in a larger font, with short bullet points underneath summarizing your day-to-day responsibilities and accomplishments. Focus on your most recent role or the experience that is the closest fit for the role you are applying to.

Identify keywords and details in the job description

Resumes that highlight key responsibilities and skills listed in the job description really stand out! We do not expect candidates to have all the skills required for our positions, since Customer Success can vary across industries and organizations, but be sure to highlight the relevant skills you can bring to MongoDB. It’s also great to showcase that you have a willingness to learn our technology, are coachable, and have progressed in your career. Try to tell a story with your resume and show the Recruiting team the skills you have gained through your past experiences.
If you’ve spent any spare time upskilling, I recommend highlighting it to show your enthusiasm for learning and self-development. If you have been involved in any projects outside your core role that you think may be of interest to the team, you can add those too!

Highlight details that showcase your experiences

My final tip for applying to a role in Customer Success is to share the right details on your resume. Day-to-day responsibilities are great to list, but also make sure to share important details such as what type of customers you work with (SMB vs. Enterprise, for example), how many customers you manage in your portfolio, the region or industry you support, your current KPIs, and any significant achievements in your current or previous roles. We have several teams within Customer Success at MongoDB focusing on different types of customers and regions, so this will help us quickly identify which team your experience aligns with best.

These are just a few useful tips from my experience hiring for our Customer Success teams here at MongoDB. Be sure to check out our website to learn more about our Global Customer Success program and view our open roles. We hope to see you in the interview process soon! Interested in pursuing a career at MongoDB? We have several open roles on our teams across the globe and would love for you to transform your career with us!

September 30, 2021

Accelerate App Delivery with Cognizant's Next Gen Continuous Integrator

The phrase “digital transformation” is ubiquitous these days. But what does it actually mean? Often, the heart of a successful digital transformation lies in a company’s ability to migrate data from legacy systems to the cloud for easier access while also updating relevant applications quickly and seamlessly to deliver the most optimal experience to customers and end users. To accomplish this modernization journey, software teams have embraced agile practices founded on the pillars of collaboration, flexibility, and continuous improvement and delivery.

This is where continuous integration and continuous delivery (CI/CD) comes in. CI/CD is an agile practice that enables developers to implement small changes on a frequent basis. Conducted in an automated and secure environment that maintains version control, CI/CD not only accelerates app delivery, it also reduces infrastructure costs, minimizes security risks, and allows for zero-downtime deployments and easier rollbacks. The only problem? Many organizations lack the in-house expertise and resources needed to develop an agile CI/CD solution.

How Cognizant's Next Generation Integrator can help

The “Next Generation Continuous Integrator” is a UI-based, multi-component migration tool which utilizes a template-driven approach for migrating apps from existing CI/CD systems as well as onboarding new apps to Concourse. The technology is built on two pillars: templates and containers. A template-driven approach is environment-agnostic (i.e., it can run on private clouds, public clouds, hybrid clouds, or non-cloud environments), which is especially valuable when handling numerous multi-tiered and multi-technology apps. A container is a standard unit of software that packages up code and all its dependencies and can be published to a Central Hub (public cloud) or a Customer Specific Hub (private). Containers can be used by various teams and applications to run applications reliably across computing environments.
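As a concrete (and purely hypothetical) sketch of the template-driven idea, the short function below merges a reusable infrastructure template with the job configuration a user might select in the tool's UI. The names here (`build_pipeline_job`, `source_ci`, the template shape) are invented for illustration and are not the tool's actual API:

```python
# Hypothetical sketch of a template-driven job builder; all names are
# invented for illustration and do not reflect the real tool's API.

def build_pipeline_job(template: dict, user_config: dict) -> dict:
    """Merge a reusable infrastructure template with the job configuration
    a user selects in the UI, yielding a target pipeline job spec that is
    independent of where it will run."""
    job = dict(template.get("defaults", {}))  # start from the template's defaults
    job.update(user_config)                   # user-selected settings override them
    job["source_ci"] = template["source_ci"]  # record the system being migrated from
    return job

# One shared template can serve many Jenkins-sourced apps:
jenkins_template = {
    "source_ci": "jenkins",
    "defaults": {"trigger": "on-commit", "workers": 1},
}

job = build_pipeline_job(jenkins_template, {"app": "billing-service", "workers": 4})
```

Because the template carries the environment-independent defaults and each app supplies only its own overrides, the same template can be reused across many services and environments, which is what makes the approach environment-agnostic.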
How it works

Based on the various needs of a project, including the services to be migrated or onboarded, the Next Gen Continuous Integrator’s usage lifecycle is as follows:

1. Reusable and shared infrastructure templates are built in association with the source CI/CD DevOps team. These form the core driving artifacts for the tool.
2. The templates are pushed to a GitHub repo and then to MongoDB.
3. The templates are exposed in the tool’s UI for the end user to choose the appropriate job configurations for their target pipelines.
4. The services, according to the chosen configuration, are onboarded or migrated from the source CI/CD system (Jenkins, GoCD, etc.) to Concourse.
5. The migration/onboarding reports are generated and pushed to MongoDB.

Benefits

- Easier and quicker migration: templates allow system-agnostic migration across a broad spectrum of CI/CD tools
- Quicker turnaround time: change requests are easily accommodated at any phase
- Improved quality: processes are standardized faster; for instance, Next Gen Continuous Integrator designs a shared services layer for centralized management of multiple environments and accounts
- Increased savings: projected cost savings of over 70%, as automating the infrastructure frees up internal resources who would otherwise be engaged in infrastructure management

Why MongoDB for the Next Gen Continuous Integrator tool?

To understand why MongoDB is best positioned for the Next Gen Continuous Integrator tool, we first need to walk through the significant challenges of using relational databases (RDBMS) with it.
Challenges include:

- “Infrastructure-as-code” templates are unstructured, as they vary across teams and tools
- The reports the tool generates are dynamic in nature, but reporting schema changes are difficult to make in an inflexible RDBMS
- Migrated source services are interdependent, with multiple responsibilities, and hence can be time-consuming to onboard to Concourse
- The monolithic architecture of an RDBMS affects management, scalability, and continuous deployment

MongoDB, on the other hand, offers significant advantages:

- Flexible schema: handles different data types and can store the different migration templates
- Ease of development: the design supports greater developer productivity, and developers find it intuitive to use
- Rich JSON documents and query language: templates can be queried at a deep hierarchical level
- Portability: with either the container- or template-driven approach, MongoDB can run across any platform

There is no one-size-fits-all solution when it comes to workload migrations or onboarding applications for modern businesses. Next Generation Continuous Integrator can help clients develop a template-driven, containerized migration solution that fits the organization’s specific needs with ease and minimal operations support.

Roadmap and contact details

Based on the scale of adoption and performance analysis of the Next Gen Continuous Integrator tool across multiple tenants, we may move from on-premises MongoDB to cloud-based MongoDB Atlas. Contact for more information.
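To make the "flexible schema" and "deep hierarchical query" points concrete, here is a minimal, hedged sketch. The collection and field names are invented (the tool's real schema isn't shown in this post), the pymongo calls are commented out so the snippet runs without a live server, and the small `matches()` helper mimics MongoDB's dot-notation matching in pure Python:

```python
# Hedged illustration only: "ngci", "templates", and the document shapes
# below are invented. With a real deployment you would uncomment the
# pymongo lines (pip install pymongo) and query with dot notation directly.
# from pymongo import MongoClient
# templates = MongoClient("mongodb://localhost:27017")["ngci"]["templates"]

# Two templates with completely different shapes coexist in one
# collection -- the flexible-schema advantage:
jenkins_tpl = {"source_ci": "jenkins",
               "jobs": {"build": {"steps": ["mvn package", "mvn test"]}}}
gocd_tpl = {"source_ci": "gocd",
            "stages": [{"name": "build", "tasks": ["rake"]}]}
# templates.insert_many([jenkins_tpl, gocd_tpl])

# MongoDB reaches deep into the hierarchy with dot notation, e.g.
# templates.find_one({"jobs.build.steps": "mvn package"}).
# This helper emulates that matching semantics in pure Python:
def matches(doc: dict, dot_path: str, value) -> bool:
    node = doc
    for key in dot_path.split("."):
        if not isinstance(node, dict) or key not in node:
            return False
        node = node[key]
    # like MongoDB, a query value matches an array if the array contains it
    return value in node if isinstance(node, list) else node == value
```

Here `matches(jenkins_tpl, "jobs.build.steps", "mvn package")` succeeds while the GoCD template, with its different shape, simply doesn't match; no schema migration was needed to store both.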

September 29, 2021

Built with MongoDB: Leadgence

Leetal Gruper and Sergey Bahchissaraitsev worked together previously, but in 2019, they sat down to brainstorm a new direction. Leadgence was born out of their passion for data and business expertise and quickly grew, with Leetal as CEO and Sergey as CTO. With customers ranging from startups to Fortune 500 companies, they tripled their client base in the last quarter. “The Leadgence platform delivers banks, financial services, fintech, and insurance companies smart, actionable data about SMBs,” said David Citron, Partner at Global Founders Capital, which invested in Leadgence and partners with MongoDB for Startups to support its portfolio companies. “With Leadgence, customers see through the cluttered SMB landscape using industry-specific tags and change- and event-based triggers to tailor their outreach, knowing who to call, when, and, most importantly, why they are calling! SMBs are in turn getting offers for services they need, when they need them, saving them time and money that is always scarce when growing a small or midsize business.”

In this edition of #BuiltWithMongoDB, we chat with Sergey about the evolution of Leadgence, his favorite MongoDB features, and constantly learning new lessons as a company co-founder.

MongoDB: In your own words, what does Leadgence do, and how have your products evolved since you launched the company?

Sergey: Leadgence grows revenue for enterprise companies that target small and mid-size businesses. We started out offering pure smart data to support sales and marketing for financial services. Initially, there was no user-facing platform, but we had paying customers from day one who were getting smart, actionable data. Then, in December 2020, we launched our first application, DataSeeker, delivering actionable data intelligence triggered by events and changes.
Somewhere in between, we mapped the company’s future roadmap to include applications offering services to support the different needs of marketing, sales, growth, and risk-assessment teams. MongoDB is probably going to back all of our future applications.

MongoDB: How did you decide to build with MongoDB?

Sergey: It was pretty straightforward. First of all, MongoDB is the go-to database when you're talking about the back end for APIs. I was also already a bit familiar with it from working on a previous product. I guess the key point is that MongoDB Atlas makes things easy, so we started using the infrastructure-as-code approach and spun it up with Terraform. Atlas was the key feature that drew us to MongoDB. Its document-native approach makes sense with the Node.js that we use.

MongoDB: What has your experience been scaling with MongoDB?

Sergey: We started pretty small, so at first we were just running trials, but we eventually had millions of documents in the database. As we’ve scaled and started building our applications, we also use it as an analytics database. We basically run online queries on it, where our users can explore our data and get instant results. The MongoDB for Startups program also has been really helpful. They’ve supported us a lot and have been a real partner in our growth. The consulting sessions they offered helped us finalize our analytics database approach.

MongoDB: What is your favorite technical article or podcast?

Sergey: The Startups for the Rest of Us podcast!

MongoDB: What are you currently learning?

Sergey: I keep evolving on the entrepreneurial side of things. From a technical side, we keep facing challenges and solving them. Sometimes I’m learning about new things from a data science, machine learning, or data processing perspective. Other times I’m learning about scaling the company and bringing people on board. I guess I’m going to keep learning about these things for a while!
One big lesson I’ve learned is that when working on a problem, you should try to solve it in the simplest way possible. Complex solutions usually don’t work out in the end. So, if you solve something in a very simple way, it usually means that you understand what you’ve solved. You can make the greatest impact this way.

MongoDB: How do you upskill and continue educating yourself?

Sergey: I try to communicate with other professionals in entrepreneurship and technical spaces. Networking with individuals pursuing similar work helps me share perspectives and advice. It’s helpful to keep up these connections to understand what’s happening in the market, and what should or shouldn’t be done. Hearing others’ opinions about the market helps me understand the direction Leadgence should be going in, and what we need to pursue more deeply and analyze further.

MongoDB: What’s been the most challenging thing about building Leadgence?

Sergey: Building a business is, in two words, not simple. We at Leadgence work with cutting-edge technology that is evolving rapidly, requiring us to always be on top of the latest developments. Add to that onboarding new customers and the constant addition of new features and data requests, and I think you get the picture.

Interested in learning more about MongoDB for Startups? Learn more about us here.
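For developers curious what the kind of "online queries" Sergey describes might look like, here is a purely illustrative MongoDB aggregation pipeline. Leadgence's actual schema is not public, so the collection and field names (`smb_events`, `event_type`, `industry`) are invented, and the `aggregate()` call is commented out so the snippet runs without a database:

```python
# Illustrative only: field and collection names are made up, not
# Leadgence's real schema. A pipeline like this rolls up event-triggered
# SMB signals per industry for interactive exploration.
pipeline = [
    {"$match": {"event_type": "ownership_change"}},          # event-based trigger
    {"$group": {"_id": "$industry", "count": {"$sum": 1}}},  # roll up per industry
    {"$sort": {"count": -1}},                                # busiest industries first
]
# results = db.smb_events.aggregate(pipeline)  # served directly to the UI
```

With appropriate indexes on the matched fields, pipelines of this shape are what let a document database double as the interactive analytics layer Sergey mentions.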

September 29, 2021