MongoDB: Gateway to Open Finance and Financial Data Access
This is the second in a two-part series about open finance and the importance of a flexible data store to open finance innovation. Check out part one here!

Open finance is reshaping the financial services industry, pushing traditional institutions to modernize with a data-driven approach. Consumers increasingly expect personalized experiences, making innovation key to customer retention and satisfaction. According to a number of studies, 1 financial services are undergoing rapid, dynamic transformation, driven primarily by the impact of Banking-as-a-Service (BaaS), embedded banking services, and AI. These initiatives are powered mainly by API services intended for data sharing, which have become must-have technical capabilities for financial institutions. Open finance can also unlock massive opportunities for continuous innovation. As a result, financial institutions must equip themselves with the right tools and expertise to be fully aware of the potential risks and challenges of embarking on such a “data-driven” journey. Now, let’s dive deeper into an application of open finance with MongoDB.

MongoDB as the open finance data store

Integrating diverse financial data while ensuring its security, compliance, and scalability presents a series of considerable challenges for financial institutions. Bringing together data from a variety of backend systems entails a set of complex hurdles for financial ecosystem participants—banks, fintechs, and third-party providers (TPPs). First, they need to be able to handle structured, semi-structured, and increasingly unstructured data types. Then, cybersecurity and regulatory compliance concerns must be addressed. What’s more, an increase in data-sharing scenarios can open up potential vulnerabilities, which raises the risk of breach exposure and cyber-attacks (and, therefore, possible legal penalties and reputational damage).

Figure 1. The power of open finance.
To implement open finance strategies, organizations must first determine the role they will play: whether they will act as data holders, in charge of sharing data with TPPs, or as data users, able to provide enhanced financial capabilities to end-users. Then, they must choose the most suitable technology for the data management strategy—and this is where MongoDB comes in, functioning as the operational data store. Let’s explore how MongoDB can play a crucial role for both actors—data holders and data users—through an open finance functional prototype.

Open finance in action: Aggregated financial view for banking users

Figure 2 below shows a digital application from a fictional bank—Leafy Bank—that allows customers to aggregate all their bank accounts into a single platform.

Figure 2. Architecture of MongoDB as the open finance data store.

Four actors are involved in this scenario:

a. Customer - User
b. Data User - Leafy Bank
c. Data Holder - External Institution
d. Open Finance Data Store - MongoDB Atlas

Now let’s go through the steps from the customer experience.

Step 1. Log in to the banking application

Once logged in, the Leafy Bank digital banking application allows users to aggregate their external bank accounts. This is done behind the scenes through a RESTful API request that typically exchanges data in JSON format. For the Leafy Bank prototype, we are using MongoDB and FastAPI together, exposing and consuming RESTful APIs and thereby taking advantage of MongoDB Atlas’s high performance, scalability, and flexibility.

Figure 3. Logging in to the banking application.

Step 2. User authentication and authorization

A crucial step to ensure security and compliance is user consent. End-users are responsible for granting access to their financial information (authorization). In our case, Leafy Bank emulates OAuth 2.0 authentication.
The prototype generates the corresponding tokens for securing the service communication between participants. To achieve efficient interoperability without security issues, data holders must enable a secure technological “fence” for sharing data while preventing the operational risk of exposing core systems.

Figure 4. User authorization.

Step 3. Data exposure

After authorization has been granted, Leafy Bank will fetch the corresponding account data from the data custodians—external banks (in our fictional scenario, Green Bank or MongoDB Bank)—via APIs. Usually, participants expose customers’ financial data (accounts, transactions, and balances) through their exposed services in JSON format to ensure compatibility and seamless data exchange. Because MongoDB stores data in BSON, a superset of JSON, it allows seamless storage and retrieval of JSON-like data—making it an ideal backend for open finance.

Figure 5. Data exposure.

Step 4. Data fetching

The retrieved financial data is then pushed into the open finance data store—in our case, MongoDB Atlas—where it is centrally stored. Unlike rigid relational databases, MongoDB uses a flexible schema model, making it easy for financial institutions to aggregate diverse data structures from different sources and to adapt without costly migrations or downtime—ideal for dynamic ecosystems.

Figure 6. Data fetching from data holder into MongoDB Atlas Data Store.

Step 5. Data retrieval

Now that the data has been aggregated in the operational data store (powered by MongoDB Atlas), Leafy Bank can leverage MongoDB Aggregation Pipelines for real-time data analysis and enrichment. To become “open finance” compliant, Leafy Bank provides a holistic financial view and a global position accessible in a single application, improving individuals’ experience with their finances. Furthermore, this set of features also benefits financial institutions.
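The “global position” view from Step 5 can be expressed as a MongoDB aggregation pipeline. The sketch below is illustrative rather than the prototype's actual code: field names such as `user_id`, `bank`, `balance`, and `account_number` are assumptions about how the aggregated account documents are shaped.

```python
def global_position_pipeline(user_id: str) -> list[dict]:
    """Aggregate a customer's accounts (ingested from several data holders)
    into a per-bank summary with total balances."""
    return [
        # Only this customer's documents.
        {"$match": {"user_id": user_id}},
        # One summary document per connected bank.
        {"$group": {
            "_id": "$bank",
            "total_balance": {"$sum": "$balance"},
            "accounts": {"$push": "$account_number"},
        }},
        # Largest holdings first.
        {"$sort": {"total_balance": -1}},
    ]


# In the application this would run server-side in MongoDB:
# db.accounts.aggregate(global_position_pipeline("user-123"))
```

Because the pipeline executes inside the database, the enriched view is computed close to the data rather than in application code.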
Financial institutions can unveil useful insights for building unique services meant to enhance customers’ financial well-being.

Figure 7. Data retrieval from MongoDB Atlas Data Store.

Step 6. Bank connected!

In the end, customers can view all their finances in one place, while banks can offer competitive, data-driven, tailored services.

Figure 8. Displaying the bank connection in Leafy Bank.

Demo in action

Now, let’s combine these steps into a real-world demo application:

Figure 9. Leafy Bank - MongoDB as the Open Finance Data Store.

Advantages of MongoDB for open finance

Open finance presents opportunities for all ecosystem participants. On the one hand, bank customers can benefit from tailored experiences. For personal financial management, it can provide end-users central visibility of their bank accounts. And open finance can enable extended payment initiation services, financial product comparison, enhanced insurance premium assessments, more accurate loan and credit scoring, and more.

From a technical standpoint, MongoDB can empower data holders, data users, and TPPs to build open finance solutions. By offering a flexible schema, banks can adapt to open finance’s evolving requirements and regulatory changes while avoiding the complexity of rigid schemas, yet still allowing secure and manageable schema validation where required. Furthermore, a scalable (vertical and horizontal) and cloud-native (multi-cloud) platform like MongoDB simplifies data sharing in JSON format, which has been widely adopted as the de facto data interchange format—making it ideal for open finance applications. Internally, MongoDB uses BSON, the binary representation of JSON, for efficient storage and data traversal.

MongoDB’s rich ecosystem of drivers and connectors supports a variety of frameworks for RESTful API development. Besides FastAPI, there are libraries for Express.js (Node.js), Django (Python), Spring Boot (Java), and Flask (Python).
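The optional schema validation mentioned above is configured with a `$jsonSchema` validator at the collection level. The sketch below is an assumption about what a shared-account schema might enforce, not a schema taken from the demo; the required fields and currency list are invented for illustration.

```python
# Required fields and types for every document in the "accounts" collection.
# Fields not listed here remain allowed, so the schema stays flexible.
account_validator = {
    "$jsonSchema": {
        "bsonType": "object",
        "required": ["user_id", "bank", "balance", "currency"],
        "properties": {
            "user_id": {"bsonType": "string"},
            "bank": {"bsonType": "string"},
            "balance": {"bsonType": ["double", "int", "decimal"]},
            "currency": {"bsonType": "string", "enum": ["USD", "EUR", "GBP"]},
        },
    }
}

# Applied when the collection is created (or later via the collMod command):
# db.create_collection("accounts", validator=account_validator)
```

This keeps the best of both worlds: documents can vary in shape, but every one of them must carry the fields the open finance APIs depend on.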
The goal is to empower developers with an intuitive and easy-to-use data platform that boosts productivity and performance. Additionally, MongoDB offers key features like its aggregation pipeline, which is designed to process data efficiently by simplifying complex transformations, real-time analytics, and detailed queries. These aggregation capabilities allow financial institutions to improve their agility while maintaining their competitive edge, with data as their strategic advantage.

Lastly, MongoDB provides financial institutions with critical built-in security controls, including encryption, role-based access control (RBAC), and auditing. It seamlessly integrates with existing security protocols and compliance standards while enforcing privileged access controls and continuous monitoring to safeguard sensitive data, as detailed in the MongoDB Trust Center.

Check out these additional resources to get started on your open finance journey with MongoDB:

Read part one of our series to discover why a flexible data store is vital for open finance innovation.
Explore our GitHub repository for an in-depth guide on implementing this solution.
Visit our solutions page to learn more about how MongoDB can support financial services.
How Cognistx’s SQUARY AI is Redefining Information Access
In a world where information is abundant but often buried, finding precise answers can be tedious and time-consuming. People spend hours a week simply searching for the information they need. Cognistx, an applied AI startup and a member of the MongoDB for Startups program, is on a mission to eliminate this inefficiency. Through its flagship product, SQUARY AI, the company is building tools to make information retrieval faster, more reliable, and radically simpler. As Cognistx seeks to unlock the future of intuitive search with speed, accuracy, and innovation, MongoDB Atlas serves as a reliable backbone for the company’s data operations.

A company journey: From bespoke AI projects to a market-ready solution

Cognistx started its journey with a focus on developing custom AI solutions for clients. Over time, the company identified a common pain point across industries: the need for efficient, high-quality tools to extract actionable insights from large volumes of data. This realization led it to pivot toward a product-based approach, culminating in the development of SQUARY AI—a next-generation intelligent search platform. SQUARY AI’s first iteration was born out of a bespoke project. The goal was to build a smart search engine capable of extracting answers to open-ended questions across multiple predefined categories. Early on, the team incorporated features like source tracking to improve trustworthiness and support human-assisted reviews, ensuring that the AI’s answers could be verified and trusted. Seeing the broader potential of its technology, Cognistx began applying advancements in natural language processing and machine learning, transforming its early work into a stand-alone product designed for diverse industries.
The evolution of SQUARY AI: Using state-of-the-art large language models

Cognistx initially deployed traditional machine learning approaches to power SQUARY AI’s search capabilities, such as conversation contextualization and multi-hop reasoning (the ability to combine information from multiple sources to form a more complete answer). Before the rise of large language models (LLMs), this was no small feat. Today, SQUARY AI incorporates state-of-the-art LLMs to elevate both speed and precision. The platform uses a combination of retrieval-augmented generation (RAG), custom text-cleaning methods, and advanced vector search techniques.

MongoDB Atlas integrates seamlessly into this ecosystem. MongoDB Atlas Vector Search powers SQUARY AI’s advanced search capabilities and lays the groundwork for even faster and more accurate information retrieval. With MongoDB Atlas, the company can store vectorized data alongside the rest of its operational data. There’s no need to add a separate, stand-alone database to handle vector search. MongoDB Atlas serves as both the operational data store and the vector data store.

Cognistx offers multiple branches of SQUARY AI, including:

SQUARY Chat: Designed for public-facing or intranet deployment, these website chatbots provide instant, 24/7 access to website content, eliminating the need for human agents. SQUARY Chat also empowers website owners with searchable, preprocessed AI insights from user queries. These analytics enable organizations to directly address customer needs, refine marketing strategies, and ensure that their sites contain the most relevant and valuable information for their audiences.

SQUARY Enterprise: Built with businesses in mind, this enterprise platform helps companies retrieve precise answers from vast and unorganized knowledge bases. Whether it’s assisting employees or streamlining review processes, this tool helps organizations save time, improve team efficiency, and deliver actionable insights.
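A retrieval query with Atlas Vector Search is expressed as an aggregation stage. The sketch below builds such a stage; the index name (`vector_index`), the embedding field (`embedding`), and the candidate multiplier are assumptions that would have to match the actual Atlas index definition, and the query vector would normally come from an embedding model.

```python
def vector_search_stage(query_vector: list[float], limit: int = 5) -> dict:
    """Build a $vectorSearch aggregation stage for MongoDB Atlas.

    "vector_index" and "embedding" are placeholder names; they must match
    the vector search index configured on the collection in Atlas.
    """
    return {
        "$vectorSearch": {
            "index": "vector_index",
            "path": "embedding",
            "queryVector": query_vector,
            # Scan more candidates than the final limit for better recall.
            "numCandidates": max(limit * 20, 100),
            "limit": limit,
        }
    }


# Used as the first stage of an aggregation pipeline against the same
# collection that holds the operational documents:
# results = collection.aggregate([vector_search_stage(embedding, limit=5)])
```

Because the embeddings live next to the source documents, a RAG system can retrieve both the matching text and its metadata (for source tracking) in one query.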
One of the standout features of SQUARY AI is its AI-driven metrics, which assess system performance and provide insights into user interests and requirements. This is particularly valuable for public-facing website chatbots.

A powerful database: How MongoDB powers SQUARY AI

Cognistx attributes much of its technical success to MongoDB. The company’s history with MongoDB spans years, and its trust in MongoDB’s performance and reliability made the database the obvious choice for powering SQUARY AI. “MongoDB has been pivotal in our journey,” said Cognistx Data Scientist Ihor Markevych. “The scalable, easy-to-use database has allowed us to focus on innovating and refining SQUARY AI without worrying about infrastructure constraints. With MongoDB’s support, we’ve been able to confidently scale as our product grows, ensuring both performance and reliability.”

The team’s focus when selecting a database was on cost, convenience, and development effort. MongoDB checked all those boxes, said Markevych. The company’s expertise with MongoDB, coupled with years of consistent satisfaction with its performance, made it the obvious choice. With no additional ramp-up effort necessary, the team was able to deploy very quickly.

In addition to MongoDB Atlas Vector Search, the other critical feature of MongoDB is its scalability, which Markevych described as seamless. “Its intuitive structure enables us to monitor usage patterns closely and scale up or down as needed. This flexibility ensures we’re always operating efficiently without overcommitting resources,” Markevych said.

The MongoDB for Startups program has also been instrumental in the company’s success. The program provides early-stage startups with free MongoDB Atlas credits, technical guidance, co-marketing opportunities, and access to a network of partners.
With help from MongoDB technical advisors, the Cognistx team is now confidently migrating data from OpenSearch to MongoDB Atlas to achieve better performance at a reduced cost. The free MongoDB Atlas credits enabled the team to experiment with various configurations to optimize the product further. It also gained access to a large network of like-minded innovators. “The MongoDB for Startups community has provided invaluable networking opportunities, enhancing our visibility and connections within the industry,” Markevych said.

The future: Scaling for more projects

Looking ahead, Cognistx is focusing on making SQUARY AI even more accessible and customizable. Key projects include automating the onboarding process, which will enable users to define and fine-tune system behavior from the start. The company also aims to expand SQUARY AI’s availability across various marketplaces. With a successful launch on AWS Marketplace, the company next hopes to offer its product on WordPress, making it simple for businesses to integrate SQUARY Chat into their websites. Cognistx is continuing to refine SQUARY AI’s balance between speed, accuracy, and usability. By blending cutting-edge technologies with a user-centric approach, the company is shaping the future of how people access and interact with information.

See it in action

Cognistx isn’t just building a tool; it’s building a movement toward intuitive, efficient, and conversational search. Experience the possibilities for yourself—schedule a demo of SQUARY AI today. To get started with vector search in MongoDB, visit our MongoDB Atlas Vector Search Quick Start guide.
Embracing Open Finance Innovation with MongoDB
The term "open finance" is increasingly a topic of discussion among banks, fintechs, and other financial services providers—and for good reason. Open finance, as the next stage of open banking, expands the scope of data sharing beyond traditional banking to include investments, insurance, pension funds, and more. To deliver these enhanced capabilities, financial service providers need a versatile and flexible data store that can seamlessly manage a wide array of financial data. MongoDB serves as an ideal solution, providing a unified data platform that empowers financial services providers to integrate various data sources, enabling real-time analytics, efficient data retrieval, and scalability. These capabilities are pivotal in enhancing customer experiences, providing users with a comprehensive view of their finances, and empowering them with greater visibility and control over their own data. By adopting MongoDB, financial services providers can seamlessly adapt to the growing demands of open finance and deliver innovative, data-driven solutions.

Open finance's past and future

As highlighted in a study conducted by the Cambridge Centre for Alternative Finance, 1 the terms 'open banking' and 'open finance' vary globally. Acknowledging these differences, we'll focus on the model displayed in Figure 1 due to its widespread adoption and relevance in our study.

Figure 1. The three waves of innovation in financial services.

The development of open finance started with open banking, which aimed to promote innovation by having banks allow customers to share their financial data with third-party providers (TPPs) and let those TPPs—fintech and techfin companies—initiate transactions on their behalf, solely in the context of payments. This proved to be an effective way to promote innovation, and it led to a broader spectrum of financial products, adding loans, mortgages, savings, pensions, insurance, investments, and more.
This led to a new directive, commonly referred to as open finance. If we take a step further—regardless of its final implementation—a third development called open data suggests sharing data beyond the traditional boundaries of the financial services industry (FSI), exponentially increasing the potential for financial services by moving into cross-sector offerings and positioning FSI as a horizontal industry rather than the independent vertical it was previously known as.

Who and what plays a role in open finance?

Among the different actors across open finance, the most important are:

Consumers: End-users empowered to grant or revoke consent to share their data, primarily through digital channels.
Data holders: Mainly financial services companies, and thereby consumer data custodians. They are responsible for controlling the data flow across the different third-party providers (TPPs).
Data users: Common third-party providers offering their services based on consumers’ data (upon request/consent).
Connectivity providers: Trusted intermediaries that facilitate data flow, also known as TSPs in the EU and UK, and Account Aggregators in India.
Regulatory authorities: Set standards, oversee processes, and may intervene in open finance implementation. They may vary according to the governance type.

The interactions between all these different parties define the pillars of open finance:

Technology: Ensures secure data storage and the exposure and consumption of services.
Standards: Establishes frameworks for data interchange schemas.
Regulations and enforceability: Encompasses security policies and data access controls.
Participation and trust: Enables traceability and reliability within a regulated ecosystem.

Figure 2. High-level explanation of data sharing in open finance.
Drivers behind open finance: Adoption, impact, and compliance

Open finance seeks to stimulate innovation by promoting competition, safeguarding consumer privacy, and ensuring market stability—ultimately leading to economic growth. Additionally, it has the potential to provide financial institutions with greater access to data and better insights into consumers' preferences, allowing them to tailor their offerings and enhance user experiences. This data sharing between the ecosystem’s participants requires a regulated set of rules to ensure data protection, security, and compliance according to each jurisdiction.

As seen in Figure 3 below, there are two broad drivers of open finance adoption: regulation-led and market-driven adoption. Whether organizations adopt open finance depends on factors like market dynamics, digital readiness, and regulatory environment.

Figure 3. An illustrative example of open finance ecosystem maturity.

Even though there is no single, official legal framework specifying how to comply with open finance, countries around the world have crafted their own specific sets of norms as guiding principles. Recent market research reports reveal how several countries are already implementing open finance solutions, each coming from a different starting point, with its own economic goals and policy objectives. In Europe, the Revised Payment Services Directive (PSD2) combined with the General Data Protection Regulation (GDPR) forms the cornerstone of the regulatory framework. The European Commission published a proposal in June 2023 for a regulation on a framework for Financial Data Access 2 (FiDA), set to go live in 2027. 3 In the UK, open finance emerged from the need to address the market power held by a few dominant banks. In India, open finance emerged as a solution to promote financial inclusion by enabling identity verification for account opening through the national ID system.
As the European Commission puts it:

“The aim is to create a single European data space – a genuine single market for data, open to data from across the world – where personal as well as non-personal data, including sensitive business data, are secure and businesses also have easy access to an almost infinite amount of high-quality industrial data, boosting growth and creating value, while minimising the human carbon and environmental footprint.” 4

Build vs. buy: Choosing the right open finance strategy

One of the biggest strategic decisions financial institutions face is whether to build their own open finance solutions in-house or buy from third-party open finance service providers. Both approaches come with trade-offs:

Building in-house provides full ownership, flexibility, and control over security and compliance. While it requires significant investment in infrastructure, talent, and ongoing maintenance, it ensures a lower total cost of ownership (TCO) in the long run, avoids vendor lock-in, and offers complete traceability—reducing reliance on external providers and eliminating “black box” risks. Institutions that build their own solutions also benefit from customization to fit specific business needs and evolving regulations.

Buying from a provider accelerates time to market and reduces development costs while ensuring compliance with industry standards. However, it introduces potential challenges such as vendor lock-in, limited customization, and integration complexities with existing systems.

For financial institutions that prioritize long-term cost efficiency, compliance control, and adaptability, the build approach offers a strategic advantage—though it comes with its own set of challenges.

What are the challenges and why do they matter?

As open finance continues to evolve, it brings significant opportunities for innovation—but also introduces key challenges that financial institutions and fintech companies must navigate.
These challenges impact efficiency, security, and compliance, ultimately influencing how quickly new financial products and services can reach the market.

1. Integration of data from various sources

Open finance relies on aggregating data from multiple institutions, each with different systems, APIs, and data formats. This complexity leads to operational inefficiencies, increased latency, and higher costs associated with data processing and infrastructure maintenance. Without seamless integration, financial services struggle to provide real-time insights and a frictionless user experience.

2. Diverse data types

Financial data comes in various formats—structured, semi-structured, and unstructured—which creates integration challenges. Many legacy systems operate with rigid schemas that don’t adapt well to evolving data needs, making it difficult to manage new financial products, regulations, and customer demands. Without flexible data structures, innovation is slowed, and interoperability between systems becomes a persistent issue.

3. Data security

With open finance, vast amounts of sensitive customer data are shared across multiple platforms, increasing the risk of breaches and cyberattacks. A single vulnerability in the ecosystem can lead to data leaks, fraud, and identity theft, eroding customer trust. Security vulnerabilities have financial consequences and can result in legal scrutiny and long-term reputational damage.

4. Regulatory compliance

Navigating a complex and evolving regulatory landscape is a major challenge for open finance players. Compliance with data protection laws, financial regulations, and industry standards—such as GDPR or PSD2—requires constant updates to systems and processes. Failure to comply can lead to legal penalties, substantial fines, and loss of credibility—making it difficult for institutions to operate confidently in a global financial ecosystem.
These challenges directly impact the ability of financial institutions to innovate and launch new products quickly. Integration issues, security concerns, and regulatory complexities contribute to longer development cycles, operational inefficiencies, and increased costs—ultimately slowing the time to market for new financial services. In a highly competitive industry where speed and adaptability are critical, overcoming these challenges is essential for success in open finance.

MongoDB as the open finance data store

To overcome open finance’s challenges, a flexible, scalable, secure, and high-performing data store is required. MongoDB is an ideal solution, as it offers a modern, developer-friendly data platform that accelerates innovation while meeting the critical demands of financial applications.

Seamless integration with RESTful JSON APIs

According to OpenID’s 2022 research, most open finance ecosystems adopt RESTful JSON APIs as the standard for data exchange, ensuring interoperability across financial institutions, third-party providers, and regulatory bodies. MongoDB’s document-based model natively supports JSON, making it the perfect backend for open banking APIs. This enables financial institutions to ingest, store, and process API data efficiently while ensuring compatibility with existing and emerging industry standards.

Flexible data model for seamless integration

Open finance relies on diverse data types from multiple sources, each with different schemas. Traditional relational databases require rigid schema migrations, often causing downtime and disrupting high-availability services. MongoDB's document-based model—with its flexible schema—offers an easy, intuitive, and developer-friendly solution that eliminates bottlenecks, allowing financial institutions to adapt data structures dynamically, all without costly migrations or downtime.
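To illustrate that flexibility, here are two account documents with different shapes that can live side by side in one collection; the field names and values are invented for the example.

```python
import json

# Two account documents from different data holders, with different shapes,
# can share a collection: no ALTER TABLE, no migration window.
checking_account = {
    "user_id": "u1",
    "type": "checking",
    "bank": "Green Bank",
    "balance": 1250.75,
}
pension_plan = {
    "user_id": "u1",
    "type": "pension",
    "provider": "Acme Pensions",
    "contributions": [{"year": 2024, "amount": 3000.0}],
}

# Both would be inserted into the same collection in one call:
# db.accounts.insert_many([checking_account, pension_plan])

# And both serialize directly to the JSON exchanged over open finance APIs:
payload = json.dumps([checking_account, pension_plan])
```

When a new data source appears with yet another shape, its documents are simply inserted alongside the existing ones, and queries that only touch shared fields (like `user_id`) keep working unchanged.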
This flexibility ensures seamless integration of structured, semi-structured, and unstructured data, increasing productivity and performance while remaining cost-effective, and it enables faster iteration, reduced complexity, and continuous scalability.

Enterprise-grade security and compliance

Security and compliance are non-negotiable requirements in open finance, where financial data must be protected against breaches and unauthorized access. MongoDB provides built-in security controls, including encryption, role-based access control, and auditing. It seamlessly integrates with existing security protocols and compliance standards, ensuring adherence to regulations such as GDPR and PSD2. MongoDB also enforces privileged access controls and continuous monitoring to safeguard sensitive data, as outlined in the MongoDB Trust Center.

Reliability and transactional consistency

Financial applications demand zero downtime and high availability, especially when processing transactions and real-time financial data. MongoDB’s replica sets ensure continuous availability, while its support for ACID transactions guarantees data integrity and consistency—critical for handling sensitive financial operations such as payments, lending, and regulatory reporting.

The future of open finance

The evolution of open finance is reshaping the financial industry, enabling seamless data sharing while introducing new challenges in security, compliance, and interoperability. As financial institutions, fintechs, and regulators navigate this shift, the focus remains on balancing innovation with risk management to build a more inclusive and efficient financial ecosystem. For organizations looking to stay ahead in this landscape, choosing the right technology stack is crucial. MongoDB provides the flexibility, scalability, and security needed to power the next generation of open finance applications—helping financial institutions accelerate innovation while ensuring compliance and data integrity.
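The multi-document ACID transactions described under "Reliability and transactional consistency" are what make patterns like a two-leg transfer safe. The sketch below builds the two update operations and shows (commented) how a driver such as PyMongo would run them in one transaction; the collection and field names are assumptions for illustration, not code from this article.

```python
def transfer_ops(from_acct: str, to_acct: str, amount: float):
    """Build the two update operations for an inter-account transfer.

    Meant to run inside a single multi-document transaction so that both
    legs commit together or not at all. Field names are illustrative.
    """
    debit = (
        # Filter requires sufficient funds, so the debit cannot overdraw.
        {"account_number": from_acct, "balance": {"$gte": amount}},
        {"$inc": {"balance": -amount}},
    )
    credit = (
        {"account_number": to_acct},
        {"$inc": {"balance": amount}},
    )
    return debit, credit


# Sketch of the transactional execution with a MongoDB driver (PyMongo):
# with client.start_session() as session:
#     with session.start_transaction():
#         debit, credit = transfer_ops("ACC-1", "ACC-2", 100.0)
#         db.accounts.update_one(*debit, session=session)
#         db.accounts.update_one(*credit, session=session)
```

If either update fails, the transaction aborts and neither balance changes, which is the integrity guarantee payments and lending workloads rely on.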
In Part 2 of our look at open finance, we’ll explore a demo from the Industry Solutions team that leverages MongoDB to implement an open finance strategy that enhances customer experience, streamlines operations, and drives financial accessibility. Stay tuned!

Head over to our GitHub repo to view the demo.
Visit our solutions page to learn more about how MongoDB can support financial services.

1 CCAF, The Global State of Open Banking and Open Finance (Cambridge: Cambridge Centre for Alternative Finance, Cambridge Judge Business School, University of Cambridge, 2024).
2 “The Financial Data Access (FiDA) Regulation,” financial-data-access.com, 2024, https://www.financial-data-access.com/
3 Maout, Thierry, “What is Financial Data Access (FiDA), and how to get ready?”, July 16, 2024, https://www.didomi.io/blog/financial-data-access-fida?315c2b35_page=2
4 European Commission (2020), Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, EUR-Lex.
Innovating with MongoDB | Customer Successes, March 2025
Hello and welcome! This is the first installment of a new bi-monthly blog series showcasing how companies around the world are using MongoDB to tackle mission-critical challenges. As the leading database for modern applications, MongoDB empowers thousands of organizations to harness the power of their data and to drive creativity and efficiency across industries. This series will shine a light on some of those amazing stories.

From nimble startups to large enterprises, our customers are transforming data management, analytics, and application development with MongoDB's flexible schema, scalability, and robust cloud services. What do I mean? Picture retailers like Rent the Runway improving customer experiences with real-time analytics, fintech companies such as Koibanx speeding up and securing transaction processes, and healthcare companies like Novo Nordisk optimizing the path to regulatory approvals. With MongoDB, every developer and organization can fully tap into the potential of their most valuable resource: their data. So please read on—and stay tuned for more in this blog series!—to learn about the ingenuity of the MongoDB customer community, and how they’re pushing the boundaries of what's possible.

Lombard Odier

Lombard Odier, a Swiss bank with a legacy dating back to 1796, transformed its application architecture with MongoDB to stay at the forefront of financial innovation. Confronted with the challenge of modernizing its systems amidst rapid digital and AI advancements, the bank leveraged MongoDB’s Application Modernization Factory and generative AI to streamline its application upgrades. This initiative resulted in up to 60x faster migration of simple code and slashed regression testing from three days to just three hours. By transitioning over 250 applications to MongoDB, including its pivotal portfolio management system, Lombard Odier significantly reduced technical complexity and empowered its developers to focus on next-generation technologies.
SonyLIV

SonyLIV faced challenges with its over-the-top (OTT) video-streaming platform. Its legacy relational database had poor searchability, complex maintenance, and slow content updates. Critically, it lacked the scalability necessary to support 1.6 million simultaneous users. To power its new CMS—‘Blitz’—SonyLIV selected MongoDB Atlas’s flexible document model to improve performance and lower search query latency by 98%. Collaborating with MongoDB Professional Services, SonyLIV optimized API latency using MongoDB Atlas Search and Atlas Online Archive, effectively managing over 500,000 content items and real-time updates. With its new high-performing, modern solution in place, SonyLIV can now deliver flawless customer experiences to the world, faster.

Swisscom

Swisscom, Switzerland's leading telecom and IT service provider, harnessed MongoDB to enrich its banking sector insights with AI. Faced with the challenge of streamlining access to its extensive library of over 3,500 documents, Swisscom utilized MongoDB Atlas and MongoDB Atlas Vector Search capabilities to transform unstructured data into precise, relevant content summaries in seconds. In just four months, Swisscom launched a production-ready platform with improved relevance, concrete answers, and transparency. The project sets a new standard in Swiss banking, and showcases Swisscom's commitment to driving the digital future with advanced AI solutions.

Victoria’s Secret

Victoria's Secret’s e-commerce platform processes thousands of transactions daily across over 2.5 billion documents on hundreds of on-premises databases. Experiencing high costs and operational constraints with its monolithic architecture, the retailer initially adopted CouchDB but faced challenges like data duplication and limited functionality. In 2023, Victoria's Secret migrated to MongoDB Atlas on Azure, achieving zero downtime while optimizing performance and scalability.
Over four months, they successfully migrated more than four terabytes of data across 200 databases, reducing CPU core usage by 75% and achieving a 240% improvement in API performance. The move to MongoDB also allowed the retailer to introduce additional products, like MongoDB Atlas Vector Search, resulting in significant operational efficiencies and cost savings.

Video spotlight

Before you go, be sure to watch one of our recent customer videos featuring the Danish pharmaceutical giant, Novo Nordisk. Discover how Novo Nordisk leveraged MongoDB and GenAI to reduce the time it takes to produce a Clinical Study Report (CSR) from 12 weeks to 10 minutes. Want to get inspired by your peers and discover all the ways we empower businesses to innovate for the future? Visit our Customer Success Stories hub to see why these customers, and so many more, build modern applications with MongoDB.
Modernizing Telecom Legacy Applications with MongoDB
The telecommunications industry is undergoing a profound transformation, fueled by innovations in 5G networks, the growth of Internet of Things applications, and the rapid rise of AI. To capitalize on these technologies, companies must effectively handle increasing volumes of unstructured data, which now represents up to 90% of all information, while also developing modern applications that are flexible, high-performance, and scalable. However, the telecommunications industry's traditional reliance on relational databases such as PostgreSQL presents a challenge to modernization. Their rigid structures limit adaptability and can lead to decreased performance as table complexity grows. With this in mind, this blog post explores how telecom companies can modernize their legacy applications by leveraging MongoDB’s modern database and its document model. With MongoDB, telecom companies can take advantage of the latest industry innovations while freeing their developers from the burdens of maintaining legacy systems.

Navigating legacy system challenges

Legacy modernization refers to the process of updating a company’s IT infrastructure to align it with the latest technologies and workflows, ultimately advancing and securing strategic business goals. For telecom companies, this modernization involves overcoming the limitations of legacy systems that hinder adaptation to changing market conditions, which demand greater system scalability and availability to run real-time operations. The main drawbacks of legacy technologies like relational databases stem from their design, which wasn’t built to support the data processing capabilities required for modern telecom services. These limitations, as illustrated in Figure 1 below, include rigid data schemas, difficulty handling complex data formats, limited scaling ability, and higher operational costs for maintenance.

Figure 1. The limitations of legacy systems.
Expanding on these limitations, relational databases depend on a predefined schema, which becomes difficult to modify once established, as changes entail extensive restructuring efforts. In telecommunications, handling growing data volumes from connected devices and 5G networks can rapidly become burdensome and costly due to frequent CPU, storage, and RAM upgrades. Over time, technology lock-in can further escalate costs by hindering the transition to alternative solutions. Altogether, these factors hold back modernization efforts, urging telecoms to move their legacy systems to newer technologies. To overcome these challenges, telecom companies are replacing legacy systems with modern applications that provide greater scalability, enhanced security, and high availability, as shown in Figure 2. However, this transition can be a daunting task for some organizations due to the complexity of current systems, a lack of internal technical expertise, and the hurdles of avoiding downtime. Therefore, before transforming their outdated systems, telecom companies must carefully select the appropriate technologies and formulate a modernization strategy to facilitate this transition.

Figure 2. Characteristics of modern applications.

Getting onboard with MongoDB

Enter MongoDB. The company’s document-oriented database offers a flexible data model that processes any information format, easily adapting to specific application requirements. MongoDB Atlas—MongoDB’s unified, modern database—delivers a robust cloud environment that efficiently manages growing data volumes through its distributed architecture, ensuring seamless connectivity and enhanced performance. Moreover, as telecom providers prioritize cybersecurity and innovation, MongoDB includes robust security measures—comprising encryption, authentication, authorization, and auditing—to effectively protect sensitive information and ensure regulatory compliance.
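To make the schema-flexibility contrast concrete, here is a minimal Python sketch. Plain dicts stand in for documents, and the `find` helper is a hypothetical stand-in for a driver call, not real MongoDB client code: the point is that two documents in the same collection can have different shapes, so adding a field needs no ALTER TABLE-style migration.

```python
# Illustrative only: plain Python dicts stand in for MongoDB documents.
mobile_plan = {"_id": 1, "type": "mobile", "data_gb": 50}
cable_plan = {"_id": 2, "type": "cable", "channels": 180,
              "add_ons": ["sports", "movies"]}  # extra fields, no schema change

plans = [mobile_plan, cable_plan]

def find(collection, **criteria):
    # Hypothetical stand-in for a database find(): equality match on fields.
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in criteria.items())]

print(find(plans, type="cable"))  # the cable plan, extra fields intact
```

In a relational schema, the cable plan's `add_ons` list would require a new table or column migration; in the document model it is simply another field on that document.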
Additionally, leveraging MongoDB’s document model with built-in Atlas services like Vector Search, Atlas Charts, and Stream Processing allows telecommunications organizations to streamline advanced industry use cases, including single customer view, AI integrations, and real-time analytics.

Figure 3. Core MongoDB features for modernization.

Recognizing these benefits, leading telecom companies like Nokia, Swisscom, and Vodafone have successfully modernized their applications with MongoDB. However, selecting the right technology is only part of the modernization process. To ensure a successful and effective modernization project, organizations should establish a comprehensive modernization strategy. This process typically follows one of three paths:

Data-driven modernization: this approach transfers all data from the legacy system to the new environment and then migrates applications.

Application-driven modernization (all-or-nothing): this approach executes all reads and writes for new applications in the new data environment from the start, but leaves the business to decide when to retire existing legacy applications.

Iterative modernization (one-step-at-a-time): this approach blends the previous two paths, starting with the modernization of the least complex applications and incrementally moving on to more complex ones.

Read this customer story to learn more about telecoms migrating to MongoDB. With this overview complete, let's dive into the migration process by examining the iterative modernization of a telecom billing system.

Modernizing a telecom billing system

Telecom billing systems often consist of siloed application stacks segmented by product lines like mobile, cable, and streaming services. This segmentation leads to inefficiencies and overly complex architectures, highlighting the need to simplify these structures.
With this in mind, imagine a telecom company that has decided to modernize its entire billing system to boost performance and reduce complexity. In the initial stage, telecom developers assess the scope of the modernization project, scoring individual applications based on technical sustainability and organizational priorities. Applications with high scores undergo further analysis to estimate the re-platforming effort required. Later on, a cross-functional team selects the first component to migrate to MongoDB, initiating the billing system modernization. This journey then follows the steps outlined in Figure 4:

Figure 4. The modernization process.

First, developers analyze legacy systems by examining the codebase and the underlying architecture of the chosen billing system. Then, developers create end-to-end tests to ensure the application functions correctly when deployed. Later, developers design an architecture that incorporates management's expectations for the desired application. Next, developers rewrite and recode the legacy application to align with the document model and develop APIs for MongoDB interaction. Following this, developers conduct user tests to identify and resolve any existing application bugs. Finally, developers migrate and deploy the modernized application in MongoDB, ensuring full functionality. Throughout this process, developers can leverage MongoDB Relational Migrator to streamline the transition. Relational Migrator helps developers with data mapping and modeling, SQL object conversion, application code generation, and data migration—corresponding to steps three, four, and five. Additionally, telecom companies can accelerate modernization initiatives by leveraging MongoDB Professional Services for dedicated, tailored end-to-end migration support. Our experts work closely with you to provide customized assistance, from targeted technical support and development resources to strategic guidance throughout the entire project.
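The application-scoring step in the assessment stage above can be sketched in a few lines of Python. This is a hypothetical illustration only: the criteria, weights, and application names are invented for the example, not a MongoDB-prescribed scoring method.

```python
# Hypothetical prioritization sketch: weights and criteria are invented.
WEIGHTS = {"tech_debt": 0.4, "business_value": 0.4, "migration_risk": 0.2}

def modernization_score(app):
    # Higher technical debt and business value raise migration priority;
    # higher migration risk lowers it.
    return (WEIGHTS["tech_debt"] * app["tech_debt"]
            + WEIGHTS["business_value"] * app["business_value"]
            - WEIGHTS["migration_risk"] * app["migration_risk"])

apps = [
    {"name": "billing-mobile", "tech_debt": 9, "business_value": 8, "migration_risk": 3},
    {"name": "billing-cable", "tech_debt": 6, "business_value": 7, "migration_risk": 7},
]

ranked = sorted(apps, key=modernization_score, reverse=True)
print([a["name"] for a in ranked])  # ['billing-mobile', 'billing-cable']
```

In this toy ranking, the high-debt, high-value mobile billing component would be the first candidate for migration, matching the iterative, least-risk-first strategy described above.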
Building on this initial project, telecom companies can progressively address more complex applications, refining their approach to support a long-term modernization strategy.

Next steps

By revamping legacy applications with MongoDB, telecom companies can improve their operations and gain a competitive edge with advanced technology. This shift allows telcos to apply the latest innovations and free developers from the burdens of maintaining legacy systems. Start your journey to migrate core telecom applications to MongoDB Atlas by visiting our telecommunications solutions page. To discover how to upgrade your telco legacy systems with MongoDB, start with the following resources:

Visit our professional services page to learn more about MongoDB Consulting

YouTube: Relational Migrator Explained in 3 minutes

White paper: Unleash Telco Transformation with an Operational Data Layer

White paper: Modernization: What’s Taking So Long?
ZEE5: A Masterclass in Migrating Microservices to MongoDB Atlas
ZEE5 is a leading Indian over-the-top (OTT) video-streaming platform that delivers streamed content via Internet-connected devices. The platform offers a wide variety of content—movies, TV shows, web series, and original programming—across multiple genres and languages. Owned by Zee Entertainment Enterprises Limited, ZEE5 produces over 260 hours of content daily, with a monthly active user base of more than 119.5 million users across 190 countries. ZEE5’s operations and customer satisfaction depend on its backend infrastructure being robust and scalable enough to handle immense traffic and complex workflows. To future-proof its infrastructure and maintain its competitive edge, the company needed to streamline operations and enhance its database management capabilities. This included migrating its entire OTT platform, including a total of 100+ microservices and 80+ databases, to Google Cloud. Pramod Prakash, Senior Vice President of Engineering at ZEE5, took the stage at MongoDB.local Bangalore in 2024 to share insights into how ZEE5 managed this migration without hindering performance or disrupting its services. “It was a massive project which required a very carefully orchestrated migration plan,” said Prakash.

Massive migration, zero downtime: Challenge accepted

ZEE5’s team embarked on an ambitious journey to migrate a total of 40+ microservices (out of its 100+ microservices) to MongoDB Atlas. These were previously running on the Community Edition of MongoDB and on other NoSQL databases. One of the challenges of this migration was to ensure continuous data flow for the platform’s 119.5 million streaming users. To do so, Prakash and his team created multiple environments using a change data capture tool. This ensured continuous replication of data so the user experience would not be impacted. “We had to build four environments: dev, QA [Quality Assurance], UAT [User Acceptance Testing], and production,” explained Prakash.
“We needed to keep testing and verifying each environment, and then finally enter the production phase when we migrated the data and moved the traffic.” The approach involved migrating production data twice: first for testing and then for the final cutover, to minimize any data loss. ZEE5 used MongoDB Atlas’ integrated tools mongosync and mongomirror, which helped achieve an essential goal: avoiding any downtime. “We migrated this entire mammoth application with zero downtime!” said Prakash. “We have not stopped ZEE5’s operations at all.” “The second important thing is the performance: you want to be 100% sure that the entire scale and peak traffic will work seamlessly within the new cloud environment,” added Prakash. ZEE5 relied on the support of MongoDB Professional Services (PS). The PS team helped architect and plan the entire migration strategy, and accompanied Prakash’s team step by step to ensure there would be no unexpected disruptions. The production environment was built and tested rigorously before the final migration to ensure seamless performance at peak traffic levels. “We iterated until we were 100% sure that the new environment was ready to take up ZEE5’s peak traffic. Functionally, it was all perfect,” said Prakash.

The power of the Atlas platform

According to Prakash, the power of MongoDB Atlas lies in the fact that it offers a fully managed platform. “There is no maintenance overhead at all,” he said. “All upgrades happen automatically without any downtime. We are also leveraging auto-scaling capabilities and point-in-time recovery.” All of this enables efficient handling of varying traffic loads without manual intervention. Additionally, data recovery capabilities are enhanced, and most importantly, the engineering team can prioritize application development rather than operational maintenance.
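The "keep testing and verifying each environment" discipline above can be sketched as a count-and-checksum comparison between source and target. This is a hypothetical illustration, not ZEE5's actual tooling: plain dicts stand in for collections, and the check simply confirms the two sides hold identical documents regardless of order.

```python
# Hypothetical post-migration verification sketch: compare document
# counts and per-document checksums between source and target.
import hashlib
import json

def checksum(doc):
    # Stable hash of a document's canonical JSON form.
    return hashlib.sha256(json.dumps(doc, sort_keys=True).encode()).hexdigest()

def verify_migration(source_docs, target_docs):
    if len(source_docs) != len(target_docs):
        return False
    src = {d["_id"]: checksum(d) for d in source_docs}
    tgt = {d["_id"]: checksum(d) for d in target_docs}
    return src == tgt

source = [{"_id": 1, "user": "a"}, {"_id": 2, "user": "b"}]
target = [{"_id": 2, "user": "b"}, {"_id": 1, "user": "a"}]  # order differs

print(verify_migration(source, target))  # True
```

Running a check like this after each trial migration, and again before the final cutover, is one way to gain the "100% sure" confidence Prakash describes before routing live traffic.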
As of February 2025, MongoDB Atlas supports a total of seven key use cases at ZEE5: payments, subscriptions, plans and coupons, video engineering, Zee Music (users’ preferences and playlists), content metadata, and the platform’s communication engine (SMS and email notifications). Looking ahead, ZEE5 is working on more use cases powered by MongoDB. For example, the company is looking to completely migrate its master data source for content metadata to MongoDB Atlas. ZEE5 is also considering relying on MongoDB Atlas to support and enhance its search and recommendations capabilities. Interested in learning how MongoDB is powering other companies’ applications? Head over to our customer case studies hub to read the latest stories. Visit our product page to learn more about MongoDB Atlas.
Debunking MongoDB Myths: Security, Scale, and Performance
MongoDB has come a long way since its founding in 2007. Many people first encountered MongoDB during its early years and formed opinions about the database based on impressions from 2012 to 2014. However, much has changed since then. Over the past eleven years, MongoDB has made significant strides, foremost among them the launch of MongoDB Atlas in 2016. The company has placed a substantial focus on improving the four critical areas that matter most to businesses and developers alike: security, durability, availability, and performance.

Security: Protecting sensitive data from unauthorized access and ensuring regulatory compliance.

Durability: Ensuring data remains intact and reliable, even during system failures or unexpected disruptions.

Availability: Minimizing downtime and maintaining system operation, no matter what happens.

Performance: Delivering fast, consistent application response times and scaling efficiently to meet growing demand.

These advancements have earned the trust of some of the world’s largest enterprises, including Toyota, Cisco, Wells Fargo, Bosch, and Verizon. Yet despite this progress, outdated myths about MongoDB persist—particularly in these four foundational areas. In this blog, we will tackle those misconceptions head on and set the record straight about MongoDB’s security, durability, availability, and performance. Let’s dive in.

Myth 1: “MongoDB is not as secure as a relational database”

One of the most persistent myths about MongoDB is that it is not secure—certainly not as secure as traditional relational databases. This misconception likely stems from a series of ransomware attacks in the mid-2010s, when hackers exploited unsecured databases that lacked proper authentication and were left exposed on default TCP ports. While these incidents highlighted poor configuration practices, they have unfairly cast a shadow over MongoDB’s contemporary security capabilities.
MongoDB provides robust, intelligent security features designed to protect sensitive data at every stage of its lifecycle. MongoDB encrypts data both in transit and at rest, just like other leading NoSQL and relational databases. However, what sets MongoDB apart is its ability to keep data encrypted while in use. With Queryable Encryption, an industry-first innovation unique to MongoDB, sensitive data can remain encrypted even while it is queried. This eliminates the need to decrypt the data and reduces exposure to threats. MongoDB also supports flexible authentication and authorization that seamlessly integrates with many identity management systems. Features like role-based access control and fine-grained permissions ensure users only have access to what they are authorized for, while intuitive configuration makes these controls easy to implement. Beyond encryption and access control, MongoDB includes powerful auditing tools to monitor database activity and advanced network security features, such as IP allow-listing and private networking. Together, these capabilities provide comprehensive protection against unauthorized access and help organizations meet strict compliance requirements. Best of all, these advanced security features are included by default in both MongoDB Atlas and MongoDB Enterprise Advanced at zero cost. MongoDB’s approach simplifies security management while minimizing expenditure, allowing teams to focus on building applications with confidence that their data is protected.

Myth 2: “MongoDB’s multi-cloud capabilities do not set it apart from other databases”

At first glance, the claim that MongoDB is multi-cloud may not sound special. After all, plenty of databases are available through more than one cloud provider. However, this should not be confused with all of them being truly multi-cloud. True multi-cloud supports ‘cross-cloud’ deployments, i.e.
the ability to deploy individual nodes of a single cluster across multiple cloud providers. This distinction is often obfuscated by vendors unable to run their clusters in such a configuration. Support for multi-cloud clusters in Atlas became generally available in October 2020. MongoDB Atlas enables deployment not only on Amazon Web Services (AWS), Microsoft Azure, or Google Cloud but also across all three clouds simultaneously with a single cluster. Cross-cloud deployments can be set up and configured solely from the Atlas management console; no further configuration is required via the individual cloud providers. This is more than just a convenience; it is a transformative capability that eliminates the boundaries between cloud providers. With MongoDB Atlas, it is as if AWS, Azure, and Google Cloud operate as one unified cloud environment. Why does this matter? For starters, deploying a single database cluster across multiple clouds removes the operational complexity of managing data replication and migration between providers, enabling seamless data mobility. The hardest part of any application to move—the data—now becomes the easiest. Multi-cloud deployment also enables the creation of application architectures that exploit the best services from multiple cloud providers simultaneously. In addition, cross-cloud deployments deliver unmatched resiliency: with cross-cloud failover, in the event of an outage, data can be automatically switched to another cloud provider in the same geographic region, ensuring uninterrupted service. Finally, MongoDB Atlas provides the flexibility to meet regional and cloud provider preferences with ease. Atlas spans 115+ supported regions across all three major cloud providers, making it easy to meet customer demands or comply with local regulations using a single database. MongoDB Atlas gives us the ability to run our database on multiple clouds through the same service.
With Atlas, we have the freedom from lock-in—each client can choose where they are the most comfortable hosting their data.

Gary Hoberman, CEO and Founder, Unqork

Myth 3: “I get that MongoDB is built for horizontal scaling, but it is so painful to scale”

Horizontal scaling, also known as scale-out, is a core strength of MongoDB. It allows workloads to be distributed by adding more nodes as data and applications expand. However, the belief persists that scaling MongoDB is difficult and complex. The reality? MongoDB makes scaling not just possible, but seamless—whether scaling out horizontally or scaling up vertically. With MongoDB Atlas, vertical scaling—or scale-up—is simple. By enabling auto-scaling, MongoDB Atlas dynamically adjusts cluster resources to meet workload demands. Adding more RAM, CPU, or storage capacity can be performed automatically and on-demand, ensuring optimal performance without continual manual intervention or oversight. If you need to move beyond vertical scaling, MongoDB offers three flexible ways to scale horizontally:

Hashed sharding: Data is distributed randomly across nodes using a hashed shard key. This ensures an even distribution of data and workloads to prevent bottlenecks.

Ranged sharding: Data is distributed based on ranges of a specific field, enabling fine-grained control over how data is divided. This approach is especially useful for preventing hotspots in workloads.

Zone sharding: Data is distributed geographically, enabling compliance with data residency requirements and reducing latency by keeping data closer to users.

What happens if the initial sharding strategy does not go as planned? MongoDB addresses this challenge with the ability to refine shard keys and reshard a collection with zero downtime. This ensures data distribution strategies can adapt as needs evolve, all without disrupting applications or users.
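The routing idea behind the three strategies above can be sketched in a few lines of Python. This is illustrative only: the shard names, key ranges, and zone assignments are invented, and it is not MongoDB's actual routing implementation.

```python
# Toy routing sketch for hashed, ranged, and zone sharding.
import hashlib

SHARDS = ["shard0", "shard1", "shard2"]

def hashed_shard(key) -> str:
    # Hashed sharding: a hash of the shard key spreads documents evenly.
    digest = hashlib.md5(str(key).encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

def ranged_shard(order_id: int) -> str:
    # Ranged sharding: contiguous key ranges map to specific shards.
    if order_id < 1000:
        return SHARDS[0]
    return SHARDS[1] if order_id < 2000 else SHARDS[2]

ZONES = {"EU": "shard0", "US": "shard1", "APAC": "shard2"}

def zone_shard(region: str) -> str:
    # Zone sharding: documents land on shards pinned to a geography.
    return ZONES[region]

print(ranged_shard(1500))  # shard1
print(zone_shard("EU"))    # shard0
```

Note the trade-off the toy makes visible: the hashed router scatters adjacent keys (good for even load, bad for range scans), while the ranged router keeps them together (good for range scans, prone to hotspots if inserts cluster at one end).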
Myth 4: “Since MongoDB is built for flexibility, it must not be very performant”

One common misconception about MongoDB is that its flexibility and versatility must come at the expense of performance. After all, can such an agile database—one built for developers to model data however they want—really deliver the speed and efficiency of a performance-first solution? MongoDB is designed to provide both: unmatched flexibility and exceptional performance—all while keeping costs low. MongoDB’s performance stems from its intelligent architecture and powerful features. Ad hoc queries, indexing, and real-time aggregations make it easy to access and analyze data quickly. How fast are queries? Primary key or indexed queries typically execute in milliseconds. Even complex queries that are not indexed remain efficient, with performance depending on factors like collection size and machine specifications. What about workloads like search and analytics? Some developers might assume these would compete for resources and degrade performance on operational tasks. However, MongoDB solves this with workload isolation, which ensures that operational and non-operational workloads are separated, enabling each to run at peak performance without requiring costly and time-consuming extract, transform, and load (ETL) processes. Network latency? For globally distributed applications, MongoDB’s hedged reads enable the nearest replica nodes to be read from rather than waiting for a response from distant nodes. This reduces latency and ensures applications remain highly responsive. MongoDB’s real-world performance is backed by incredible use cases: Amadeus processes 630 million bookings per year. Idealo supports 200,000 queries and 60,000 updates per second. Temenos achieves 150,080 transactions per second. And this was before the release of MongoDB 8.0, the most performant version of the database yet.
MongoDB 8.0 has delivered:

36% faster reads

32% faster reads and updates

56% faster bulk inserts

A stunning 200% improvement for time series queries

MongoDB Atlas doesn’t just solve our performance issues. It makes life easier for web developers, who can build and maintain simpler, more straightforward code.

Moutia Khatiri, CTO, Tech Accelerator, L’Oréal

MongoDB Today

MongoDB has evolved far beyond the myths perpetuated during its early years. MongoDB 8.0 delivers robust capabilities across security, durability, availability, and performance. It encrypts sensitive data throughout its lifecycle and enables seamless cross-cloud deployments. It simplifies horizontal and vertical scaling and powers some of the world’s most demanding applications. These capabilities solidify MongoDB’s position as the database of choice for modern applications. Read about more MongoDB myths and misconceptions in the previous two posts in this series:

Debunking MongoDB Myths: Enterprise Use Case

Busting the Top Myths About MongoDB vs Relational Databases

Don't be held back by outdated misconceptions. Experience the innovation and performance of MongoDB. Start using MongoDB Atlas for free today. Or, to learn more about MongoDB, head over to MongoDB University and take our free Intro to MongoDB course.
Advancing Encryption in MongoDB Atlas
Maintaining a strong security posture and ensuring compliance with regulations and industry standards are core responsibilities of enterprise security teams. However, satisfying these responsibilities is becoming increasingly complex, time-consuming, and high-stakes. The rapid evolution of the threat landscape is a key driver of this challenge. In 2024, the percentage of organizations that experienced a data breach costing $1 million or more jumped from 27% to 36%. 1 This was partly fueled by a 180% surge from 2023 to 2024 in vulnerability exploitation by attackers. 2 Concurrently, regulations are tightening: laws like the Health Insurance Portability and Accountability Act (HIPAA) 3 and the U.S. Securities and Exchange Commission’s cybersecurity regulations 4 have introduced stricter security requirements, raising the bar for compliance. Thousands of enterprises rely on MongoDB Atlas to protect their sensitive data and support compliance efforts. Encryption plays a crucial role on three levels: securing data at rest, in transit, and in use. However, security teams need more than strong encryption alone; flexibility and control are essential to align with an organization’s specific requirements. MongoDB is introducing significant upgrades to MongoDB Atlas encryption to meet these needs, including enhanced customer-managed key (CMK) functionality and support for TLS 1.3. This post explores these improvements, along with the planned deprecation of outdated TLS versions, to strengthen organizations’ security postures.

Why customer-managed keys (CMKs) matter

Customer-managed keys (CMKs) are a security and data governance feature that gives enterprises full control over the encryption keys protecting their data. With CMKs, customers can define and manage their encryption strategy, ensuring they have ultimate authority over access to their sensitive information.
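The control CMKs provide can be illustrated with a toy envelope-encryption sketch: data is encrypted with a data key, the data key is wrapped by the customer's master key, and withholding or destroying the master key makes everything unreadable. XOR is used here purely for illustration; it is not real cryptography, and this is not MongoDB's implementation.

```python
# Toy envelope-encryption sketch of the customer-managed key model.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: a stand-in for a real cipher, illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

customer_master_key = secrets.token_bytes(32)  # held in the customer's KMS
data_key = secrets.token_bytes(32)

ciphertext = xor(b"sensitive record", data_key)
wrapped_data_key = xor(data_key, customer_master_key)  # stored with the data

# Normal operation: unwrap the data key with the master key, then decrypt.
recovered_key = xor(wrapped_data_key, customer_master_key)
plaintext = xor(ciphertext, recovered_key)
print(plaintext)  # b'sensitive record'

# "Kill switch": once the master key is destroyed or access is revoked,
# the wrapped data key can no longer be unwrapped, so the ciphertext
# (and any backups encrypted the same way) is unrecoverable.
customer_master_key = None
```

Because only the small wrapped key depends on the master key, revoking or destroying the master key instantly renders all data encrypted under it inaccessible without re-encrypting anything.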
MongoDB Atlas customer key management provides file-level encryption, similar to transparent data encryption (TDE) in other databases. This customer-managed encryption-at-rest feature works alongside always-on volume-level encryption 5 in MongoDB Atlas. CMKs ensure all database files and backups are encrypted. MongoDB Atlas also integrates with AWS Key Management Service (AWS KMS), Azure Key Vault, and Google Cloud KMS, giving customers the flexibility to manage keys as part of their broader enterprise security strategy. Customers using CMKs retain complete control of their encryption keys. If an organization needs to revoke access to data due to a security concern or any other reason, it can do so immediately by freezing or destroying the encryption keys. This capability acts as a “kill switch,” ensuring sensitive information becomes inaccessible when protection is critical. Similarly, an organization can destroy the keys to render the data and backups permanently unreadable and irretrievable, which may be applicable should it choose to retire a cluster permanently.

Announcing CMK over private networking

As part of a commitment to deliver secure and flexible solutions for enterprise customers, MongoDB is introducing CMKs over private networking. This enhancement enables organizations to manage their encryption keys without exposing their key management service (KMS) to the public internet. Using CMKs in MongoDB Atlas previously required Azure Key Vault and AWS KMS to be accessible via public IP addresses. While functional, this posed challenges for customers who need to keep KMS traffic private, forcing them to either expose their KMS endpoints or manage IP allow lists. By using private networking, customers can now:

Eliminate the need for public IP exposure.

Simplify network management by removing the need to manage allowed IP addresses. This reduces administrative effort and misconfiguration risk.
Align with organizational requirements that mandate the use of private networking.

Customer key management over private networking is now available for Azure Key Vault and AWS KMS. Customers can enable and manage this feature for all their MongoDB Atlas projects through the MongoDB Atlas UI or the MongoDB Atlas Administration API. More enhancements are coming for MongoDB customer key management in 2025, including secretless authentication mechanisms and CMKs for search nodes.

MongoDB Atlas TLS enhancements advance encryption in transit

Securing data in transit is just as vital as securing it at rest with CMKs. To address this, MongoDB Atlas enforces TLS by default, ensuring encrypted communication across all aspects of the platform, including client connections. Now MongoDB is reinforcing its TLS implementation with key enhancements for enterprise-grade security. MongoDB is in the process of rolling out fleetwide support for TLS 1.3 in MongoDB Atlas. The latest version of the cryptographic protocol offers several advantages over its predecessors, including stronger security defaults, faster handshakes, and reduced latency. Concurrently, TLS versions 1.0 and 1.1 are being deprecated because of known weaknesses and their inability to meet modern security standards. By standardizing on TLS 1.2 and 1.3, MongoDB is aligning with industry best practices and ensuring a secure communication environment for all MongoDB Atlas users. Additionally, MongoDB now offers custom cipher suite selection, giving enterprises more control over their cryptographic configurations. This feature lets organizations choose the cipher suites for their TLS connections, ensuring compliance with their security requirements.

Achieving encryption everywhere

This post covers how MongoDB secures data at rest with CMKs and in transit with TLS. However, what about data in use while it’s being processed in a MongoDB Atlas instance?
That’s where Queryable Encryption comes in. This groundbreaking feature enables customers to run expressive queries on encrypted data without ever exposing the plaintext or keys outside the client application. Sensitive data and queries never leave the client unencrypted, so sensitive information remains protected and inaccessible to anyone without the keys, including database administrators and MongoDB itself. MongoDB is committed to providing enterprise-grade security that evolves with the changing threat and regulatory landscapes. With enhanced CMK functionality, TLS 1.3 adoption, and custom cipher suite selection, organizations now have greater control, flexibility, and protection across every stage of the data lifecycle. As security challenges grow more complex, MongoDB continues to innovate to enable enterprises to safeguard their most sensitive data. To learn more about these encryption enhancements and how they can strengthen your security posture, visit MongoDB Data Encryption.

1 PwC, October 2024
2 Verizon Data Breach Investigations Report, 2024
3 U.S. Department of Health and Human Services, December 2024
4 U.S. Securities and Exchange Commission, 2023
5 MongoDB Atlas Security White Paper, Encryption at Rest section, page 12
Hasura: Powerful Access Control on MongoDB Data
Across industries—and especially in highly regulated sectors like healthcare, financial services, and government—MongoDB has been a preferred modern database solution for organizations handling large volumes of sensitive data that require strict compliance adherence. In such enterprises, secure access to data via APIs is critical, particularly when information is distributed across multiple MongoDB databases and external data stores. Yet designing a secure API system from scratch to meet this need takes significant development resources and becomes a burden to maintain and update. Hasura solves this problem by elegantly serving as a federated data layer with robust access control policies built in. It extends and enhances MongoDB's access control capabilities by providing granular permissions at the column and field level across multiple databases through its unified interface. Hasura enforces powerful access control rules across data domains, joins data from multiple sources, and exposes it to the user via a single API. In this blog, we'll explore how Hasura and MongoDB work together to empower teams with granular data access control while simplifying data retrieval across collections.

Team-specific data domains

First, Hasura makes it possible for a business unit or team to own a set of databases and collections, also known as a data domain. Within each domain, a team can connect any number of MongoDB databases and other data sources, allowing the domain to have fine-grained role-based access control (RBAC) and attribute-based access control (ABAC) across all sources. More important, though, is the ability to enable relationships that span domains, effectively connecting data from various teams or business units and exposing it to a verified user as necessary. This granular permissioning system means that the right users can access the right data at the right time, without compromising security.
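To illustrate what such a rule can look like, here is a sketch of a Hasura-style select permission, expressed as the kind of JSON metadata Hasura stores for a role. The role name, fields, and session variable are invented for illustration: the rule gives a hypothetical "branch_agent" role read access to three fields only, filtered so each agent sees just the documents whose branchId matches their session variable (RBAC combined with an attribute-based row filter):

```python
# Sketch of a Hasura-style select permission (illustrative names throughout).
select_permission = {
    "role": "branch_agent",
    "permission": {
        # Field-level access: only these fields are readable by this role.
        "columns": ["customerName", "balance", "branchId"],
        # Attribute-based filter, evaluated per request from the caller's
        # session variables (here, the X-Hasura-Branch-Id header claim).
        "filter": {"branchId": {"_eq": "X-Hasura-Branch-Id"}},
    },
}
```

Because the rule is declarative data rather than imperative code, it can be reviewed, versioned, and reasoned about like any other configuration.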
Field-level access control

Hasura’s MongoDB connector also provides a powerful, declarative way to define access control rules at the collection and field level. For each MongoDB collection, roles may be specified for create, read, update, and delete (CRUD) permissions. Within those permissions, access may be further restricted based on the values of specific attributes. By defining these rules declaratively, Hasura makes it easy to implement and reason about complex access control policies.

Joining across collections

In addition to enabling granular access control, Hasura simplifies the retrieval of related data across multiple databases. By inspecting your MongoDB collections, Hasura can automatically create schemas and API endpoints (in GraphQL, REST, etc.) that let you query data along with its relationships. This eliminates the need to manually stitch together data from different collections in your application code. Instead, a graph of related data can be retrieved in a single API call, while still having that data filtered through your access control rules. As companies wrestle with the challenges of secure data access across sprawling database environments, Hasura provides a compelling solution. By serving as a federated data layer over MongoDB and external data, Hasura enables granular access control through a combination of role-based permissions, attribute-based restrictions, and the ability to join data and apply access rules across sources.

Figure 1. Hasura & MongoDB demo environment

With Hasura’s MongoDB connector, teams can easily implement sophisticated data access policies in a declarative way and provide their applications with secure access to the data they need. This combination of security and simplicity makes Hasura and MongoDB a powerful solution for organizations that strive to modernize, especially those in industries with strict compliance requirements. Visit the MongoDB Resources Hub to learn more about MongoDB Atlas.
Debunking MongoDB Myths: Enterprise Use Cases
MongoDB is frequently viewed as a go-to database for proof-of-concept (POC) applications. The flexibility of MongoDB’s document model enables teams to rapidly prototype, iterate, and adapt the data model as requirements evolve during the early stages of application development. However, moving an application to production requires developers to add validation logic and fully define the data structures. A frequent assumption is that because MongoDB data models can be flexible, they cannot be structured. In reality, while MongoDB does not require a defined schema, it does support them: users can precisely calibrate rules and enforcement levels for every component of data. This enables a level of granular control that traditional databases, with their all-or-nothing approach to schema enforcement, struggle to match. In MongoDB, data model flexibility is not a binary choice between "schemaless" and "strictly enforced"; it exists on a spectrum, and users can incrementally define schemas in parallel with the overall “hardening” of the application. MongoDB's approach to data modeling makes it an ideal platform for business-critical applications. It is designed to support the entire application lifecycle, from nascent concepts and initial prototypes to global rollouts of production environments. Enterprise-grade features like ACID transactions and industry-leading scalability ensure MongoDB can meet the demands of any modern application.

Learning from the past

So why do misconceptions about MongoDB persist? These perceptions originated over a decade ago. Teams working with MongoDB in 2014 or earlier faced real challenges when deploying it in production: applications could slow down under heavy loads, data consistency was not guaranteed when writing to multiple documents, and teams lacked tools to monitor and manage deployments effectively.
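The incremental schema enforcement described above can be sketched with MongoDB's schema validation, shown here as the collMod command a driver would send. The collection and field names are invented for illustration; validationLevel "moderate" applies the rules to inserts and to updates of already-valid documents while leaving legacy documents untouched — one point on the flexibility spectrum:

```python
# Sketch: attaching a $jsonSchema validator to an existing collection.
# Collection and field names are illustrative.
coll_mod = {
    "collMod": "accounts",
    "validator": {
        "$jsonSchema": {
            "bsonType": "object",
            "required": ["accountId", "balance"],
            "properties": {
                "accountId": {"bsonType": "string"},
                "balance": {"bsonType": ["double", "int", "long"]},
            },
        }
    },
    "validationLevel": "moderate",  # or "strict" once the app has hardened
    "validationAction": "error",    # reject invalid writes rather than just warn
}
```

Tightening validationLevel and adding required fields over time is how a team "hardens" the schema alongside the application itself.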
As a result, MongoDB gained a perception of being unsuitable for specific use cases or critical workloads. This perception has persisted despite a decade of subsequent development and innovation, and it is an inaccurate assessment of today’s preeminent document database. MongoDB has evolved into a mature platform that directly addresses these historical pain points. Today’s MongoDB delivers robust tooling, guaranteed consistency, and comprehensive data validation capabilities.

Myth: MongoDB is a niche database

What are the top use cases for MongoDB? This question is difficult to answer because MongoDB is a general-purpose database that can support any use case. The document model is the primary driver of MongoDB’s versatility. Documents are similar to JSON objects, with data represented as key-value pairs. Values can be simple types like strings or numbers, but they can also be arrays or nested objects, which allows documents to easily represent complex hierarchical structures. The document model's flexibility allows data to be stored exactly as the application consumes it. This enables highly efficient writes and optimizes data for retrieval without needing to set up standard or materialized views, although both are supported. While MongoDB is not a niche database, it does have advanced capabilities to support niche requirements. The aggregation pipeline provides a powerful framework for data analytics and transformation. Time-series collections store and query temporal data efficiently to support IoT and financial applications. Geospatial indexes and queries enable location-based applications to perform complex proximity calculations. MongoDB Atlas includes native support for vector search, which enabled Cisco to experiment with generative AI use cases and streamline its applications’ path to production. MongoDB handles the diverse data requirements that power modern applications.
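As a small taste of the aggregation pipeline mentioned above, here is a pipeline written the way a driver would send it. It groups hypothetical transaction documents by category and sums their amounts; the stages ($match, $group, $sort) are standard aggregation operators, while the field names are invented:

```python
# Sketch: a simple analytics pipeline over an imagined "transactions" collection.
pipeline = [
    {"$match": {"status": "completed"}},                       # filter first
    {"$group": {"_id": "$category",                            # group by category
                "total": {"$sum": "$amount"}}},                # sum amounts
    {"$sort": {"total": -1}},                                  # largest first
]
```

In application code this would be passed to the driver's aggregate call on the collection; filtering before grouping lets the server use indexes on the matched fields.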
The document model provides the foundation for general use, while advanced features ensure teams do not need to integrate additional tools as application requirements evolve. The result is a single platform that can grow from prototype to production, handling general requirements and specialized workloads with equal proficiency.

Myth: MongoDB is not suitable for enterprise-grade workloads

A common perception is that MongoDB works well for small applications but falls short at enterprise scale. Ironically, many organizations first consider MongoDB while struggling to scale their relational databases. These organizations discover that MongoDB’s architecture is specifically designed to support scale-out distributed deployments. While MongoDB matches relational databases in vertical scaling capabilities, the document model enables a more natural and intuitive approach to horizontal scaling. Because related data is stored together in a single document, MongoDB can easily distribute complete data units across shards. This contrasts with relational databases, where data is split across multiple tables, making it difficult to place all related data on the same shard. Horizontal scaling with MongoDB sets an organization up for better performance: most MongoDB queries need to access only a single shard, whereas equivalent queries in a relational database often require costly cross-server communication. Telefonica Tech has leveraged horizontal scaling to nearly double its capacity with a 40% hardware reduction. MongoDB Atlas further automates and simplifies these scaling capabilities through a fully managed service built to meet demanding enterprise requirements. Atlas provides a 99.995% uptime guarantee and availability across AWS, Google Cloud, and Azure in over 100 regions worldwide.
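The horizontal-scaling step described above boils down to choosing a shard key. Here is a sketch of the shardCollection command as a driver would send it; the database, collection, and key are invented for illustration, and a hashed key is one common choice for even distribution:

```python
# Sketch: sharding an imagined "bank.customers" collection.
# Because related data lives together in one document, a single shard key
# routes a complete data unit (the whole customer document) to one shard.
shard_cmd = {
    "shardCollection": "bank.customers",
    "key": {"customerId": "hashed"},  # hashed key spreads writes evenly
}
```

Queries that include customerId are then routed to a single shard, which is what avoids the cross-server joins a normalized relational layout would require.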
By offloading the operational complexity of deploying and running databases at scale, Atlas frees teams to focus on rapid development and innovation rather than infrastructure maintenance.

Powering the enterprise applications of today and tomorrow

Over 50,000 customers and 70% of the Fortune 100 rely on MongoDB to power their enterprise applications. Independent industry reports from Gartner and Forrester continue to recognize MongoDB as a leader in the database space. Do not let outdated myths keep your organization from the competitive advantages of MongoDB's enterprise capabilities. To learn more about MongoDB, head over to MongoDB University and take our free Intro to MongoDB course. Read more about customers building on MongoDB. Read our first blog in this series about myths around MongoDB vs relational databases. Check out the full video to learn about the other six myths that we're debunking in this series.
MongoDB & DKatalis’s Bank Jago, Empowering Over 500 Engineers
DKatalis, a technology company specializing in scalable digital solutions, is the engineering arm behind Bank Jago, Indonesia’s first digital bank. An app-only institution, Bank Jago enables end-to-end banking with features such as auto budgeting, which allows Bank Jago’s customers to easily and effectively organize their finances by creating “Pockets” for expenses like food, savings, or entertainment. Launched in 2019, Bank Jago has seen tremendous growth in only a few years, with its customer base reaching 14.1 million as of October 2024. While speaking at MongoDB.local Jakarta, Chris Samuel, Staff Engineer at DKatalis, shared how MongoDB became the data backbone of Bank Jago, and how MongoDB Atlas supported Bank Jago’s growth. Bank Jago’s journey with MongoDB started in 2019, when DKatalis built the first version of Bank Jago using the on-premises version of MongoDB: MongoDB Community Edition. “We did everything ourselves, up to the point when we realized that the bigger our user [base] grew, the more painful it was for us to monitor everything,” said Samuel. In 2021, DKatalis decided to migrate Bank Jago from MongoDB Community Edition to MongoDB Atlas. This first involved migrating all data to Atlas. Then the database platform had to be set up to facilitate scalability and enable improved maintenance operations in the long term. “In terms of process, it is actually seamless,” said Samuel during his MongoDB.local talk. Specifically, MongoDB Atlas offers six key capabilities that have facilitated the bank’s daily operations, supported its fast growth, and improved efficiencies:

- Flexibility: MongoDB's document model supports diverse data types and adapts to Jago's dynamic requirements.
- Scalability: MongoDB Atlas effortlessly supports the rapid growth in user base and data volume.
- High performance: The platform enables fast query execution and efficient data retrieval for a seamless customer experience.
- Real-time capabilities: MongoDB Atlas prevents delays during transactions, account creation, and balance checking.
- Regulatory compliance: With MongoDB Atlas, local hosting is possible, enabling DKatalis to meet Indonesian financial regulatory standards.
- Community support: MongoDB's strong developer community and rich ecosystem in Jakarta foster collaboration and learning.

All of these have also helped improve efficiencies for DKatalis’s team of over 500 engineers, who are now able to reduce data architecture complexity and focus on innovation.

Fostering a great engineering culture and community with MongoDB

In another talk at MongoDB.local Singapore, DKatalis’s Chief Engineering Officer, Alex Titlyanov, explained that using MongoDB has been and continues to be a great learning, upskilling, and operational experience for his team. “DKatalis has a pretty unique organizational culture when it comes to its engineering teams: there are no designated engineering managers or project managers; instead, teams are self-managed,” said Titlyanov. “This encourages a community-driven environment, where engineers are continuously upgrading their skills, particularly with tools like MongoDB.” The company has established internal communities, such as the MongoDB community led by Principal Software Engineer Boon Hian Tek. These communities focus on knowledge sharing, skill-building, and ensuring that the company’s 500 engineers are proficient in using MongoDB. This deep knowledge of MongoDB—and the ease of use offered by the Atlas platform—means that DKatalis’s engineers are also able to build their own bespoke tools to improve daily operations and meet specific needs. For example, the team has built a range of tools aimed at helping deal with the complexity and scale of Bank Jago’s data architecture. “Most traditional banks offer their customers access to six months, sometimes a year’s worth of transaction history.
But Bank Jago gives access to the entire transaction history,” said Boon. The engineering team ended up having to deal with 56 different databases and 485 data collections. Some collections reach 1.13 billion documents, while others receive up to 42.5 million new documents every day. Some of the bespoke tools built on MongoDB Atlas include:

- Index sync report: DKatalis implemented a custom-built tool using MongoDB’s Atlas API to manage database indexing automatically. This was essential given the bank’s real-time requirements; adding indexes manually during peak hours would have disrupted performance.
- Daily reporting: The team built a tool to monitor for slow queries. It provides daily reports on query performance so issues can be identified and resolved quickly.
- Add index: The Rolling Index feature from Atlas was initially used, but the team required greater context for each index. They therefore built a tool that automatically checks at 3:00 am whether there are any indexes to create, calls the Atlas API to create them, and publishes the results.
- Exporting metrics: The Atlas console was a helpful source of diagrams, but the team needed each metric to be available per database and per collection rather than per cluster. The team built a thin layer on top of the Atlas console to slice up the required metrics using the Atlas API.

“The scalability and flexibility of MongoDB have been essential in helping the team handle the bank’s fast growth and complex feature set. MongoDB’s document-oriented structure enables us to develop innovative features like ‘Pockets’, and we continue to see MongoDB as an integral part of our technology stack in the future,” said Titlyanov. Visit our product page to learn more about MongoDB Atlas. To learn how MongoDB powers solutions in the financial services industry, visit our solutions page.
BAIC Group Powers the Internet of Vehicles With MongoDB
The Internet of Vehicles (IoV) is revolutionizing the automotive industry by connecting vehicles to the Internet. Vehicle sensors generate a wealth of data, affording manufacturers, vehicle owners, and traffic departments deep insights. This unlocks new business opportunities and enhances service experiences for both enterprises and consumers. BAIC Research Institute , a subsidiary of Beijing Automotive Group Co. (BAIC Group), is a backbone enterprise of the Chinese auto industry. Headquartered in Beijing, BAIC Group is involved in everything from R&D and manufacturing of vehicles and parts to the automobile service trade, comprehensive traveling services, financing, and investments. BAIC Group is a Fortune Global 500 company with more than 67 billion USD of annual revenue. The Institute is also heavily invested in the IoV industry. It plays a pivotal role in the research and development of the two major independent passenger vehicle products in China: Arcfox and Beijing Automotive . It is also actively involved in building vehicle electronic architecture, intelligent vehicle controls, smart cockpit systems, and smart driving technologies. To harness cutting-edge, data-driven technologies such as cloud computing, the Internet of Things, and big data, the Institute has built a comprehensive IoV cloud platform based on ApsaraDB for MongoDB . The platform collects, processes, and analyzes data generated by over a million vehicles, providing intelligent and personalized services to vehicle owners, automotive companies, and traffic management departments. At MongoDB.local Beijing in September 2024, BAIC Group’s Deputy Chief Engineer Chungang Zuo said that the BAIC IoV cloud platform facilitates data access for over a million vehicles. It also supports online services for hundreds of thousands of vehicles. 
Data technology acts as a key factor for IoV development

With a rapid increase in vehicle ownership in recent years, the volume of data on BAIC Group’s IoV cloud platform quickly surged. This led to several data management challenges, namely the need to handle:

- Large data volumes
- High update frequencies
- Complex data formats
- High data concurrency
- Low query efficiency
- Data security issues

The IoV platform also needed to support automotive manufacturers, who must centrally store and manage a large amount of diverse transactional data. Finally, the platform needed to enable manufacturers to leverage AI and analytical capabilities to interpret and create value from this data. BAIC Group’s IoV cloud platform reached a breaking point because the legacy databases it employed were capable neither of handling the deluge of vehicle data nor of supporting planned AI-driven capabilities. The Institute identified MongoDB as the solution to support its underlying data infrastructure. By using MongoDB, BAIC would gain a robust core to enhance data management efficiency from the business layer to the application layer. The power of MongoDB as a developer data platform offered a wide range of capabilities, which was a game-changer for the Institute.

MongoDB’s document model makes managing complex data simple

Unlike traditional relational database models, MongoDB’s JSON data structure and flexible schema model are well suited to the variety and scale of the ever-changing data produced by connected vehicles. In traditional databases, vehicle information is spread across multiple tables, each with nearly a hundred fields, leading to redundancy, inflexibility, and complexity. With MongoDB, all vehicle information can be stored in a single collection, simplifying data management. Migrating vehicle information to MongoDB has significantly improved the Institute’s data application efficiency.
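To make the single-collection shape concrete, here is an illustrative vehicle document with identity, telemetry, and a GeoJSON location nested in one document instead of spread across many tables, together with the kind of $geoWithin filter used to screen vehicles inside an area. All names, values, and coordinates are invented for illustration:

```python
# Sketch: one vehicle, one document (illustrative fields and values).
vehicle_doc = {
    "vin": "LNBSCCAK0JW000001",
    "model": "Arcfox alpha-S",
    "telemetry": {"batteryPct": 82, "odometerKm": 15234.7},
    # GeoJSON Point: [longitude, latitude]
    "location": {"type": "Point", "coordinates": [116.4074, 39.9042]},
}

# Sketch: filter vehicles inside a polygon (assumes a 2dsphere index on
# "location"). The rectangle roughly outlines an imagined parking area.
parking_query = {
    "location": {
        "$geoWithin": {
            "$geometry": {
                "type": "Polygon",
                "coordinates": [[
                    [116.40, 39.90], [116.42, 39.90],
                    [116.42, 39.92], [116.40, 39.92],
                    [116.40, 39.90],  # ring closes back at its start point
                ]],
            }
        }
    }
}
```

A find with parking_query would return only vehicles whose GeoJSON location falls inside the polygon, which is the range-filtering step that precedes the clustering analysis described below.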
MongoDB’s GeoJSON supports location data management

The ability to accurately calculate vehicle location within the IoV cloud platform is a key benefit offered by MongoDB. In particular, MongoDB’s GeoJSON support and geospatial indexing enable important features, such as the ability to screen vehicle parking situations. Zuo explained that during the data cleaning phase, the Institute formats raw vehicle data for MongoDB storage and outputs it as standardized cleaned data. In the data calculation phase, GeoJSON queries filter vehicles in a specific range, followed by algorithmic clustering analysis of locations to derive vehicle parking information. Finally, the Institute retrieves real-time data from the MongoDB platform to classify and display vehicle parking situations on a map for easy viewing.

MongoDB provides scalability and high performance

MongoDB’s sharded clusters enhance data capacity and processing performance, enabling the Institute to effectively manage exponential IoV data growth. Querying and result-returning processes are executed concurrently in a multi-threaded manner, facilitating continuous horizontal expansion without any downtime as data needs grow. Zuo said that a significant advantage for developers is the high self-healing capability of the sharded cluster; if a primary node fails, MongoDB automatically fails over to a secondary node. This ensures seamless service and process integrity.

Security features meet data regulatory requirements

MongoDB’s built-in security features enable the IoV platform to meet rigorous data protection standards, helping the Institute stay compliant with regulatory requirements and industry standards. With MongoDB, the Institute can ensure end-to-end data encryption throughout the entire data lifecycle, including during transmission, storage, and processing, with support for executing queries directly on encrypted data.
For example, during storage, MongoDB encrypts sensitive data such as vehicle identification numbers and phone numbers. Sharding and replication mechanisms establish a robust data security firewall. Furthermore, MongoDB’s permission control mechanism enables secure database management with decentralized authority. Zuo said that MongoDB’s sharded storage and clustered deployment features ensure the platform’s reliability exceeds its 99.99% service-level agreement. MongoDB’s high-concurrency capabilities enable the Institute to share real-time vehicle status updates with vehicle owners’ apps, enhancing user experience and satisfaction. In addition, MongoDB’s compression technology and flexible cloud server configurations reduce data storage space and resource waste, significantly lowering data storage and application costs.

BAIC uses MongoDB to prepare for future opportunities

Looking ahead, Zuo stated that the BAIC IoV cloud platform has expanding demands for data development and application in three areas: vehicle data centers, application scenario implementation, and AI applications. MongoDB’s capabilities will remain core to helping address the Institute’s upcoming needs and challenges.