MongoDB Blog
Announcements, updates, news, and more
Announcing the MongoDB Plugin for Firebase Genkit
We’re thrilled to introduce the MongoDB Plugin for Genkit, designed to accelerate your AI-powered applications with advanced search and database tooling—all within the Genkit ecosystem. Whether you're building chatbots, intelligent assistants, or recommendation engines, this plugin brings together MongoDB’s cutting-edge search capabilities and Genkit’s AI workflows, enabling seamless vector, full-text, and hybrid search with zero hassle.
Personalized Retail Media Platforms—Powered by MongoDB
Retailers today have access to a wealth of first-party and customer interaction data—an invaluable, unique asset that only they possess. With the rapid evolution of digital commerce and the deprecation of third-party cookies, retail media networks (RMNs) have emerged as a critical channel for monetization and customer engagement. Through the strategic use of shopper data spanning both digital and physical retail channels, retailers can offer brands a unique opportunity to advertise directly to in-market consumers with an unprecedented level of precision and conversion. You can find more information on this in our blog post.
Analyze Query Shapes With MongoDB and Datadog
In July 2025, MongoDB introduced Query Shape Insights for MongoDB Atlas. This feature provides customers with powerful tools to understand query performance trends at a granular level. Almost immediately upon launch, enterprise customers asked an important question: “Can we view these insights directly in Datadog, where we already monitor our applications?”
Innovating with MongoDB | Customer Successes, October 2025
It’s officially fall! The start of every new season is a perfect time to consider change and new beginnings. While fall might make you think about pumpkin spice and newly chilly evenings, I’m thinking about the latest round of transformations that MongoDB’s customers are embracing to thrive in an AI-powered world. In all seriousness, legacy systems and technical debt are huge challenges: the cost of tech debt has been estimated at almost $4 trillion. That’s trillion with a T! Legacy systems can slow down innovation, create bottlenecks, and make it tough to deliver the seamless, real-time experiences customers increasingly expect. But companies are finding that modernizing their applications isn't just about fixing what's broken—modernization enables them to move faster and innovate for end-users. That’s why I'm incredibly excited to share the recent launch of MongoDB’s Application Modernization Platform (AMP). This AI-powered program is designed to help enterprises move beyond outdated infrastructures to embrace a flexible, data-driven future. AMP is a comprehensive approach to modernization that combines smart AI tooling with proven methodologies, enabling businesses to transform their applications from the ground up, moving from legacy monoliths to a more flexible, microservices-based architecture. In this roundup, we're spotlighting customers who understand the strategic importance of modernization. You'll see how Wells Fargo is using MongoDB to power a new credit card platform, how CSX is ensuring business continuity during a critical migration, how Intellect Design is modernizing its wealth management platform, and how Deutsche Telekom is transforming its B2C digital channels. With MongoDB, customers are showing how integral a modern database is to powering the next generation of applications—and succeeding in the AI era.
Wells Fargo
Wells Fargo sought to modernize its mainframe-dependent credit card platform to provide a faster, more seamless customer experience and handle an exponential increase in transaction data. The company's legacy system was costly to manage and lacked the scalability needed for its "Cards 2.0" initiative. To solve this, Wells Fargo built an operational data store (ODS) using MongoDB. This new platform allowed them to adopt reusable APIs, streamline integrations, and move from a monolithic architecture to flexible microservices. The ODS now serves 40% of traffic from external vendors, handling more than 7 million transactions with sub-second service. By leveraging MongoDB, Wells Fargo was able to jumpstart its mainframe modernization and create curated data products to serve real-time, personalized financial services.

CSX
CSX, a major U.S. railroad company, sought to modernize its critical operations platform, RTOP, by migrating it to the cloud. The challenge was to maintain the platform's 24/7 availability with minimal disruption to its mission-critical, near real-time operations during the transition. To solve this, CSX selected MongoDB Atlas on Azure and partnered with MongoDB Professional Services. Leveraging the Cluster-to-Cluster Sync feature, the team was able to facilitate continuous data synchronization and complete the entire migration in just a few hours. The move to MongoDB Atlas has equipped CSX with a more scalable and resilient platform. This modernization effort established a blueprint for migrating other critical applications and helped CSX continue its digital transformation journey toward becoming America’s best-run railroad.

Intellect Design
Intellect Design, a global fintech company, sought to modernize its wealth management platform to overcome legacy system bottlenecks and multihour batch processing delays. The company's rigid relational database architecture limited its ability to scale and innovate.
To solve this, the company partnered with MongoDB, using our AMP methodology and generative AI tools. This transformation reengineered the platform's core components, resulting in an 85% reduction in onboarding workflow times, allowing clients to access critical portfolio insights faster than ever. This initiative is the first step in Intellect Design's long-term vision to integrate its entire application suite into a unified, AI-driven service. By leveraging MongoDB Atlas's flexible schema and powerful native tools, the company is now better positioned to deliver smarter analytics and advanced AI capabilities to its customers. Watch Intellect AI’s MongoDB.local Bengaluru keynote presentation to learn how AMP helped them transform outdated systems into scalable, modern solutions.

Deutsche Telekom
Deutsche Telekom, a leading telecommunications company, sought to modernize its B2C digital channels, which were fragmented by outdated legacy systems. The company needed to create a unified digital experience for its 30 million customers while improving developer productivity. By leveraging MongoDB Atlas as part of its Internal Developer Platform, Deutsche Telekom built a robust data infrastructure to unify customer data and power its new digital services. This approach allowed the company to retire legacy systems and reduce its reliance on physical shops and call centers. The transition to MongoDB Atlas led to a massive surge in digital engagement, with daily customer interactions rising from under 50,000 to approximately 1.5 million. The company's customer data platform now handles up to 15 times the load of legacy systems, supporting large-scale loyalty programs and transforming the customer experience.

Video spotlight: Bendigo Bank
Before you go, watch how Bendigo and Adelaide Bank modernized their core banking technology using MongoDB Atlas and generative AI.
Bendigo and Adelaide Bank reduced the migration time for legacy applications from 80 hours to just five minutes. This innovative approach allowed them to quickly modernize their systems and better serve their 2.5 million customers. Want to get inspired by your peers and discover all the ways we empower businesses to innovate for the future? Visit MongoDB’s Customer Success Stories hub to see why these customers, and so many more, build modern applications with MongoDB.
The 10 Skills I Was Missing as a MongoDB User
When I first started using MongoDB, I didn’t have a plan beyond “install it and hope for the best.” I had read about how flexible it was, and it felt like all the developers swore by it, so I figured I’d give it a shot. I spun it up, built my first application, and got a feature working. But I felt like something was missing. It felt clunky. My queries were longer than I expected, and performance wasn’t great; I had the sense that I was fighting with the database instead of working with it. After a few projects like that, I began to wonder if maybe MongoDB wasn’t for me. Looking back now, I can say the problem wasn’t MongoDB; it was somewhere between the keyboard and the chair. It was me. I was carrying over habits from years of working with relational databases, expecting the same rules to apply. If MongoDB’s Skill Badges had existed when I started, I think my learning curve would have been a lot shorter. I had to learn many lessons the hard way, but these new badges cover the skills I had to piece together slowly. Instead of pretending I nailed it from day one, here’s the honest version of how I learned MongoDB, what tripped me up along the way, and how these Skill Badges would have helped.

Learning to model the MongoDB way
The first thing I got wrong was data modeling. I built my schema like I was still working in SQL: every entity in its own collection, always referencing instead of embedding, and absolutely no data duplication. It felt safe because it was familiar. Then I hit my first complex query. It required data from various collections, and suddenly, I found myself writing a series of queries and stitching them together in my code. It worked, but it was a messy process. When I discovered embedding, it felt like I had found a cheat code. I could put related data together in a single document, query it in one shot, and get better performance. That’s when I made my second mistake. I started embedding everything. At first, it seemed fine.
However, my documents grew huge, updates became slower, and I was duplicating data in ways that created consistency issues. That’s when I learned about patterns like Extended References, and more generally, how to choose between embedding and referencing based on access patterns and update frequency. Later, I ran into more specialized needs, such as pre-computing data, embedding a subset of a large dataset into a parent, and tackling schema versioning. Back then, I learned those patterns by trial and error. Now, they’re covered in badges like Relational to Document Model, Schema Design Patterns, and Advanced Schema Patterns.

Fixing what I thought was “just a slow query”
Even after I got better at modeling, performance issues kept popping up. One collection in particular started slowing down as it grew, and I thought, “I know what to do! I’ll just add some indexes.” I added them everywhere I thought they might help. Nothing improved. It turns out indexes only help if they match your query patterns. The order of fields matters, and whether an index covers your query shapes affects performance. Most importantly, just because you can add an index doesn’t mean you should. The big shift for me was learning to read an explain() plan and see how MongoDB was actually executing my queries. Once I started matching my indexes to my queries, performance went from “ok” to “blazing fast.” Around the same time, I stopped doing all my data transformation in application code. Before, I’d pull in raw data and loop through it to filter, group, and calculate. It was slow, verbose, and easy to break. Learning the aggregation framework completely changed that. I could handle the filtering and grouping right in the database, which made my code cleaner and the queries faster. There was a lot of guesswork in how I created my indexes, but the new Indexing Design Fundamentals badge covers that now.
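To make the idea of matching indexes to queries concrete, here is a minimal sketch following the commonly cited equality-sort-range guideline for compound indexes. The collection and field names are invented for illustration, not taken from the original post:

```python
# Hypothetical "orders" workload (collection and field names are made up):
# filter on status (equality) and total (range), sort by order_date.
query = {"status": "shipped", "total": {"$gt": 100}}
sort_spec = [("order_date", -1)]

# Equality field first, then the sort field, then the range field, so a
# single compound index can satisfy the whole query shape:
index_spec = [("status", 1), ("order_date", -1), ("total", 1)]

# With a driver such as pymongo, you would create the index and confirm
# it is used by reading the explain() plan:
#   db.orders.create_index(index_spec)
#   plan = db.orders.find(query).sort(sort_spec).explain()
#   # an IXSCAN stage (rather than COLLSCAN) means the index was used
```

The point is the ordering of `index_spec`, not the specific fields: an index that mirrors the query shape lets the server walk the index instead of scanning the collection.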
And when it comes to querying and analyzing data, Fundamentals of Data Transformation is there to help you. Had I had those two skills when I first started, I would’ve saved a lot of time wasted on trial and error.

Moving from “it works” to “it works reliably”
Early on, my approach to monitoring was simple: wait for something to break, then figure out why. If performance went down, I’d poke around in logs. If a server stopped responding, I’d turn it off and on again, and hope for the best. It was stressful, and it meant I was always reacting instead of preventing problems. When I learned to use MongoDB’s monitoring tools properly, that changed. I could track latency, replication lag, and memory usage. I set alerts for unusual query patterns. I started seeing small problems before they turned into outages. Performance troubleshooting became more methodical as well. Instead of guessing, I measured: breaking down queries, checking index use, and looking at server metrics side by side. The fixes were faster and more precise. Reliability was the last piece I got serious about. I used to think a working cluster was a reliable cluster. But reliability also means knowing what happens if a node fails, how quickly failover kicks in, and whether your recovery plan actually works in practice. You can now learn those things in the Monitoring Tooling, Performance Tools and Techniques, and Cluster Reliability skill badges. If you are looking at deploying and maintaining MongoDB clusters, these skills will teach you what you need to know to make your deployment more resilient.

Getting curious about what’s next
Once my clusters were stable, I stopped firefighting, and my mindset changed. When you trust your data model, your indexes, your aggregations, and your operations, you get to relax. You can then spend that time on what’s coming next instead of fixing what’s already in production.
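The shift from looping over raw data in application code to grouping in the database can be sketched as a single aggregation pipeline. The collection and field names below are illustrative assumptions, not from the original post:

```python
# Filter, group, and sort in one aggregation pipeline instead of pulling
# raw documents and post-processing them in application code.
pipeline = [
    # keep only purchase events of at least 10 (field names illustrative)
    {"$match": {"type": "purchase", "amount": {"$gte": 10}}},
    # one output document per user, with totals computed server-side
    {"$group": {
        "_id": "$user_id",
        "total_spent": {"$sum": "$amount"},
        "purchases": {"$sum": 1},
    }},
    # biggest spenders first
    {"$sort": {"total_spent": -1}},
]
# With a driver: results = list(db.events.aggregate(pipeline))
```

Because the `$match` stage runs first, the server can use an index on the matched fields before grouping, which is exactly the kind of work that used to happen in slow, verbose application loops.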
For me, that means exploring features I wouldn’t have touched earlier, like Atlas Search, gen AI, and Vector Search. Now that the fundamentals are solid, I can experiment without risking stability and bring in new capabilities when a project actually calls for them.

What I’d tell my past self
If I could go back to when I first installed MongoDB, I’d keep it simple:

Focus on data modeling first. A good foundation will save you from most of the problems I ran into.
Once you have that, learn indexing and aggregation pipelines. They will make your life much easier when querying.
Start monitoring from day one. It will save you a lot of trouble in the long run.
Take a moment to educate yourself. You can only learn so much from trial and error. MongoDB offers a myriad of resources and ways to upskill yourself.

Once you have established that base, you can explore more advanced topics and uncover the full potential of MongoDB. Features like Vector Search, full-text search with Atlas Search, or advanced schema design patterns are much easier to adopt when you trust your data model and have confidence in your operational setup. MongoDB Skill Badges cover all of these areas and more. They are short, practical, and focused on solving real problems you will face as a developer or DBA, and most of them can be taken over your lunch break. You can browse the full catalog at learn.mongodb.com/skills and pick the one that matches the challenge you are facing today. Keep going from there, and you might be surprised how much more you can get out of the database once you have the right skills in place.
Smarter AI Search, Powered by MongoDB Atlas and Pureinsights
We’re excited to announce that the integration of MongoDB Atlas with the Pureinsights Discovery Platform is now generally available—bringing to life a reimagined search experience powered by keyword, vector, and gen AI. What if your search box didn’t just find results, but instead understood intent? That’s exactly what this integration delivers!

Beyond search: From matching to meaning
Developers rely on MongoDB’s expansive knowledge ecosystem to find answers fast. But even with a rich library of technical blogs, forum threads, and documentation, traditional keyword search often falls short—especially when queries are nuanced, multilingual, or context-driven. That’s where the MongoDB-Pureinsights solution shines. Built on MongoDB Atlas and orchestrated by the Pureinsights Discovery Platform, this intelligent search experience starts with the fundamentals: fast, accurate keyword results, powered by MongoDB Atlas Search. But as queries grow more ambiguous—say, “tutorials for AI”—the platform steps up. MongoDB Atlas Vector Search with Voyage AI (now part of MongoDB), available as an embedding and reranking option, goes beyond literal keywords to interpret intent—helping applications deliver smarter, more relevant results. The outcome: smarter, semantically aware responses that feel intuitive and accurate—because they are. What’s more, with generative answers enabled, the platform synthesizes information across MongoDB’s ecosystem (blog content, forums, and technical docs) to deliver clear, contextual answers using state-of-the-art language models. But it's not just pointing you to the right page. Instead, the platform is providing the right answer, with citations, ready to use. It’s like embedding a domain-trained AI assistant directly into your search bar.
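To give a flavor of the vector side of such an experience, here is a rough sketch of an Atlas Vector Search stage in an aggregation pipeline. The index name, field path, and vector values are placeholders; in a real application the query vector would come from an embedding model such as Voyage AI's:

```python
# A $vectorSearch stage as used in a MongoDB Atlas aggregation pipeline.
# "vector_index", "embedding", and the vector itself are placeholders.
query_vector = [0.12, -0.03, 0.87]  # real embeddings have many more dimensions

pipeline = [
    {"$vectorSearch": {
        "index": "vector_index",   # name of the Atlas Vector Search index
        "path": "embedding",       # document field holding the embeddings
        "queryVector": query_vector,
        "numCandidates": 100,      # candidates scanned before final ranking
        "limit": 10,               # number of results returned
    }},
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]
# With a driver: results = db.articles.aggregate(pipeline)
```

A hybrid experience like the one described above would typically combine the results of a stage like this with a keyword query served by Atlas Search, with the orchestration layer blending and reranking the two result sets.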
“As organizations look to move beyond traditional keyword search, they need solutions that combine speed, relevance, and contextual understanding,” said Haim Ribbi, Vice President, Global CSI, VAR & Tech Partner at MongoDB. “MongoDB Atlas provides the foundation for smarter discovery, and this collaboration with Pureinsights shows how easily teams can deliver gen AI-powered search experiences using their existing content.”

Built for users everywhere
But intelligence alone doesn’t make it transformational. What sets this experience apart is its adaptability. Whether you’re a developer troubleshooting in Berlin or a product owner building in São Paulo, the platform tailors responses to your preferences. Prefer concise summaries or deep technical dives? Want to translate answers in real time? Need responses that reflect your role and context? You’re in control. From tone and length to language and specificity, this is search that truly understands you—literally and figuratively.

Built on MongoDB. Elevated by Voyage AI. Delivered by Pureinsights.
At the core of this solution is MongoDB Atlas, which unifies fast, scalable data access to structured content through Atlas Search and Atlas Vector Search. Looking ahead, by integrating with Voyage AI’s industry-leading embedding models, MongoDB Atlas aims to make semantic search and retrieval-augmented generation (RAG) applications even more accurate and reliable. While currently in private preview, this enhancement signals a promising future for developers building intelligent, AI-powered experiences. Pureinsights handles the orchestration layer. Their Discovery Platform ingests and enriches content, blends keyword, vector, and generative search into a seamless UI, and integrates with large language models like GPT-4. The platform supports multilingual capabilities, easy deployment, and enterprise-grade scalability out of the box.
While generative answers are powered by integrated large language models (LLMs) and may vary by deployment, the solution is enterprise-ready, cloud-native, and built to scale.

Bringing intelligent discovery to your own data
Watch the demo video to see AI-powered search in action across 4,000+ pages of MongoDB content—from community forums and blog posts to technical documentation. While the demo features MongoDB’s content, the solution is built to adapt. You can bring the same AI-powered experience to your internal knowledge base, customer support portal, or developer hub—no need to build from scratch. Visit our partner page to learn more about MongoDB and Pureinsights and how we’re helping enterprises build smarter, AI-powered search experiences. Apply for a free gen AI demo using your enterprise content.
Charting a New Course for SaaS Security: Why MongoDB Helped Build the SSCF
The way companies everywhere work is powered by SaaS. From collaboration tools to critical infrastructure, organizations rely on SaaS applications to drive their business forward. But this widespread adoption has created a significant security blind spot. How can you ensure every one of these applications is configured securely when they all offer different settings, capabilities, and levels of visibility? This inconsistency creates friction, wastes resources, and ultimately, exposes businesses to unnecessary risk. At MongoDB, we believe that securing the SaaS ecosystem is a shared responsibility. That's why we were proud to collaborate with the Cloud Security Alliance (CSA) and industry leaders like GuidePoint Security to develop a new standard—the SaaS Security Capability Framework (SSCF).

The problem: A gap in cloud security
For years, the majority of security assessments have focused on the SaaS provider's organizational security, often through frameworks like SOC 2 or ISO 27001. While essential, these frameworks don't always address a critical question: what security capabilities are available to the SaaS customer within the application? This gap means that security teams face a chaotic landscape. Every new SaaS app brings a different set of configurable controls for logging, identity management, and data access. This makes it nearly impossible to implement and track consistent security policies at scale, leading to a burdensome assessment process for everyone involved.

The solution: A common framework for SaaS security
The SSCF was created to solve this problem by establishing a clear, technical set of customer-facing security controls that SaaS vendors should provide. The framework is designed to empower customers by ensuring they have the tools they need to operate applications securely at scale on their side of the Shared Security Responsibility Model (SSRM).
The framework helps with many use cases, but three key audiences stand out:

For risk management teams: The SSCF provides a clear baseline to use during vendor assessments, simplifying procurement.
For SaaS security teams: It offers a checklist for implementing the security features enterprises expect, streamlining the security program.
For SaaS vendors: The SSCF standardizes assessment responses, reducing the overhead of custom questionnaires and helping vendors meet customer requirements.

The SSCF focuses on six critical domains, aligned with CSA’s Cloud Control Matrix, providing specific and actionable controls for each:

Change Control and Configuration Management (CCC): Ensuring you can programmatically query and get documentation on all security configurations.
Data Security and Privacy Lifecycle Management (DSP): Giving customers control over features like disabling file uploads to prevent malicious code.
Identity and Access Management (IAM): Providing robust, modern controls for user access, including SSO enforcement, non-human identity (NHI) governance, and a dedicated read-only security auditor role.
Interoperability and Portability (IPY): Giving administrators control over mass data exports and visibility into application integrations.
Logging and Monitoring (LOG): Defining a clear set of comprehensive requirements for machine-readable logs with mandatory fields for effective threat detection and forensics.
Security Incident Management (SEF): Requiring a simple, effective way for vendors to notify a designated customer security contact during an incident.

MongoDB's commitment to a more secure ecosystem
Our involvement in creating the SSCF stems from our deep commitment to the security of our customers' data and the broader developer community. We believe that robust security shouldn't be an afterthought; it must be built in and easy to consume.
The principles outlined in the SSCF—like strong identity controls and comprehensive logging—are philosophies we already built into our own data platform. Strong security capabilities allow our customers to build and innovate faster and more securely, knowing they have a reliable foundation. And personally, as a co-chair of the CSA SSCF, I’ve seen great excitement and engagement on the part of our working group—which helped me realize how many companies are affected by this lack of consistency. The SSCF is a vital step toward creating a more trusted, efficient, and secure global SaaS ecosystem. We are thrilled to have been a part of this foundational work and will continue to champion this standard that empowers developers and security teams alike. Visit our security page to learn more about how MongoDB helps protect your data.
From Niche NoSQL to Enterprise Powerhouse: The Story of MongoDB's Evolution
I joined MongoDB two years ago through the acquisition of Grainite, a database startup I co-founded. My journey here is built on a long career in databases, including many years at Google, where I was most recently responsible for the company’s entire suite of native databases—Bigtable, Spanner, Datastore, and Firestore—powering both Google's own products and Google Cloud customers. My passion has always been large-scale distributed systems, and I find that the database space offers the most exciting and complex challenges to solve. At MongoDB my focus is on architectural improvements across the product stack. I've been impressed with the progression of MongoDB's capabilities and the team's continuous innovation ethos. In this blog post, I’ll share some of my understanding of MongoDB’s history and how MongoDB became the de facto standard for document databases. I’ll also highlight select innovations we are actively exploring.

The dawn of NoSQL
During the "move fast and break things" era of Web 2.0, the digital landscape was exploding. Developers were building dynamic, data-rich applications at an unprecedented pace, and the rigid, tabular structures of legacy relational databases like Oracle and Microsoft SQL Server quickly became a bottleneck. A new approach was needed, one that prioritized developer productivity, flexibility, and massive scale. At the same time, JSON's popularity as a flexible, cross-language format for communicating between browsers and backends was surging. This collective shift toward flexibility gave rise to NoSQL databases, and MongoDB, with its native document-based approach, was at the forefront of the movement. In the early days, there was a perception that MongoDB was great for use cases like social media feeds or product catalogs, but not for enterprise applications where data integrity is non-negotiable—like financial transactions. This view was never perfectly accurate, and it certainly isn't today. So, what created this perception?
It came down to two main factors: categorization and maturity. First, most early NoSQL databases were built on an “eventually consistent” model, prioritizing Availability and Partition Tolerance (AP) under the CAP theorem. MongoDB was an exception, designed to prioritize Consistency and Partition Tolerance (CP). But, in a market dominated by AP systems, MongoDB was often lumped in with the rest, leading to the imprecise label of having “light consistency.” Second, all new databases take time to mature for mission-critical workloads. Any established system-of-record database today has gone through many versions over many years to earn that trust. After more than 15 years of focused engineering, today MongoDB has the required codebase maturity, features, and proven track record for the most demanding enterprise applications. The results speak for themselves. As our CEO Dev Ittycheria mentioned during the Q2 2026 earnings call, over 70% of the Fortune 500—as well as 7 of the 10 largest banks, 14 of the 15 largest healthcare companies, and 9 of the 10 largest manufacturers globally—are MongoDB customers. This widespread adoption by the world's most sophisticated organizations is a testament to a multi-year, deliberate engineering journey.

MongoDB’s engineering journey: Building a foundation of trust
MongoDB’s evolution from being perceived as a niche database to an enterprise powerhouse wasn't an accident; it was the result of a relentless focus on addressing the core requirements of enterprise-grade systems. Improvements instrumental to this transformation include:

High availability with replica sets: The first step was eliminating single points of failure. Replica sets were introduced as self-healing clusters that provide automatic failover, ensuring constant uptime and data redundancy.
Later, the introduction of a Raft-style consensus protocol provided even more reliable and faster failover and leader elections, especially in the event of a network partition. This architecture is the foundation for MongoDB’s current multi-region or run-anywhere deployments, and even allows a single replica set to span multiple cloud providers for maximum resilience.

Figure 1. Horizontal scaling.

Massive scalability with horizontal sharding: Introduced at the same time as replica sets, sharding is a native, foundational part of MongoDB. MongoDB built sharding to allow data to be partitioned across multiple servers, enabling virtually limitless horizontal scaling to support massive datasets and high-throughput operations. Advanced features like zone sharding further empower global applications by pinning data to specific geographic locations to reduce latency and comply with data residency laws like GDPR.

Tunable consistency: Recognizing that not all data is created equal, MongoDB empowered developers with tunable read and write concerns. Within a single application, some data—like a 'page view count'—might not have the same consistency requirements as an 'order checkout value'. Instead of using separate, specialized databases for each use case, developers can use MongoDB for both. This moved the platform beyond a one-size-fits-all model, allowing teams to choose the precise level of consistency their application required per operation—from "fire and forget" for speed to fully acknowledged writes across a majority of replicas for guaranteed durability. This flexibility provides the best price/performance tradeoffs for modern applications.

The game-changer, multi-document ACID transactions: From its inception, MongoDB has always provided atomic operations for single documents. The game-changing moment was the introduction of multi-document ACID transactions in 2018 with MongoDB 4.0, which was arguably the single most important development in its history.
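A multi-document transaction can be sketched roughly like this with a driver such as pymongo; the account documents, names, and amounts are invented for illustration, and transactions require a replica set deployment:

```python
# Debit one account and credit another inside a single session so both
# updates commit or abort together. Document shapes are illustrative,
# e.g. {"_id": "alice", "balance": 100}.

def transfer(accounts, session, src, dst, amount):
    """Move `amount` from account `src` to account `dst` atomically."""
    accounts.update_one({"_id": src}, {"$inc": {"balance": -amount}},
                        session=session)
    accounts.update_one({"_id": dst}, {"$inc": {"balance": amount}},
                        session=session)

# With pymongo, with_transaction handles commit, abort, and retry:
#   with client.start_session() as s:
#       s.with_transaction(
#           lambda s: transfer(db.accounts, s, "alice", "bob", 25))
```

If any update inside the callback fails, the whole transaction aborts and neither balance changes, which is exactly the guarantee that made transactional workloads viable on MongoDB.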
This feature, later extended to include sharded clusters, meant that complex operations involving multiple documents—like a financial transfer between two accounts—could be executed with the same atomicity, consistency, isolation, and durability (ACID) guarantees as a traditional relational database. This milestone shattered the biggest barrier to adoption for transactional applications. And the recently released MongoDB 8.2 is the most feature-rich and performant version of MongoDB yet.

Strict security and compliance: To meet the stringent security demands of the enterprise, MongoDB layered in a suite of advanced security controls. Features like Role-Based Access Control (RBAC), detailed auditing, and Field-Level Encryption were just the beginning. The release of Queryable Encryption (which recently gained support for prefix, suffix, and substring queries) marked a revolutionary breakthrough, allowing non-deterministically encrypted data to be queried without ever decrypting it on the server, ensuring data remains confidential even from the database administrator. To provide independent validation, MongoDB Atlas has achieved a number of internationally recognized security certifications and attestations, including ISO/IEC 27001, SOC 2 Type II, PCI DSS, and HIPAA compliance, demonstrating a commitment to meeting the rigorous standards of the world's most regulated industries.

Figure 2. Queryable Encryption.

The ultimate proof of enterprise readiness lies in real-world adoption. Today, MongoDB is trusted by leading organizations across the most demanding sectors to run their core business systems. For example, Citizens Bank, one of the oldest and largest financial institutions in the United States, moved to modernize its fraud detection capabilities from a slow, batch-oriented legacy system. They built a new, comprehensive fraud management platform on MongoDB Atlas that allows for near real-time monitoring of transactions.
This use case in a highly regulated industry requires high availability, low latency, and strong consistency to analyze transactions in real time and prevent financial loss—a direct refutation of the old "eventual consistency" criticism.

Another example is Bosch Digital, the software and systems house for the Bosch Group. Bosch Digital uses MongoDB for its IoT platform, Bosch IoT Insights, to manage and analyze massive volumes of data from connected devices—from power tools used in aircraft manufacturing to sensors in vehicles. IoT data arrives at high speed, in huge volumes, and in variable structures. This mission-critical use case demonstrates MongoDB's ability to handle the demands of industrial-scale IoT, providing the real-time analytics needed to ensure quality, prevent errors, and drive innovation.

Then there’s Coinbase, which relies on MongoDB to seamlessly handle the volatile and unpredictable cryptocurrency market. Specifically, Coinbase architected a MongoDB Atlas solution that would accelerate scaling for large clusters. The result was a more seamless experience for Coinbase end-users. Previously, traffic spikes could impact some parts of the Coinbase app; now, users don’t even notice changes happening behind the scenes.

These are just a few examples; customers across all verticals, industries, and sizes depend on MongoDB for their most demanding production use cases. A common theme is that real-world data is messy, variable, and doesn't fit neatly into rigid, tabular structures. The old adage says that if all you have is a hammer, everything looks like a nail. For decades, developers only had the relational "hammer." With MongoDB, they now have a modern tool that adapts to how developers work and the data they need to manage and process.

The road ahead: Continuous innovation

MongoDB is not resting on its laurels.
The team is as excited about what the future holds as they were when MongoDB was first launched, and we continue to innovate aggressively to meet—and anticipate—the modern enterprise’s demands. Here are select improvements we are actively exploring.

A critical need we hear from customers is how to support elastic workloads in a price-performant way. To address this, over the past two years we’ve rolled out Search Nodes, a capability unique to MongoDB that allows search and vector workloads to scale independently from the database, improving availability and price-performance. We are now working closely with our most sophisticated customers to explore how to deliver similar capabilities across more of MongoDB. Our vision is to enable customers to scale compute for high-throughput queries without over-provisioning storage, and vice versa. We can do all this while building upon what is already one of the strongest security postures of any cloud database, as we continue to raise the bar for durability, availability, and performance.

Another challenge facing large enterprises is the significant cost and risk associated with modernizing legacy applications. To solve this, we are making a major strategic investment in enterprise application modernization, and recently announced the MongoDB Application Modernization Platform. We have been engaged with several large enterprises in migrating their legacy relational database applications—code, data, and everything in between—over to MongoDB. This is not a traditional, manual migration effort capped by the number of bodies assigned. Instead, we are systematically developing agentic tooling and AI-based frameworks, techniques, and processes that allow us to smartly migrate legacy applications into modern microservices-based architectures at scale.
One of the more exciting findings from a recent effort, working with a large enterprise in the insurance sector, was that optimized queries on MongoDB ran just as fast as, and often significantly faster than, their legacy relational database, even when schemas were translated 1:1 between relational tables and MongoDB collections, and lots of nested queries and joins were involved. Batch jobs implemented as complex stored procedures that took several hours to execute on the relational database could be completed in under five minutes, thanks to the parallelism MongoDB natively enables (for more, see the MongoDB Developer Blog).

Based on the incredible performance gains seen in these modernization projects, we're addressing another common need: ensuring fast queries even when data models aren't perfectly optimized. We are actively exploring improvements to our Query Optimizer that will improve lookup and join performance. While the document model will always be the most performant way to model your data, we are ensuring that even when you don't create the ideal denormalized data model, MongoDB will deliver performance on par with or better than the alternatives.

Finally, developers today are often burdened with stitching together multiple services to build modern, AI-powered applications. To simplify this, the platform is expanding far beyond a traditional database, focusing on providing a unified developer experience. This includes a richer ecosystem with integrated capabilities like Atlas Search for full-text search, Atlas Vector Search for AI-powered semantic search, and native Stream Processing to handle real-time data. We are already working on our first integrations, and continue to explore how embedding generation as a service within MongoDB Atlas, powered by our own Voyage AI models, can further simplify application development.
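To make the lookup and join discussion above concrete, here is a sketch of the kind of aggregation pipeline MongoDB runs for a relational-style join: the $lookup stage. The stage shapes follow MongoDB's aggregation syntax, but the collection and field names (customers, orders, region, amount) are hypothetical:

```python
# A relational-style join expressed as a MongoDB aggregation pipeline
# (built here as a plain Python structure for illustration). $lookup
# pulls matching orders into each customer document; the surrounding
# stages filter and aggregate, much like WHERE and GROUP BY in SQL.
join_pipeline = [
    {"$match": {"region": "EMEA"}},        # filter early (WHERE)
    {"$lookup": {                          # customers JOIN orders
        "from": "orders",
        "localField": "_id",
        "foreignField": "customer_id",
        "as": "orders",
    }},
    {"$unwind": "$orders"},                # one document per joined order
    {"$group": {                           # aggregate (GROUP BY)
        "_id": "$_id",
        "total": {"$sum": "$orders.amount"},
    }},
]
```

Pipelines of exactly this shape are what the optimizer work described above targets: when the data model isn't ideally denormalized, the server's join strategy for $lookup dominates query cost.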
From niche to necessity

MongoDB began its journey as a (seemingly) niche NoSQL database with perceptions and tradeoffs that made it unsuitable for many core business applications. But, through a sustained and deliberate engineering effort, it has delivered the high availability, tunable consistency, ACID transactions, and robust security that enterprises demand. The perceptions of the past no longer match the reality of the present. When 7 of the 10 largest banks are already using MongoDB, isn’t it time to re-evaluate MongoDB for your most critical applications? For more on why innovation requires a modern, AI-ready database—and why companies like Nationwide, Wells Fargo, and The Knot Worldwide chose MongoDB over relational databases—see the MongoDB customer use case site.
MongoDB SQL Interface: Now Available for Enterprise Advanced
Today, we’re excited to announce the general availability of MongoDB SQL Interface for MongoDB Enterprise Advanced. This builds upon the foundation established by the MongoDB Atlas SQL Interface, extending SQL connectivity to self-managed MongoDB deployments. Teams can now query their MongoDB data directly from familiar BI tools like Tableau and Microsoft’s Power BI using standard ODBC and Java Database Connectivity (JDBC) connections, eliminating the need to learn the MongoDB Query Language (MQL), build extract, transform, and load (ETL) pipelines, or move data.

Bridging the SQL-MongoDB gap

Organizations new to MongoDB often face a data access challenge: while developers benefit from increased flexibility and performance, teams moving from SQL-based tools often struggle to access the data they need. Without direct SQL connectivity, they must either learn MongoDB’s query language or build and maintain custom ETL pipelines to move data out of MongoDB for reporting and analytics. This creates fragmented operational reporting workflows, with users switching between multiple tools and data sources to piece together the insights they need. These approaches often lead to increased maintenance overhead, outdated data, and dependency bottlenecks.

MongoDB SQL Interface eliminates this friction by providing direct SQL access to MongoDB data through custom connectors and drivers. It works by generating comprehensive JSON schemas of MongoDB collections and translating standard SQL queries into MongoDB operations in real time. Users can connect from popular BI tools like Tableau and Power BI, or through JDBC and ODBC drivers for other SQL-based tools. They can use familiar SQL syntax, including joins, aggregations, and subqueries, through MongoSQL, a SQL-92-compatible dialect designed specifically for MongoDB. This speeds up analysis and enables self-service reporting while maintaining database performance.
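As a sketch of what this translation enables, the snippet below pairs a SQL-92-style query of the kind MongoSQL accepts with a hand-written MongoDB aggregation of equivalent meaning. The collection and field names are hypothetical, and the interface generates its own translation internally; this only illustrates the two sides of the bridge:

```python
# A SQL-92 style query of the kind the SQL Interface accepts. Note the
# dotted path address.city: nested document fields surface as columns.
sql = """
SELECT name, address.city, total_spend
FROM customers
WHERE total_spend > 1000
ORDER BY total_spend DESC
"""

# A hand-written aggregation with the same meaning (illustrative only;
# the interface produces its own translation under the hood).
equivalent_pipeline = [
    {"$match": {"total_spend": {"$gt": 1000}}},                 # WHERE
    {"$sort": {"total_spend": -1}},                             # ORDER BY
    {"$project": {"name": 1, "address.city": 1, "total_spend": 1}},  # SELECT
]
```

The BI tool only ever sees the SQL side; the translation into native MongoDB operations happens in real time, which is why no ETL pipeline or data movement is needed.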
Getting started

MongoDB SQL Interface is now included with Enterprise Advanced licenses and works with MongoDB 6.0 or higher, requiring no changes to your existing MongoDB server configuration. The setup process involves three main steps:

1. Download the MongoDB SQL Schema Builder CLI from the download center.
2. Use the command line interface (CLI) to analyze your data structure and generate schemas that map your collections’ document structures to SQL-queryable formats.
3. Connect your BI tools using MongoDB’s custom connectors for Tableau and Power BI, or JDBC and ODBC drivers for other SQL-based tools.

The Schema Builder CLI examines your existing collections to understand document patterns, nested objects, and array structures. It then creates JSON Schema definitions that preserve the full richness of your document model while making complex nested structures and arrays queryable through familiar SQL syntax. This schema-first approach ensures optimal query performance and maintains data type accuracy across your SQL operations. Once the Schema Builder CLI generates your schemas, it stores them alongside your data. SQL Interface then automatically uses them to validate queries and provide proper type information for results. This creates a seamless bridge between MongoDB’s flexible document model and SQL’s structured query expectations.

Moving forward from MongoDB BI Connector

For organizations currently using MongoDB BI Connector, MongoDB SQL Interface represents a significant improvement to our SQL connectivity solution. The interface addresses several limitations of the BI Connector approach, including improved query performance through native MongoDB operations and enhanced schema flexibility that better represents document structures. While support for BI Connector will continue until September 2026, MongoDB SQL Interface offers improved performance, enhanced schema control, and a more intuitive setup process.
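To picture what the Schema Builder's schema generation step produces, here is a hand-written sketch (not actual CLI output) of a JSON-schema-style mapping for a hypothetical customers collection, using MongoDB's bsonType vocabulary. Nested objects like address are what the interface can expose as dotted SQL columns:

```python
# Hypothetical, hand-written sketch of an inferred schema for a
# "customers" collection. Nested objects and arrays are preserved with
# their types, which is what lets SQL clients address a field like
# address.city as a stable, typed column.
inferred_schema = {
    "bsonType": "object",
    "properties": {
        "name": {"bsonType": "string"},
        "total_spend": {"bsonType": "double"},
        "address": {
            "bsonType": "object",
            "properties": {
                "city": {"bsonType": "string"},
                "zip": {"bsonType": "string"},
            },
        },
        "tags": {"bsonType": "array", "items": {"bsonType": "string"}},
    },
}
```

Because types are recorded per field, the interface can validate incoming SQL queries and return properly typed results, rather than guessing types per query.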
Ready to get started with MongoDB SQL Interface for Enterprise Advanced?

- Documentation: the complete implementation guide, with configuration options and best practices.
- Download center: get the MongoDB SQL Schema Builder CLI and drivers for your deployment.
- README: a quick reference for installation and usage.
- Demo video: see MongoDB SQL Interface in action with a step-by-step walkthrough.