MongoDB Blog

Announcements, updates, news, and more

Simplify AI-Driven Data Connectivity With MongoDB and MCP Toolbox

The wave of generative AI applications is revolutionizing how businesses interact with and derive value from their data. Organizations need solutions that simplify these interactions and ensure compatibility with an expanding ecosystem of databases. Enter MCP Toolbox for Databases, an open-source Model Context Protocol (MCP) server that enables seamless integration between gen AI agents and enterprise data sources using a standardized protocol pioneered by Anthropic. With the built-in capability to query multiple data sources simultaneously and unify results, MCP Toolbox eliminates fragmented integration challenges, empowering businesses to unlock the full potential of their data.

With MongoDB Atlas now joining the ecosystem of databases supported by MCP Toolbox, enterprises using MongoDB’s industry-leading cloud-native database platform can benefit from streamlined connections to their gen AI systems. As businesses adopt gen AI to unlock insights and automate workflows, the choice of database is critical to meeting demands for dynamic data structures, scalability, and high-performance applications. MongoDB Atlas, with its fully managed, document-oriented NoSQL design and flexible schema modeling, is the ultimate companion to MCP Toolbox for applications requiring unstructured or semistructured data connectivity. This blog post explores how MongoDB Atlas integrates with MCP Toolbox, its advantages for developers, and the key use cases for enabling AI-driven data solutions in enterprise environments.

Figure 1. MongoDB as a source for MCP Toolbox for Databases.

How it works

The integration of MongoDB Atlas with MCP Toolbox enables users to perform create, read, update, and delete (CRUD) operations on MongoDB data sources using the standardized MCP. Beyond fundamental data management tasks, this integration also unlocks capabilities from MongoDB’s aggregation framework, enabling users to seamlessly execute complex data transformations, computations, and analyses. This empowers businesses not only to access and modify their data but also to uncover valuable insights by harnessing MongoDB’s powerful query functionality within workflows driven by MCP Toolbox. By combining the scalability and flexibility of MongoDB Atlas with MCP Toolbox’s ability to query across multiple data sources, organizations can develop advanced AI-driven applications, enhance operational efficiency, and uncover deeper analytical opportunities.

Using MongoDB as both a source and a sink within MCP Toolbox is simple and highly versatile, thanks to the flexibility of the configuration file. To configure MongoDB as a data source, define it under the sources section, specifying parameters such as its kind ("mongodb") and the connection’s Uniform Resource Identifier (URI) to establish access to your MongoDB instance.

```yaml
sources:
  my-mongodb:
    kind: mongodb
    uri: "mongodb+srv://username:password@host.mongodb.net"
```

In the tools section, various operations—such as retrieving, updating, inserting, or deleting data—can be defined by linking the appropriate source, specifying the target database and collection, and configuring parameters such as filters, projections, sorting, or payload structures. Additionally, databases can act as sinks for storing data by enabling operations that write new records or modify existing ones, making them ideal for workflows where applications or systems need to interact dynamically with persistent storage.
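The aggregation framework capabilities mentioned earlier map directly onto MongoDB's native pipeline syntax. As a rough, illustrative sketch of the kind of operation an aggregation-oriented tool could expose (the database, collection, and field names below are hypothetical and not taken from the Toolbox documentation):

```python
from pymongo import MongoClient

# Hypothetical connection string and namespace, mirroring the source defined above.
client = MongoClient("mongodb+srv://username:password@host.mongodb.net")
orders = client["sales"]["orders"]

# Total revenue per customer across completed orders, highest spenders first.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {"_id": "$customer_id", "revenue": {"$sum": "$amount"}}},
    {"$sort": {"revenue": -1}},
    {"$limit": 10},
]

for doc in orders.aggregate(pipeline):
    print(doc["_id"], doc["revenue"])
```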
Back in the configuration file, the toolsets section facilitates grouping related tools, making it easy to load and manage specific sets of operations based on different use cases or requirements. Whether used for reading or writing data, integrating databases via MCP Toolbox provides a streamlined and consistent approach to managing and interacting with diverse data sources. Below is an example of defining a find-one query against MongoDB Atlas using MCP Toolbox (note that the source references the my-mongodb source defined above).

```yaml
tools:
  get_user_profile:
    kind: mongodb-find-one
    source: my-mongodb
    description: Retrieves a user's profile by their email address.
    database: user_data
    collection: profiles
    filterPayload: |
      { "email": {{json .email}} }
    filterParams:
      - name: email
        type: string
        description: The email address of the user to find.
    projectPayload: |
      { "password_hash": 0, "login_history": 0 }
```

Getting started

The integration of MongoDB Atlas and MCP Toolbox for Databases marks a significant step forward in simplifying database interactions for enterprises embracing gen AI. By enabling seamless connectivity, advanced data operations, and cross-source queries, this collaboration empowers businesses to build AI-driven applications that maximize the value of their data while enhancing efficiency and scalability.

Get started today through Google Cloud Marketplace. Set up MCP Toolbox for Databases locally, set up the MongoDB Atlas source connector, and then set up MongoDB Atlas tools.

September 22, 2025
Artificial Intelligence

MongoDB Community Edition to Atlas: A Migration Masterclass With BharatPE

Launched in 2018, BharatPE is a fintech pioneer serving millions of Indian retailers and small businesses across more than 450 cities. The company processes over ₹12,000 crore (about US$1.37 billion) in monthly Unified Payments Interface (UPI)-based transactions. One of BharatPE’s most innovative financial solutions is India’s first interoperable UPI QR code—a scannable 2D barcode that lets users make payments through India's UPI system—and a zero-MDR (Merchant Discount Rate) payment acceptance service, which enables merchants to accept payments through the same system without any charges.

Behind BharatPE’s success is the ability to manage high volumes of data, maintain data security, and scale to accommodate growth and traffic peaks, all while keeping the operational and maintenance burden low. This is all powered by MongoDB Atlas. Sumit Malik, Head of Database Operations at BharatPE, presented at MongoDB .local Delhi in July 2025, sharing the company’s transformational journey from a self-hosted MongoDB deployment to MongoDB Atlas.

From Community Edition to Atlas: Unlocking more scale and reducing complexity

BharatPE’s legacy infrastructure relied on a self-hosted version of MongoDB: MongoDB Community Edition. The setup included three sharded clusters, each with three nodes (one primary, two secondaries), handling BharatPE’s 45 terabytes of data. However, self-managing this large deployment created several challenges. Data was spread unevenly across clusters, which caused imbalances and scaling complexities. Maintaining the database also proved costly and time-consuming for the team. BharatPE was also looking to expand its disaster recovery capabilities to reduce business continuity and downtime risks. Finally, operating in a regulated industry with high security standards meant that BharatPE needed robust end-to-end security and compliance.

“We needed a database platform that could scale seamlessly, secure our data, and minimize operational burden,” said Malik. After careful consideration and due diligence, BharatPE determined that MongoDB Atlas best met its requirements.

A carefully planned, five-step migration approach

Key to BharatPE’s successful migration was a methodical approach built around five key phases, central to avoiding downtime and maintaining business continuity throughout the process:

Design phase: defining scope and strategy. In the initial phase, the BharatPE team laid the groundwork for the migration by clearly defining its scope, timeline, resources, and dependencies. They analyzed data volume, structure, and compatibility between the source system (self-hosted MongoDB) and the target system (MongoDB Atlas). “We carefully designed a migration strategy that accounted for every possible risk and dependency within our system,” said Malik.

De-risk phase: assessing and mitigating risks. This phase focused on identifying and addressing potential risks associated with the migration. BharatPE validated application compatibility with MongoDB Atlas and assessed the suitability of its driver versions. Malik shared: “Understanding compatibility challenges early on helped us eliminate surprises during production.”

Test phase: validating systems in lower environments. Before touching the production environment, BharatPE conducted extensive testing in a development environment that closely emulated its real-world setup.
“We created a fully mirrored MongoDB Atlas test environment where we integrated our existing systems and validated application sanity and compatibility,” said Malik. Introducing an additional MongoDB server allowed the team to simulate real-world scenarios and ensure readiness.

Migration phase: data transition and security. BharatPE used MongoDB’s mongosync tool to migrate terabytes of data securely and efficiently. Ensuring data privacy during transit was a top priority, and the team adopted MongoDB’s robust encryption functionality to protect sensitive financial information and ensure compliance.

Validation phase: confirming data integrity and optimizing performance. Once the data was moved, BharatPE performed rigorous post-migration checks. Automated scripts were developed to validate the integrity of the migrated data, ensuring it matched the original source without discrepancies (a simple sketch of this kind of check appears at the end of this article). Additionally, monitoring systems and real-time alerting were set up to catch and resolve any issues immediately.

This meticulous five-step approach allowed BharatPE to transition to MongoDB Atlas without impacting its production environment, all while ensuring data security, operational continuity, and reliability.

MongoDB Atlas boosts performance by 40%

Since migrating to MongoDB Atlas, BharatPE has realized tangible benefits that have directly impacted its operations and customer experience. “With MongoDB Atlas, we effectively reduced operational complexity and improved scalability,” Malik said. Atlas’s auto-scaling capabilities enable BharatPE to effortlessly handle the volume spikes associated with 500M+ UPI transactions monthly.

Atlas’s reliability has improved availability and minimized downtime, critical to BharatPE’s 24/7 operations. “The system’s auto-failover ensures seamless service continuity, even during node failures,” said Malik. Notably, MongoDB’s SLA-guaranteed 99.995% uptime has delivered improved consistency.

Performance enhancements have been equally transformative, with a 40% improvement in query response times thanks to built-in query performance analytics. Observability dashboards and real-time alerts have enabled faster issue resolution.

The migration also addressed BharatPE’s security concerns. BharatPE now fully meets fintech security and compliance requirements, enabled by MongoDB’s advanced security features such as data encryption, role-based access control, and VPC peering. Finally, by eliminating the complexities of self-managed infrastructure, the company has freed resources to focus on business growth and customer experience. “MongoDB handles audit logs with a single click—we no longer need third-party tools or manual setups,” said Malik. “The migration has future-proofed our infrastructure while reducing costs and improving reliability.”

MongoDB Atlas now underpins BharatPE’s operations, ensuring merchants can continue transacting seamlessly while enabling BharatPE to expand its offerings across India’s growing fintech landscape.

Visit the Atlas Learning Hub to learn more about Atlas and start building your MongoDB skills. To learn more about MongoDB Community Edition, visit the product page.
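As referenced in the validation phase above, below is a minimal, illustrative sketch of an automated post-migration integrity check of the kind described, comparing per-collection document counts between the source and target clusters. The connection strings and namespaces are placeholders, and a real check would also compare document contents or checksums.

```python
from pymongo import MongoClient

# Placeholder connection strings for the legacy cluster and the Atlas cluster.
source = MongoClient("mongodb://legacy-host:27017")
target = MongoClient("mongodb+srv://user:password@cluster.mongodb.net")

def compare_counts(db_name: str, collections: list[str]) -> None:
    """Report any collection whose document count differs after migration."""
    for coll in collections:
        src_count = source[db_name][coll].count_documents({})
        dst_count = target[db_name][coll].count_documents({})
        status = "OK" if src_count == dst_count else "MISMATCH"
        print(f"{db_name}.{coll}: source={src_count} target={dst_count} [{status}]")

# Hypothetical database and collection names.
compare_counts("payments", ["transactions", "merchants"])
```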

September 21, 2025
Home

Celebrating Excellence: MongoDB Global Partner Awards 2025

In a world being reshaped by AI and rapid technological change, one thing is clear: our partners are powering the future with MongoDB. Together, we help customers modernize legacy systems, solve challenges from security to budget constraints, and build the next wave of AI-powered applications. That’s why we’re proud to announce the annual MongoDB Global Partner Awards — celebrating partners who led the way in 2025. From pioneering AI and modernization to advancing public sector innovation to building bold go-to-market collaborations, these partners set the standard for excellence. Their leadership doesn’t just move the needle — it redefines what’s possible.

Global Cloud Partner of the Year: Microsoft

We are proud to recognize Microsoft for exceptional year-over-year growth as MongoDB’s Global Cloud Partner of the Year. Together, MongoDB and Microsoft have delivered strong momentum across industries such as healthcare, telecommunications, and financial services, helping organizations build great applications that deliver exceptional customer experiences. Microsoft’s deep commitment to collaboration, customer success, and cloud leadership makes it an indispensable part of MongoDB’s partner ecosystem. The strength of the partnership continues to grow; in fact, MongoDB was recently selected as a Microsoft partner for the “Unify your data” solution play, which enables customers to benefit from the joint integrations and go-to-market (GTM) resources between MongoDB Atlas on Azure and native Microsoft services.

Global AI Cloud Partner of the Year: Amazon Web Services (AWS)

AWS has been a driving force in helping customers unlock the full potential of AI with MongoDB, highlighted by our work with Novo Nordisk, which leveraged Amazon Bedrock and MongoDB Atlas to build an AI solution that cut one of its most time-intensive workflows from 12 weeks to 10 minutes. The work with Novo Nordisk is just one of many examples that showcase the power of our partnership to create business differentiation for customers in the gen AI era. MongoDB was also a generative AI Competency launch partner for AWS, further tightening our collaboration in AI. From breakthrough generative AI use cases and beyond, our partnership empowers organizations to move faster, innovate more boldly, and transform with confidence. Together, AWS and MongoDB are shaping what’s possible in the AI era.

Global Cloud GTM Partner of the Year: Google Cloud

Google Cloud is being honored for accelerating new business through impactful joint GTM initiatives. MongoDB's partnership with Google Cloud has set the standard for meaningful collaboration—driving new business and delivering impact across some of the world’s most complex global enterprises. The joint Google Cloud and MongoDB Sales Development Representative program has been the cornerstone of this success, ensuring early-stage talent gets the opportunity to work with the largest organizations in the world while learning a sales playbook that will serve them well for the rest of their careers. Google Cloud continues to be a driving force in MongoDB’s global growth thanks to a joint commitment to innovative GTM strategies.

Global Systems Integrator Partner of the Year: Accenture

Accenture has demonstrated exceptional commitment as a Global SI Partner, establishing a dedicated center of excellence for MongoDB within its software engineering service line.
Together, MongoDB and Accenture have delivered transformative customer outcomes across industries, from payment modernization for a leading bank to data transformation for a major manufacturer. Meanwhile, closer collaboration with Accenture’s BFSI business unit has continued to fuel global customer success. By combining MongoDB’s modern database platform with Accenture’s deep industry expertise, our partnership continues to help customers modernize, unlock data-driven insights, and accelerate digital transformation at enterprise scale.

Global Public Sector Partner of the Year: Accenture Federal Services

Accenture Federal Services has played a pivotal role in advancing MongoDB’s presence in the public sector. Thanks to its scale, expertise, and focus on customer outcomes, it has driven remarkable year-over-year growth and has supported critical government missions in coordination with MongoDB. MongoDB and Accenture Federal Services are helping government agencies meet their efficiency goals by modernizing legacy applications, seamlessly consolidating platforms, and streamlining architectures, all while reducing costs. We are excited to have Accenture Federal Services as a key sponsor of our inaugural MongoDB Public Sector Summit in January 2026.

Global Tech Partner of the Year: Confluent

Confluent—the data streaming platform built by the co-creators of Apache Kafka®—continues to be a strategic partner, with more than 550 joint customer deployments delivering impact across industries worldwide. Over the past year, MongoDB and Confluent have strengthened global go-to-market (GTM) alignment, focusing on accelerating co-sell engagement across EMEA and APAC. Together, MongoDB and Confluent have delivered gen AI quickstarts and no-code streaming demos and co-authored agentic AI thought leadership to help customers accelerate innovation with data in motion and build event-driven AI applications. Our partnership is anchored in strong field collaboration, with ongoing co-sponsored AI workshops and hands-on developer events. A standout highlight of our GTM collaboration was a joint gen AI Developer Day with Confluent and LangChain, where AI leaders engaged 80+ developers to showcase how our combined platforms enable cost-effective, explainable, and personalized multi-agent systems.

Global ISV Partner of the Year: BigID

BigID has remained a standout ISV partner for MongoDB, consistently delivering strong results for customers across financial services, insurance, and healthcare. Together, we have launched impactful joint GTM initiatives, from customer events to tailored incentive programs that have accelerated growth opportunities. BigID continues to be recognized as a leader in data security, privacy, and AI data management, and thanks to our close global alignment, it is further strengthening MongoDB’s position as a trusted partner for organizations operating in highly regulated industries.

Global AI Tech Partner of the Year: LangChain

MongoDB’s partnership with LangChain has unlocked powerful new integrations that make it easier for developers to build retrieval-augmented generation (RAG) applications and intelligent agents on MongoDB. From hybrid search and parent document retrievers to short- and long-term memory capabilities, these joint solutions are helping developers push the boundaries of what’s possible with AI. Through joint workshops, webinars, and hands-on training, we have equipped developers with the tools and knowledge to adopt these capabilities at scale.
Momentum continues to build rapidly, and adoption of both the LangChain/MongoDB and LangGraph/MongoDB packages continues to grow, highlighting the strength of our collaboration and the thriving developer ecosystem that MongoDB and LangChain are enabling together.

Global AI SI Partner of the Year: Pureinsights

Pureinsights accelerates intelligent search and AI application development with its powerful Discovery Platform. A standout capability is its integration with Voyage AI by MongoDB, delivering advanced embeddings, multimodal embedding, and result reranking, earning recognition for its strong proof-point track record and differentiated value in enterprise-grade use cases. With a focus on implementing generative AI, vector search, and RAG use cases, Pureinsights continues to empower clients to innovate quickly, reliably, and at scale.

Global Modernization Partner of the Year: gravity9

gravity9 has established itself as a trusted MongoDB partner by delivering consistent impact through modernization and jumpstart projects across industries and geographies, powered by AI. As a strategic implementation partner, gravity9 specializes in designing and delivering cloud-native, scalable solutions that help organizations modernize legacy systems, adopt new technologies, accelerate time-to-value, and prepare for the AI era. By combining deep technical expertise with an agile delivery model, gravity9 enables customers to unlock transformation opportunities, whether moving workloads to the cloud, building new AI experiences, or optimizing existing infrastructure. gravity9’s close collaboration with MongoDB’s Professional Services teams has generated consistently high customer ratings, demonstrating the quality and reliability of its work.

Global Impact Partner of the Year: IBM

IBM is being recognized with the Impact Partner of the Year award for its strategic contributions across a variety of large, industry-leading clients. IBM has played a critical role in securing large contracts with several multinational financial institutions and is investing further in expanding the partnership globally. The partnership continues to grow, including with Atlas and watsonx.ai, and an increasing number of differentiated projects on IBM Z or LinuxONE infrastructure. IBM is a trusted vendor for large enterprises and is a strategic partner for over 25% of MongoDB's largest customers.

Global Cloud - Certified DBaaS Partner of the Year: Alibaba Cloud

Alibaba Cloud has established itself as a strategic MongoDB partner by driving innovation with ApsaraDB for MongoDB and using AI to help organizations build modern applications. With a strong focus on key verticals such as gaming, automotive, retail, and fintech, Alibaba Cloud is enabling enterprises to modernize faster and unlock new opportunities across industries. By combining cutting-edge data solutions with a bold global expansion strategy, Alibaba Cloud empowers customers worldwide to accelerate transformation, whether scaling digital platforms, delivering new customer experiences, or optimizing mission-critical workloads.

Looking ahead

Congratulations to all of the 2025 Global Partner Award winners! Their commitment to innovation, collaboration, and customer success has—and will have—a lasting impact on organizations worldwide. These awards not only recognize the past year’s achievements but also underscore MongoDB’s vision for what we, alongside our partners, will build together in the future.
To learn more about the MongoDB Partner Program, please visit our partners page.

September 18, 2025
Home

The Future of AI Software Development is Agentic

Today in New York, our flagship MongoDB.local event is bringing together thousands of developers and tech leaders to discuss the future of building with MongoDB. Among the many exciting innovations and product announcements shared during the event, one theme has stood out: empowering developers to reliably build with AI and create AI solutions at scale on MongoDB. This post explores how these advancements are set to accelerate developer productivity in the AI era.

Ship faster with the MongoDB MCP Server

Software development is rapidly evolving with AI tools powered by large language models (LLMs). From AI-driven editors like VS Code with GitHub Copilot and Windsurf to terminal-based coding agents like Claude Code, these tools are transforming how developers work. While they already bring tremendous productivity gains, coding agents are still limited by the context they have. Since databases hold the core of most application-related data, access to configuration details, schemas, and sample data from databases is essential for generating accurate code and optimized queries. With Anthropic’s introduction of the Model Context Protocol (MCP) in November 2024, a new way emerged to connect AI agents with data sources and services, and database connection and interaction quickly became one of the most popular use cases for MCP in agentic coding.

Today, we’re excited to announce the general availability (GA) of the MongoDB MCP Server, giving AI assistants and agents access to the context they need to explore, manage, and generate better code with MongoDB. Building on our public preview used by thousands of developers, the GA release introduces key capabilities to strengthen production readiness:

- Enterprise-grade authentication (OIDC, LDAP, Kerberos) and proxy connectivity.
- Self-hosted remote deployment support, enabling shared deployments across teams, streamlined setup, and centralized configuration. Note that we recommend following security best practices, such as implementing authentication for remote deployments.
- Availability as a bundle with the MongoDB for VS Code extension, delivering a complete experience: visually explore your database with the extension or interact with the same connection through your AI assistant, all without switching context.

Figure 1. Overview of the MongoDB MCP Server.

Meeting developers where they are with n8n and CrewAI integrations

AI is transforming how developers build with MongoDB, not just in coding workflows but also in creating AI applications and agents. From retrieval-augmented generation (RAG) to powering agent memory, these systems demand a database that can handle diverse data types—such as unstructured text (e.g., messages, code, documents), vectors, and graphs—while supporting comprehensive retrieval mechanisms at scale, like vector and hybrid search. MongoDB delivers this in a single, unified platform: the flexible document model supports the varied data agents need to store, while advanced, natively integrated search capabilities eliminate the need for separate vector databases. With Voyage AI by MongoDB providing state-of-the-art embedding models and rerankers, developers get a complete foundation for building intelligent agents without added infrastructure complexity. As part of our commitment to making MongoDB as easy to use as possible, we’re excited to announce new integrations with n8n and CrewAI.
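Both of the integrations described below build on MongoDB Atlas Vector Search. For orientation, a vector search query is simply an aggregation stage; the following pymongo sketch shows its general shape. The index name, namespace, and query vector are placeholders, and the embedding would normally come from an embedding model such as the Voyage AI models mentioned above.

```python
from pymongo import MongoClient

client = MongoClient("mongodb+srv://user:password@cluster.mongodb.net")
docs = client["knowledge_base"]["chunks"]  # hypothetical namespace

query_embedding = [0.12, -0.03, 0.91]  # placeholder; real embeddings have hundreds of dimensions

pipeline = [
    {
        "$vectorSearch": {
            "index": "vector_index",    # name of the Atlas Vector Search index
            "path": "embedding",        # field holding the stored embeddings
            "queryVector": query_embedding,
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for doc in docs.aggregate(pipeline):
    print(doc["score"], doc["text"])
```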
n8n has emerged as one of the most popular platforms for building AI solutions, thanks to its visual interface and out-of-the-box components that make it simple and accessible to create reliable AI workflows. This integration adds official support for MongoDB Atlas Vector Search, enabling developers to build RAG and agentic RAG systems through a flexible, visual interface. It also introduces an agent chat memory node for n8n agents, allowing conversations to persist by storing message history in MongoDB.

Figure 2. Example workflow with n8n and MongoDB powering an AI agent.

Meanwhile, CrewAI—a fast-growing open-source framework for building and orchestrating AI agents—makes multi-agent collaboration more accessible to developers. As AI agents take on increasingly complex and productive workflows such as online research, report writing, and enterprise document analysis, multiple specialized agents need to interact with each other and delegate tasks effectively. CrewAI provides an easy and approachable way to build such multi-agent systems. Our official integration adds support for MongoDB Atlas Vector Search, empowering developers to build agents that leverage RAG at scale. Learn how to implement agentic RAG with MongoDB Atlas and CrewAI.

The future is agentic

AI is fundamentally reshaping the entire software development lifecycle, including for developers building with MongoDB. New technology like the MongoDB MCP Server is paving the way for database-aware agentic coding, representing the future of software development. At the same time, we’re committed to meeting developers where they are: integrating our capabilities into their favorite frameworks and tools so they can benefit from MongoDB’s reliability and scalability to build AI apps and agents with ease.

Start building your applications with the MongoDB MCP Server today by following the Get Started guide. Visit the AI Learning Hub to learn more about building AI applications with MongoDB.

September 17, 2025
Artificial Intelligence

MongoDB Queryable Encryption Expands Search Power

Today, MongoDB is expanding the power of Queryable Encryption by introducing support for prefix, suffix, and substring queries. Now in public preview, these capabilities extend the technology beyond equality and range queries, unlocking broader use cases for secure, expressive search on encrypted data.

Developed by the MongoDB Cryptography Research Group, Queryable Encryption is a groundbreaking, industry-first in-use encryption technology. It enables customers to encrypt sensitive application data, store it in encrypted form in the MongoDB database, and perform expressive queries directly on that encrypted data. This release gives organizations the tools to perform flexible text searches on encrypted data, such as matching partial names, keywords, or identifiers, without ever exposing the underlying information. This helps strengthen data protection, simplify compliance, and remove the need for complex workarounds such as external search indexes, all without any changes to application code.

With support for prefix, suffix, and substring queries, Queryable Encryption enables organizations to protect sensitive data throughout its lifecycle: at rest, in transit, and in use. As a result, teams can build secure, privacy-preserving applications without compromising functionality or performance. Queryable Encryption is available at no additional cost in MongoDB Atlas, Enterprise Advanced, and Community Edition.

Encryption: Securing data across its lifecycle

Many organizations must store and search sensitive data, such as personally identifiable information (PII) like names, Social Security numbers, or medical details, to power their applications. Implementing this securely presents real challenges. Encrypting data at rest and in transit is widely adopted and table stakes. However, encrypting data while it is actively being used, known as encryption in use, has historically been much harder to achieve. The dilemma is that traditional encryption makes data unreadable, preventing databases from running queries without first decrypting it. For instance, a healthcare provider may need to find all patients with diagnoses that include the word “diabetes.” However, without decrypting the medical records, the database cannot search for that term. To work around this, many organizations either leave sensitive fields unencrypted or use complex and less secure workarounds, such as building separate search indexes. Both approaches add operational overhead and increase the risk of unauthorized access. They also make it harder to comply with regulations like the Health Insurance Portability and Accountability Act (HIPAA), the Payment Card Industry Data Security Standard (PCI-DSS), or the General Data Protection Regulation (GDPR), where violations can carry significant fines. To fully protect sensitive data and meet compliance requirements, organizations need the ability to encrypt data in use, in transit, and at rest without compromising operational efficiency.

Building secure applications with fewer tradeoffs

MongoDB Queryable Encryption solves this quandary. It protects sensitive data while eliminating the tradeoff between security and development velocity. Organizations can encrypt sensitive data, such as PII or protected health information (PHI), while still running queries directly on that data without exposing it to the database server.
With support for prefix, suffix, and substring queries (in public preview), Queryable Encryption enables MongoDB applications to encrypt sensitive fields such as names, email addresses, notes, and ID numbers while still performing native partial-match searches on encrypted data. This eliminates the impasse between protecting sensitive information and enabling essential application functionality. For business leaders, Queryable Encryption strengthens data protection, supports compliance requirements, and reduces the risk of data exposure. This helps safeguard reputation, avoid costly fines, and eliminate the need for complex third-party solutions. For developers, advanced encrypted search is built directly into MongoDB’s query language. This eliminates the need for code changes, external indexes, or client-side workarounds while simplifying architectures and reducing overhead.

Some examples of what organizations can now achieve:

- PII search for compliance and usability: Regulations such as GDPR and HIPAA mandate strict privacy of personal information. With prefix queries, teams can retrieve users by last name or email prefix while ensuring the underlying data remains encrypted. This makes compliance easier without reducing search functionality.
- Keyword filtering in support workflows: Customer service notes often contain sensitive details in free-text fields. With substring query support, teams can search encrypted notes for specific keywords (e.g., “refund,” “escalation,” or “urgent”) without exposing the contents of those notes.
- Secure ID validation: Identity workflows often rely on partial identifiers such as the last digits of a Social Security number in the U.S., a National Insurance number in the UK, or an Aadhaar number in India. Suffix queries enable these lookups on encrypted fields without revealing the full values, reducing the risk of data leaks in regulated environments.
- Case management for public agencies: Case numbers and reference IDs in public sector applications often follow structured formats. Agencies can now securely retrieve records using region- or office-based prefixes (e.g., “NYC-” or “EUR-”) without exposing sensitive case metadata.

(A rough configuration sketch appears at the end of this post.) Note: This functionality is in public preview. MongoDB therefore recommends that these new Queryable Encryption features not be used for production workloads until they are generally available in 2026. MongoDB wants to build and improve Queryable Encryption with customer needs and use cases in mind. As general availability approaches, customers are encouraged to contact their account team or share feedback through the MongoDB Feedback Engine.

Robust data protection at every stage

MongoDB offers unmatched protection for sensitive data throughout its entire lifecycle with Queryable Encryption, whether the data is in transit, at rest, or in use. With the addition of prefix, suffix, and substring query support, Queryable Encryption meets even more of the demands of modern applications, unlocking new use cases.

To learn more about Queryable Encryption and how it works, explore the features documentation page. To get started using Queryable Encryption, read the Quick Start Guide.
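As referenced above, here is a rough sketch of the shape of an encrypted fields map for Queryable Encryption, written as the kind of Python dictionary a driver helper such as pymongo's ClientEncryption.create_encrypted_collection accepts. The equality entry reflects the generally available query type; where the new prefix, suffix, and substring preview types would plug in is indicated only with a comment, since their exact identifiers and options belong to the preview documentation rather than this post.

```python
# A sketch of an encrypted fields map for Queryable Encryption. This is not a
# complete setup: key vault configuration, KMS providers, and automatic
# encryption options are omitted.
encrypted_fields = {
    "fields": [
        {
            "path": "ssn",                # field to encrypt
            "bsonType": "string",
            "keyId": None,                # a data key would be created or referenced here
            "queries": [{"queryType": "equality"}],  # generally available query type
        },
        {
            "path": "notes",              # free-text field targeted by the preview text queries
            "bsonType": "string",
            "keyId": None,
            # Query entries for the prefix, suffix, and substring types (public preview)
            # would be declared here; their identifiers and parameters are preview-specific,
            # so consult the preview documentation rather than this sketch.
        },
    ]
}
```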

September 17, 2025
Home

Supercharge Self-Managed Apps With Search and Vector Search Capabilities

MongoDB is excited to announce the public preview of search and vector search capabilities for use with MongoDB Community Edition and MongoDB Enterprise Server. These new capabilities empower developers to prototype, iterate, and build sophisticated, AI-powered applications with robust search functionality directly in self-managed environments.

Versatility is one of the reasons developers love MongoDB: MongoDB can run anywhere.¹ This spans local setups, where many developers kickstart their MongoDB journey, to the largest enterprise data centers when it is time to scale, and MongoDB’s fully managed cloud service, MongoDB Atlas. Regardless of where development takes place, MongoDB integrates effortlessly with any developer's workflow.

MongoDB Community Edition is the free, source-available version of MongoDB that millions of developers use to learn, test, and grow their skills. MongoDB Enterprise Server is the commercial version of MongoDB’s core database, offering additional enterprise-grade features for companies that prefer to self-manage their deployments on-premises or in public, private, or hybrid cloud environments. With native search and vector search capabilities now available for use with Community Edition and Enterprise Server, MongoDB aims to deliver a simpler, more consistent experience for building great applications wherever they are deployed.

What is search and vector search?

Similar to the offerings in MongoDB Atlas, MongoDB Community Edition and MongoDB Enterprise Server now support two distinct yet complementary search capabilities:

- Full-text search is an embedded capability that delivers a seamless, scalable experience for building relevance-based app features.
- Vector search enables developers to build intelligent applications powered by semantic search and generative AI using native, full-featured vector database capabilities.

There are no functional limitations on the core search aggregation stages in this public preview: $search, $searchMeta, and $vectorSearch are all supported with functional parity to what is available in Atlas, excluding features in a preview state. For more information, check out the search and vector search documentation pages.

Solving developer challenges with integrated search

Historically, integrating advanced search features into self-managed applications often required bolting external search engines or vector databases onto MongoDB. This approach created friction at every stage for developers and organizations, leading to:

- Architectural complexity: Managing and synchronizing data across multiple, disparate systems added layers of complexity, demanded additional skills, and complicated development workflows.
- Operational overhead: Handling separate provisioning, security, upgrades, and monitoring for each system placed a heavy load on DevOps teams.
- Decreased developer productivity: Developers were forced to learn and use different query APIs and languages for the database and the search engine. This resulted in frequent context switching, steeper learning curves, and slower release cycles.
- Consistency challenges: Aligning the primary database with separate search or vector indexes risked producing out-of-sync results. Despite promises of transactional guarantees and data consistency, these indexes were only eventually consistent, leading to incomplete results in rapidly changing environments.
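Because the preview supports the same $search, $searchMeta, and $vectorSearch stages as Atlas, queries are written exactly as they would be against Atlas Search. As a small point of reference, a basic full-text query via pymongo looks like the following sketch; the index name, namespace, and field are placeholders.

```python
from pymongo import MongoClient

# The same query works against a self-managed deployment running the search
# preview or against MongoDB Atlas; only the connection string differs.
client = MongoClient("mongodb://localhost:27017")
products = client["store"]["products"]  # hypothetical namespace

pipeline = [
    {
        "$search": {
            "index": "default",  # name of the search index
            "text": {"query": "wireless headphones", "path": "title"},
        }
    },
    {"$limit": 5},
    {"$project": {"title": 1, "score": {"$meta": "searchScore"}}},
]

for doc in products.aggregate(pipeline):
    print(doc["score"], doc["title"])
```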
With search and vector search now integrated into MongoDB Community Edition and MongoDB Enterprise Server, the trade-offs described above disappear. Developers can now create powerful search capabilities using MongoDB's familiar query framework, removing the synchronization burden and the need to manage multiple single-purpose systems. This release simplifies data architecture, reduces operational overhead, and accelerates application development. With these capabilities, developers can harness sophisticated out-of-the-box functionality to build a variety of powerful applications. Potential use cases include:

Keyword/full-text search
- Autocomplete and fuzzy search: Create real-time suggestions and correct spelling errors as users type, improving the search experience.
- Search faceting: Apply quick filtering options in applications like e-commerce, so users can narrow down search results based on categories, price ranges, and more.
- Internal search tools: Build search tools for internal use or for applications with sensitive data that require on-premises deployment.

Vector search
- AI-powered semantic search: Implement semantic search and recommendation systems to provide more relevant results than traditional keyword matching.
- Retrieval-augmented generation (RAG): Use search to retrieve factual data from a knowledge base to bring accurate, context-aware data into large language model (LLM) applications.
- AI agents: Create agents that utilize tools to collect context, communicate with external systems, and execute actions.

Hybrid search
- Hybrid search: Combine keyword and vector search techniques.

Data processing
- Text analysis: Perform text analysis directly in the MongoDB database.

MongoDB offers native integrations with frameworks such as LangChain, LangGraph, and LlamaIndex. This streamlines workflows, accelerates development, and embeds RAG or agentic features directly into applications. To learn more about other AI frameworks supported by MongoDB, check out this documentation.

MongoDB’s partners and champions are already experiencing the benefits of using search and vector search across a wider range of environments:

“We’re thrilled that MongoDB search and vector search are now accessible in the already popular MongoDB Community Edition. Now our customers can leverage MongoDB and LangChain in either deployment mode and in their preferred environment to build cutting-edge LLM applications.”—Harrison Chase, CEO, LangChain

“MongoDB has helped Clarifresh build awesome software, and I’ve always been impressed with its rock-solid foundations. With search and vector search capabilities now available in MongoDB Community Edition, we gain the confidence of accessible source code, the flexibility to deploy anywhere, and the promise of community-driven extensibility. It’s an exciting milestone that reaffirms MongoDB’s commitment to developers.”—Luke Thompson, MongoDB Champion, Clarifresh

“We’re excited about the next iteration of search experiences in MongoDB Community Edition. Our customers want the highest flexibility to be able to run their search and gen AI-enabled applications, and bringing this functionality to Community unlocks a whole new way to build and test anywhere.”—Jerry Liu, CEO, LlamaIndex

“Participating in the private preview of full-text and vector search for MongoDB Community has been an exciting opportunity.
Having $search, $searchMeta, and $vectorSearch directly in Community Edition brings the same powerful capabilities we use in Atlas—without additional systems or integrations. Even in early preview, it’s already streamlining workflows and producing faster, more relevant results.”—Michael Höller, MongoDB Champion, akazia Consulting

Accessing the public preview

The public preview is available for free and is intended for testing, evaluation, and feedback purposes only.

Search and Vector Search with MongoDB Community Edition. The new capabilities are compatible with MongoDB version 8.2+ and run in a separate binary, mongot, which interacts with the standard mongod database binary. To get started, ensure that a MongoDB Community Server cluster is running, using one of the following three methods:

- Download MongoDB Community Server version 8.2 and the mongot binary from the MongoDB Downloads page. As of the public preview, this feature is available for self-managed deployments on supported Linux distributions and architectures for MongoDB Community Edition version 8.2+.
- Pull the container image for Community Server 8.2 from a public Docker Hub repository.
- Coming soon: deploy using the MongoDB Controllers for Kubernetes Operator (search support for Community Server is planned for version 1.5+).

Search and Vector Search for use with MongoDB Enterprise Server. The new capabilities are deployed as self-managed search nodes in a customer's Kubernetes environment and connect seamlessly to any MongoDB Enterprise Server cluster, whether it resides inside or outside Kubernetes. To get started, ensure that:

- A MongoDB Enterprise Server cluster is running: version 8.0.10+ (for MongoDB Controllers for Kubernetes Operator 1.4) or version 8.2+ (for MongoDB Controllers for Kubernetes Operator 1.5+).
- A Kubernetes environment is available.
- The MongoDB Controllers for Kubernetes Operator is installed in the Kubernetes cluster. Find installation instructions here.

Comprehensive setup documentation for MongoDB Community Edition and MongoDB Enterprise Server is also available.

What's next?

During the public preview, MongoDB will deliver additional updates and roadmap features based on customer feedback. After the public preview, these search and vector search capabilities are anticipated to be generally available for use with on-premises deployments. For Community Edition, they will be available at no additional cost as part of the Server Side Public License (SSPL). For MongoDB Enterprise Server, they will be included in a new paid subscription offering that will launch in the future; pricing and packaging details for the subscription will be available closer to launch. For developers seeking a fully managed experience in the cloud, MongoDB Atlas offers a production-ready version of these capabilities today.

MongoDB would love to hear your feedback! Suggest new features or vote on existing ideas at feedback.mongodb.com. Your input is critical for shaping the future of this product. You can also contact your MongoDB account team to provide more comprehensive feedback. Check out MongoDB’s documentation to learn how to get started with search and vector search in MongoDB Community Edition and MongoDB Enterprise Server.

¹ MongoDB can be deployed as a fully managed multi-cloud service across all major public cloud providers, in private clouds, locally, on-premises, and in hybrid environments.

September 17, 2025
Artificial Intelligence

MongoDB AMP: An AI-Driven Approach to Modernization

Why should a database company be your modernization partner? It’s a fair question. From over a decade of experience with database migrations, we've learned that the database is often the single biggest blocker to digital transformation. It's where decades of business logic have been embedded, where critical dependencies multiply, and where the complexity that blocks innovation actually lives. By working with MongoDB, customers have found that transforming their data layer removes the barriers that had stalled previous modernization attempts. Now, with today’s launch of the MongoDB Application Modernization Platform (AMP), we're providing customers with a proven approach to full-stack modernization.

MongoDB AMP is an AI-powered solution that rapidly and safely transforms legacy applications into modern, scalable services. It integrates agentic AI workflows into our modernization methodology, alongside reusable, battle-tested tooling and the expertise we've developed through customer engagements over the past decade—a powerful combination of tools, technique, and talent. By combining AMP tooling with MongoDB’s proven, repeatable framework, customers have seen tasks like code transformation sped up by 10x or more, with overall modernization projects implemented two to three times faster on average.

Figure 1. The MongoDB Application Modernization Platform.

The common challenges

Many of our customers face the same impossible choice: accept growing technical debt that slows every business initiative, or risk disruption with a full system rewrite. Their teams are stuck maintaining legacy code instead of building new capabilities. These legacy systems have evolved into interconnected webs (“spaghetti messes”) where even simple changes require coordination across multiple systems and teams. Database changes require corresponding updates to middleware integrations, application business logic, and user interface components. Teams struggle to update systems because any change risks breaking something else they don't fully understand. Innovation initiatives often get blocked because new capabilities struggle to integrate within the constraints of legacy systems. Technical debt accumulates with every workaround, making each subsequent change more complex and risky than the last.

Before working with MongoDB, Intellect Design's Wealth Management platform exemplified this challenge perfectly. Key business logic was locked in hundreds of SQL stored procedures, leading to batch processing delays of up to eight hours and limiting scalability as transaction volumes grew. The platform’s rigid architecture hindered innovation and blocked integration with other systems, such as treasury and insurance platforms, preventing the delivery of the unified financial services that their enterprise clients demanded. In cases like this, the result is stagnation disguised as stability. Systems "work" but can't evolve. Applications can handle today's requirements but can't adapt to tomorrow's opportunities. Legacy architectures have become the foundation on which everything else depends—and the constraint that prevents everything else from changing.

Battle-tested solutions

By working through challenges with customers, we've built a comprehensive methodology for modernization, backed by sophisticated tools that address the messy reality of legacy applications. Our approach empowers application teams with proven processes and purpose-built technology to systematically address key challenges.
Central to our methodology is a test-first philosophy that has proven essential for safe, reliable modernization. Before any transformation begins, we develop comprehensive test coverage for existing applications, creating a baseline that captures how legacy systems actually behave in production. This upfront investment in testing becomes the foundation for everything that follows, providing guardrails that ensure modernized code performs identically to the original while giving teams the confidence to make changes without fear of breaking critical business processes. Our test-driven approach ensures modernization is a methodical, validated process where every change is verified.

Before we make any code changes, we establish a complete picture of the legacy system. We've built sophisticated analysis tools that comprehensively map legacy application architectures. These tools uncover the complex interdependencies and embedded logic that make legacy applications far more intricate than they appear on the surface. This deep analysis isn't just about cataloging complexity; it's about understanding the true scope, informing execution of the transformation, and identifying potential risks before they derail projects.

Analysis is just the start. By working with customers, we've learned that successful modernization requires careful sequencing and planning. Our dependency analysis capabilities help teams understand not just what needs to be migrated, but the critical order of operations and what safeguards need to be in place at each step. It's critical to avoid the temptation to migrate everything at once. MongoDB’s approach is designed to make complex modernizations successful by transforming applications incrementally with robust validation. Instead of crossing your fingers and hoping everything works after months of development, our methodology decomposes large modernization efforts into manageable components where every component is iteratively tested and verified. Issues are caught early when they're easy to fix, not after months of development when rollback becomes costly and complex. Each successful iteration reduces risk rather than accumulating it.

The agentic AI acceleration

MongoDB AMP represents over two years of dedicated effort to integrate AI-powered automation into our battle-tested processes, dramatically accelerating modernization while maintaining the reliability our customers depend on. AI powerfully expands our validation processes by generating additional test cases to validate modernized applications against their legacy counterparts. This dramatically improves confidence in migration results while reducing the time teams spend manually creating test cases for the complex business logic they are trying to preserve. Our existing analysis tools, which decompose embedded logic into smaller segments, now feed directly into AI systems that can automatically transform the code components they discover. What once required weeks of manual code conversion can now happen in hours, with testing frameworks providing the same rigorous validation we've always insisted on. For example, Bendigo and Adelaide Bank reduced the development time to migrate a banking application by up to 90%. The difference is speed and scale, without sacrificing quality or safety.

Figure 2. The AMP process.

Years of customer engagement and refined processes provide the foundation and guardrails that make AI-powered modernization effective and safe.
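To make the test-first idea concrete, the sketch below shows the general shape of a behavioral baseline (characterization) test that compares a legacy implementation with its modernized counterpart. This is a generic illustration, not AMP tooling; the module names, functions, and inputs are hypothetical.

```python
import pytest

# Hypothetical wrappers around the two systems under comparison: one calls the
# legacy implementation, the other calls the modernized service.
from legacy_client import get_account_balance as legacy_balance
from modern_client import get_account_balance as modern_balance

# Representative inputs captured from production traffic form the baseline.
BASELINE_ACCOUNTS = ["ACC-1001", "ACC-1002", "ACC-1003"]

@pytest.mark.parametrize("account_id", BASELINE_ACCOUNTS)
def test_modernized_service_matches_legacy(account_id):
    # The modernized service must return exactly what the legacy system returns
    # for the same input; any divergence fails the migration gate.
    assert modern_balance(account_id) == legacy_balance(account_id)
```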
With MongoDB AMP, AI becomes a force multiplier that transforms our proven approach into something that can tackle modernization challenges at unprecedented speed and scale.

“Migrating simple code is now 50 to 60 times quicker, and we can migrate small applications 20 times faster to MongoDB. Regression testing also went from three days to three hours with automated test generation.”—Fabrice Bidard, Head of Technical Architecture, Lombard Odier

Ready to begin your modernization journey?

Legacy application modernization doesn't have to be a leap of faith. With MongoDB as your partner, you gain access to proven methodologies, battle-tested tools, and the accelerated capabilities that agentic AI brings to our existing expertise. Contact our team to discuss your specific challenges and learn how our proven methodology can be applied to your environment.

September 16, 2025
Artificial Intelligence

Unlock AI With MongoDB and LTIMindtree’s BlueVerse Foundry

Many enterprises are eager to capitalize on gen AI to transform operations and stay competitive, but most remain stuck in proofs of concept that never scale. The problem isn’t ambition; it’s architecture. Rigid legacy systems, brittle pipelines, and fragmented data make it hard to move from idea to impact. That’s why LTIMindtree partnered with MongoDB to create BlueVerse Foundry: a no-code, full-stack AI platform powered by MongoDB Atlas, built to help enterprises quickly go from prototype to production without compromising governance, performance, or flexibility.

The power of MongoDB: Data without limits

At the heart of this platform is MongoDB Atlas, a multi-cloud database that redefines how enterprises manage and use data for AI. Unlike traditional relational databases, MongoDB’s document model adapts naturally to complex, evolving data, without the friction of rigid schemas or heavy extract, transform, and load (ETL) pipelines. For AI workloads that rely on diverse formats like vector embeddings, images, or audio, MongoDB is purpose-built. Its real-time data capabilities eliminate delays and enable continuous learning and querying.

Search is another differentiator. With MongoDB Atlas Search and Atlas Vector Search, MongoDB enables enterprises to combine semantic and keyword queries for highly accurate, context-aware results. GraphRAG adds another layer, connecting relationships in data through retrieval-augmented generation (RAG) to reveal deeper insights. Features like semantic caching keep performance high even under pressure, while built-in support for both public and private cloud deployments makes it easy to scale. Together, these capabilities turn MongoDB from a data store into an AI acceleration engine, supporting everything from retrieval to real-time interaction to full-stack observability.

The challenge: Building with limitations

Traditional systems were never designed for the kind of data modern AI requires. As enterprises embrace gen AI models that integrate structured and unstructured data, legacy infrastructure shows its cracks. Real-time processing becomes cumbersome, multiple environments create redundancy, and rising compute needs inflate costs. Building AI solutions often demands complex coding, meticulous model training, and extensive infrastructure planning, resulting in delayed time to market. Add to that the imperative of delivering responsible AI, and the challenge becomes even steeper. Models must not only perform but also be accurate, unbiased, and aligned with ethical standards. Enterprises are left juggling AI economics, data security, lineage tracking, and governance, all while trying to deliver tangible business value. This is precisely why a flexible, scalable, AI-ready data foundation like MongoDB is critical. Its ability to handle diverse data types and provide real-time access directly addresses the limitations of traditional systems when it comes to gen AI.

The solution: A smarter way to scale AI

With BlueVerse Foundry and MongoDB Atlas, enterprises get the best of both worlds: LTIMindtree’s rapid no-code orchestration and MongoDB’s flexible, scalable data layer. This joint solution eliminates common AI bottlenecks and accelerates deployment, without the need for complex infrastructure or custom code. BlueVerse Foundry’s modular, no-code architecture enables enterprises to quickly build, deploy, and scale AI agents and apps without getting bogged down by technical complexity.
This is significantly amplified by MongoDB’s inherent scalability, schema flexibility, and native RAG capabilities, which were key reasons LTIMindtree chose MongoDB as the foundational data layer. With features like the no-code agent builder, agent marketplace, and business-process-automation blueprints, enterprises can create tailored solutions that are ready for production, all powered by MongoDB Atlas.

A synergistic partnership: Smarter together

The collaboration between MongoDB and LTIMindtree’s BlueVerse Foundry brings together powerful AI capabilities with a future-ready database backbone. This partnership highlights how MongoDB’s AI narrative and broader partner strategy focus on enabling enterprises to build intelligent applications faster and more efficiently. Together, they simplify deployment, enable seamless integration with existing systems, and create a platform that can scale effortlessly as enterprise needs evolve. What makes this partnership stand out is the ability to turn ideas into impact faster. With no-code tools, prebuilt agents, and MongoDB’s flexible data model, enterprises don’t need to wait months to see results. They can use their existing infrastructure, plug in seamlessly, and start delivering real-time, AI-driven insights almost immediately. Governance, performance, and scalability aren’t afterthoughts; they’re built into every layer of this ecosystem.

“We’re seeing a shift from experimentation to execution—enterprises are ready to scale gen AI, but they need the right data foundation,” said Haim Ribbi, Vice President of Global CSI, VAR and Tech Partner at MongoDB. “That’s where MongoDB Atlas fits in, and where an agentic platform like LTIMindtree’s BlueVerse Foundry uses it to its full potential for innovation.”

Real-world impact: From data to differentiated experiences

This joint solution is already delivering real-world impact. A leading streaming platform used LTIMindtree’s solution, powered by MongoDB, to personalize content recommendations in real time. With MongoDB handling the heavy lifting of diverse data management and live queries, the company saw a 30% rise in user engagement and a 20% improvement in retention. Central to this transformation is the platform’s content hub, which acts as a unified data catalog, organizing enterprise information so it’s accessible, secure, and ready to power next-generation AI solutions with MongoDB’s robust data management. Whether dealing with text, images, or audio, the platform seamlessly manages multimodal data, eliminating the need for separate systems or processes.

For businesses looking to accelerate development, BlueVerse Foundry and Marketplace offer a no-code builder, prebuilt agents, and templates, enabling teams to go from concept to deployment in a fraction of the time compared to traditional methods. BlueVerse Foundry’s RAG pipelines simplify building smart applications, using MongoDB Atlas Search and MongoDB Atlas Vector Search for highly effective RAG. Advanced orchestration connects directly with AI models, enabling rapid experimentation and deployment. A globally acclaimed media company has been using BlueVerse Foundry to automate content tagging and digital asset management, cutting its discovery time by 40% and reducing overheads by 15%—clear evidence of gen AI’s bottom-line impact when implemented right. BlueVerse Foundry’s strength lies in combining speed and control.
By providing ready-to-use user-experience kits, more than 25 plug-and-play microservices, token-based economic models, more than 100 safelisted large language models (LLMs), tools and agents, and full-stack observability, BlueVerse Foundry and Marketplace enables enterprises to move faster without losing sight of governance. Its support for voice interfaces, regional languages, Teams, mobile, and wearables like Meta AI Glasses ensures an omnichannel experience out of the box. Responsible AI: A built-in capability LTIMindtree doesn’t just build AI faster; it builds it responsibly. With built-in measures like LLM output evaluation, moderation, and audit trails, the platform ensures enterprises can trust the results their models generate. This is further supported by MongoDB’s robust security features and data governance capabilities, ensuring a secure and ethical AI ecosystem. It’s not just about preventing hallucinations or bias; it’s about creating an ecosystem where quality, transparency, and ethics are fundamental, not optional. Scaling: Streamlined for the long term The platform’s libraries, app galleries, and FinOps tooling enable businesses to test, deploy, and expand with confidence. Powered by MongoDB Atlas’s inherent scalability and multi-cloud flexibility, BlueVerse Foundry is built for long-term AI success, not just early experimentation. Enterprise AI: From possibility to production The BlueVerse Foundry and Marketplace, powered by MongoDB, is more than a technological partnership; it’s a new standard for enterprise AI. It combines deep AI expertise with an agile data foundation, helping organizations escape the trap of endless proofs of concept and unlock meaningful value. For enterprises still unsure about gen AI’s return on investment, this solution offers a proven path forward, grounded in real-world success, scalability, and impact. The future of AI isn’t something to wait for. With LTIMindtree and MongoDB, it’s already here. Explore how LTIMindtree and MongoDB are transforming gen AI from a concept into an enterprise-ready reality. Learn more about building AI applications with MongoDB through the AI Learning Hub.
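To make the retrieval step described above concrete, here is a minimal sketch of semantic retrieval with Atlas Vector Search from a Node.js service, the kind of lookup a RAG pipeline like the one described could perform. It is illustrative only: the database, collection, and index names, the "embedding" field, and the embed() helper are assumptions, not details of LTIMindtree’s implementation.

import { MongoClient } from "mongodb";

// Placeholder: plug in an embedding model of your choice; it must return a
// vector whose dimensions match the Atlas Vector Search index definition.
async function embed(text: string): Promise<number[]> {
  throw new Error("wire up an embedding model here");
}

// Retrieve the most semantically similar chunks for a user question.
async function retrieveContext(question: string) {
  const client = new MongoClient(process.env.MONGODB_URI!);
  try {
    const chunks = client.db("knowledge").collection("chunks");
    const queryVector = await embed(question);
    return await chunks
      .aggregate([
        {
          // Assumes a vector index named "vector_index" on the "embedding" field.
          $vectorSearch: {
            index: "vector_index",
            path: "embedding",
            queryVector,
            numCandidates: 100,
            limit: 5,
          },
        },
        { $project: { _id: 0, text: 1, score: { $meta: "vectorSearchScore" } } },
      ])
      .toArray();
  } finally {
    await client.close();
  }
}

For hybrid retrieval, the same collection can also be queried with a keyword-oriented $search stage and the two result sets merged (for example, by reciprocal rank fusion) before the context is handed to the LLM.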

September 15, 2025
Artificial Intelligence

Circles Uses MongoDB to Fuel Jetpac’s Rapid Global Expansion

Founded in Singapore in 2014, Circles has grown to become a global telecommunication company revolutionizing the industry with its cutting-edge SaaS platform. Present in 14 countries, Circles empowers telco operators worldwide to launch innovative digital brands and refresh existing ones, accelerating their transformation into ‘techcos’. MongoDB Atlas is at the heart of Circles’ success, enabling one of Circles’ biggest product launches, Jetpac, in 2022. Kelvin Chua, Head of Markets and Circles’ first employee, described Circles’ experience at the recent MongoDB .local Singapore in July 2025. During a Fireside session, Chua shared insights into Circles’ journey and his own close relationship with MongoDB. Here’s the full conversation. For the Fireside discussion with Kelvin Chua, skip to 23'40. Before we dive into your work with MongoDB, could you please introduce yourself? Sure. I am currently the Head of Markets for Circles, but I am also their first employee. I've been working in the telco space for more than 20 years, and I've been around the ecosystem of startups a lot, helping build and scale startups. Actually, the first time I used MongoDB was for a startup in the [Silicon] Valley. So you have a pretty long-standing relationship with MongoDB. Can you tell us a bit more about that? As I said, my relationship with MongoDB dates back to my startup days, when MongoDB was still in its infancy. I chose MongoDB to handle about 5 million documents per hour. That was back in 2013. From there, I started looking at how MongoDB scales. Years later, I continued to leverage MongoDB to help build Circles in Singapore, but also for scaling the company globally across Pakistan, Mexico, and other regions. Figure 1. Kelvin Chua, Head of Markets, Circles, speaking at MongoDB.local Singapore in July 2025. How [did] Circles’ journey with MongoDB start? Circles was built on the Community Edition of MongoDB back in 2014. At the time, our team was using Node.js, and I immediately knew that MongoDB and a NoSQL database model was the right choice to build and scale the business. As a fan of Node.js, it was very natural to feel the ease of using MongoDB. I feel like using Node.js for your workload and then using MongoDB for the backend creates the best tandem. As we transitioned to other development environments and languages, including Golang, MongoDB remained at the core of our database operations, mostly because of its flexibility, the ease of prototyping, and the scalability. We never really saw the need to change from MongoDB, as our requirements for a document store have always been fulfilled by it. More recently, Circles launched a very successful offering: Jetpac. … This was also built on MongoDB, but before we dive into that, can you share more about what sparked the idea for this product? As you know, the COVID-19 pandemic put a hold on all international travel. So when 2022 rolled in, we were expecting a boom in travel again as restrictions eased up. This is when we had the idea for Jetpac, which is basically a travel tech solution providing seamless roaming and innovative travel lifestyle products. You had a pretty challenging timeline to work against, though? Yes! We had a massive challenge because we had six weeks to build Jetpac right from zero. That included solutioning, strategizing, and team-building.
Having had ten years of experience working with MongoDB, I knew we needed a NoSQL database, especially to keep track of what people are buying, how they are using the packs, and how to present usage to customers. So Jetpac was built on MongoDB Atlas in just six weeks, in time to launch before the end of year holiday period where international travel was expected to resume. Since then, Jetpac expanded really quickly: revenue grew 500% between January and November 2024, and it is now available in 200 countries. Jetpac is not just a Singapore product anymore—it’s a global brand. Can you explain why you recently decided to move from MongoDB Community to Atlas for your operations in Singapore? Sure. After several years running and scaling the business on MongoDB Community Edition, we decided that it was time to move to MongoDB Atlas because we wanted to optimize efficiencies and reduce operational costs across Circles’ markets, as well as reduce and ease the regulatory compliance risk and burden posed by the Singapore telco industry. It all started when we decided to run an internal comparison to look into how many resources were spent on maintaining our database internally, versus moving to MongoDB Atlas. We realized that we were running very inefficient clusters—many clusters with only about 10% utilization per cluster. That cost goes to the cloud provider. We found that we could aggregate these clusters into MongoDB Atlas to improve utilization and save money. Another main benefit of moving to MongoDB Atlas was the flexibility and productivity that this offered our engineering and developer team. That is something we hear a lot from our customers, how MongoDB Atlas really helps empower their engineering team. How did that show up for you and your team? MongoDB Atlas amplifies everything very fast. It allows engineers to make mistakes in sandbox environments in a way where they don’t get constrained by wondering, ‘What can I break?’ If they make a mistake, they can spin up a new instance and move forward. That is really valuable. Also, with automation and workflows streamlined under MongoDB Atlas, we have really been able to expedite new projects. For example, I was in India a few weeks ago for one of our new Jetpac B2B projects. We were able to shortcut our process by about a week just because contractors could access MongoDB Atlas and select schemas immediately—no delays in consulting environments! What does the future hold for Circles? What are your priorities? What technologies are you investing in? I think AI is really a big priority for us. Actually, some of our users might already have noticed that our app is starting to evolve into a more AI-powered application— we provide predictions, automatic waivers, personalized special offers, and more. As a company, Circles plans to continue leveraging MongoDB as we scale these AI operations, particularly for retrieval-augmented generation (RAG) projects where we want to rely on MongoDB’s vector search capabilities . I love hearing that and am really looking forward to seeing all of these projects come to life! Thank you so much for being with us, Kelvin. Thank you. Learn more about MongoDB Atlas through the learning hub . Visit our product page to learn more about MongoDB Community Edition.

September 11, 2025
Home

MongoDB Atlas Now Available in the Vercel Marketplace

We are pleased to announce that MongoDB Atlas is now available in the Vercel Marketplace . It’s now easier than ever to leverage MongoDB’s flexible document model, distributed architecture, and versatile built-in search capabilities from within the Vercel ecosystem. The combination of Vercel, the company that provides the tools and infrastructure for developers to build on the AI Cloud, and MongoDB, the world’s leading modern database, creates a supercharged offering that uniquely enables developers to rapidly build, scale, and adapt AI applications. In the words of Tom Occhino, Chief Product Officer, Vercel, "We’re excited to partner with MongoDB to bring Atlas into the Vercel Marketplace. By combining MongoDB’s flexible data platform with Vercel’s focus on developer experience, we’re giving our joint community a faster path to build and scale intelligent applications on the AI Cloud.” Andrew Davidson, SVP Products, MongoDB, added, "Vercel powers many of the best experiences on the web, with an exceptional focus on developer experience from open source to their AI Cloud. We are thrilled to be launching onto the Vercel Marketplace, supercharging our joint community with the power of MongoDB's flexible document model with integrated search and vector search." The Vercel Marketplace is the best way for developers hosting web applications on Vercel to manage third-party dependencies. It’s easy to use—with a few clicks, developers can integrate tools for analytics, authentication, logging, testing, and more. Now that MongoDB is part of the Marketplace, you can simply follow the intuitive deployment process and let MongoDB Atlas persist data for web applications built on Vercel. Vercel: Build and deploy on the AI Cloud AI-powered applications and tools have fundamentally changed the landscape of web development. Developers are increasingly tasked with building applications that can process unstructured data, adapt to changing requirements, and scale dynamically—all while maintaining the speed and reliability users expect. Traditional relational databases and hosting solutions can fall short in this new paradigm, creating friction that slows development and limits innovation. Vercel has carved for itself a key place in this new landscape. Originally known for creating and maintaining Next.js , one of the most popular frameworks for building web applications, Vercel has built on that early success to evolve far beyond its frontend origins. The Vercel Marketplace serves as a central hub where developers can discover and manage their third-party services. v0 by Vercel lets developers turn their ideas into interactive web apps, fast, with AI that generates production-grade code from natural language. And Vercel’s AI SDK provides a free, open-source library that gives developers the tools they need to build AI-powered products. The whole ecosystem is incredibly powerful. Anything you create with v0 can be deployed to Vercel. The Marketplace creates a frictionless experience for integrating disparate tools and services, including MongoDB Atlas, without leaving the Vercel ecosystem, further simplifying deployments. Clearly, Vercel’s scope has grown to be a one-stop shop for the creation and hosting of web applications, an “AI Cloud.” MongoDB enhances the Vercel experience MongoDB and Vercel share a commitment to the developer experience, freeing up developer time to focus on developing rather than getting bogged down with infrastructure concerns. 
MongoDB, with its flexible data model, distributed architecture, and versatile search functionality, acts as a natural complement to Vercel, a classic case of the whole being greater than the sum of its parts. MongoDB’s document model allows developers, both human and agentic, to model their domain intuitively and work with both structured and unstructured data, enabling fast iteration and powerful abstractions. Via sharding and replica sets, MongoDB scales up to meet developer needs, offering an easy-to-use, performant, and scalable developer experience at the data layer to accompany Vercel’s scalability at the application layer. MongoDB offers a myriad of ways to search your data (vector, semantic, even hybrid), meeting the requirements for virtually any AI use case. The Marketplace integration means developers can provision a MongoDB Atlas database directly from their Vercel dashboard, configure connections, and start building—all without context switching between different platforms or dealing with complex setup procedures and fractured billing. It’s never been easier to use MongoDB Atlas with Vercel. Looking ahead This integration marks a key milestone in the deepening partnership between MongoDB and Vercel. As both companies continue to grow in the AI space, developers can expect even more powerful tools and capabilities from the partnership. The combination of MongoDB Atlas and Vercel provides a strong foundation for developers who want to build the next generation of web and AI applications, simply and scalably. Get started today and experience how MongoDB Atlas and Vercel can supercharge your application development workflow. Interested in trying Vercel and MongoDB together? Take a look at our documentation, or install the integration directly from the Vercel Marketplace.
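As a quick illustration of how little glue the integration requires, here is a minimal sketch of a Next.js route handler on Vercel reading from Atlas. It assumes the connection string is exposed to the project as the MONGODB_URI environment variable (confirm the exact name your Marketplace integration sets) and uses illustrative database and collection names; caching the client across warm invocations is a common serverless pattern rather than anything Vercel-specific.

// app/api/items/route.ts
import { MongoClient } from "mongodb";

// Assumed environment variable; check your Vercel project settings for the real name.
const uri = process.env.MONGODB_URI!;

// Cache the client promise so warm serverless invocations reuse the connection.
let clientPromise: Promise<MongoClient> | undefined;
function getClient(): Promise<MongoClient> {
  if (!clientPromise) {
    clientPromise = new MongoClient(uri).connect();
  }
  return clientPromise;
}

export async function GET() {
  const client = await getClient();
  // Illustrative namespace: database "app", collection "items".
  const items = await client
    .db("app")
    .collection("items")
    .find({}, { projection: { _id: 0 } })
    .limit(20)
    .toArray();
  return Response.json(items);
}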

September 10, 2025
Home

How MongoDB Helps Your Brand Thrive in the Age of AI

The Zero Moment of Truth (ZMOT) was coined by Google to describe the moment when a user researches a product online before buying—typically through search, reviews, or videos. In a world where AI agents are intermediating shopping decisions (such as through assistant bots, personal agents, or even procurement AIs), the traditional concept of ZMOT starts to break down, because: The “moment” is no longer directly human. The “truth” might be algorithmically filtered. The user delegates the decision process (partially or fully) to an agent. For retailers, this isn't a minor trend—it’s a "change everything" moment. The traditional customer journey is being radically rewired. For decades, the battle was to win the top spot on a search engine results page. But what happens when the customer isn't a person searching, but is instead an AI agent executing a command like, "Buy me the best-value noise-canceling headphones"? If your brand isn't visible to that agent, you are, for all practical purposes, invisible. The brands that will win in this new landscape are the ones that can make their products and services discoverable and transactable not just by humans, but by AI. This shift presents a profound challenge that goes beyond marketing. Brands are shifting their direct relationship with the customer, handing it over to an AI intermediary. Traditional strategies built for human psychology and search engine algorithms become obsolete when the shopper is an AI agent. The core challenges are therefore immense: How do you build trust with an algorithm? How do you communicate your brand's value in a machine-readable format? And most importantly, how do you ensure your product is the one an agent selects from a sea of competitors? This article is meant to provide you with clarity on what the future of online shopping will look like, how your brand will be affected by this new paradigm and why the MongoDB document model is the best underlying tool for organizing and exposing your product catalog to this upcoming agentic ecommerce era. So, how might we rename or reframe ZMOT for this agent-mediated paradigm? To understand this shift, let's first clarify what we mean by 'agentic AI' and 'agents.' Agentic AI refers to artificial intelligence systems capable of acting autonomously to achieve specific goals on behalf of a user, often by interacting with various tools and services. An 'agent' in this context is the specific AI entity that performs these actions. For example, imagine telling your AI assistant, ' Book me a flight to London next month within a £500 budget, departing in the morning .' An AI agent would then autonomously search, compare, and potentially book the flight for you, acting as your personal delegate. Ever since reading the news of OpenAI naming Instacart’s CEO their new Head of Applications, I haven’t stopped thinking about what this will mean for the world of e-commerce and (yes, I’m a millennial) how the term “googling” came to be and became part of our zeitgeist in the early 2000s. The world of e-commerce is on the brink of a similar paradigmatic shift. For years, brands have poured resources into search engine optimization (SEO), battling for coveted spots on search engine results pages. But what if the search engine as we know it gets disrupted? What if, instead of searching, customers simply ask an AI to find and buy for them? This isn't a far-off futuristic fantasy. It's happening now. 
With the rise of powerful AI assistants like OpenAI's Improved Shopping Results from ChatGPT Search and the new Operator agent, we are entering a new era of "agentic commerce." This is the Agentic Moment of Truth (AMOT): the precise point at which an autonomous agent, acting on behalf of a user, synthesizes data, context, and intent to make or recommend a purchase decision. For retailers, this is a "change everything" moment. The traditional customer journey, from discovery to purchase, is being radically rewired. The brands that will win in this new landscape are the ones that can make their products and services discoverable and transactable not just by humans, but by AI agents. Figure 1. Evolution of the customer journey thanks to agentic AI. The new customer flow: From ZMOT to AMOT For over a decade, marketers have been obsessed with the ZMOT. But, AI agents are collapsing the ZMOT. Instead of a human spending hours browsing websites, reading reviews, and comparing prices, an AI can do it in seconds. This new customer flow, driven by agents, looks something like this: The prompt: A user gives a natural language command to their AI assistant, like, "Find me the best noise-canceling headphones for under $200 with good battery life." The agent's work: The AI agent, like OpenAI's Operator, goes to work. It doesn't just crawl the web in the traditional sense. It interacts with various services and APIs to gather information, compare options, and make a recommendation. The transaction: Once the user approves the recommendation, the agent can complete the purchase, all without the user ever visiting a traditional e-commerce website. This shift has profound implications for retailers. If your brand isn't "agent-friendly," you're essentially invisible in this new world of commerce. So, how do you make your brand discoverable and transactable by AI agents? The answer is to build a remote MCP server. But what exactly is an MCP server, and what are the operational challenges for an e-commerce business in deploying one? An MCP (Model Context Protocol) server is an open standard that allows AI models to connect to and interact with external tools and data sources. Think of it as a universal language for AI. In our context, think of it as a universal translator that enables AI agents to understand and use your product catalog, inventory, pricing, and even checkout functionalities. While this is suitable for internal agentic applications, how can you provide third-party online agents with real-time, up-to-date, and commercially strategic product data? This is where a remote MCP server , powered by technologies like MongoDB Atlas , becomes not just a nice-to-have, but a mission-critical component of your tech stack. However, creating and deploying such a server generates significant operational challenges for an e-commerce business. You need to manage complex, dynamic data structures for product information, rapidly adapt to new AI agent requirements, ensure your infrastructure can scale globally and reliably, and, critically, protect sensitive customer and product data. By creating your own remote MCP server, you can expose your product catalog, inventory, pricing, and even checkout functionality to AI agents in a structured, machine-readable format, and MongoDB Atlas directly addresses these operational hurdles: Superior architecture (the document model): E-commerce data is inherently varied and complex, with products having diverse attributes. 
The flexible document model of MongoDB Atlas allows you to store product information in a rich, nested structure that mirrors real-world objects. Innovate faster: With the agility of the document model and MongoDB Atlas's developer-friendly environment, your teams can respond to the dynamic needs of agentic commerce at an unprecedented pace. You can rapidly iterate on how your product data is exposed and consumed by AI agents, testing new features and optimizing agent interactions without time-consuming database migrations or refactoring. This speed is crucial in a fast-evolving AI landscape. Build once, deploy everywhere: E-commerce demands low-latency access for agents and users across diverse geographic locations. MongoDB Atlas offers multi-cloud and multi-region deployment options, allowing you to deploy your remote MCP server and product catalog close to your agents and customers, wherever they are. This global distribution capability minimizes latency and ensures high availability, overcoming infrastructure management complexities and guaranteeing that your brand is always transactable. Built-in enterprise security: Exposing your valuable product catalog and transactional capabilities to AI agents requires robust security. MongoDB Atlas provides comprehensive, built-in enterprise-grade security features, including encryption at rest and in transit, network isolation, fine-grained access controls, and auditing. This ensures that your data is protected from unauthorized access and cyber threats, mitigating the significant security challenges associated with opening your systems to external AI interactions. Why retailers must act now The shift to agentic commerce is not a question of if, but when. The MCP Registry, a public directory for AI agents to discover MCP-compliant servers, is set to launch in the fall of 2025. This will be the "yellow pages" for AI agents, and if your brand isn't listed, you'll be left behind. Discover how MongoDB powers the future of retail and helps brands thrive in the age of AI. Learn more about MongoDB for Retail . Ready to boost your MongoDB skills? Visit the Atlas Learning Hub to get started.
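To ground the idea, here is a minimal sketch of the kind of catalog lookup a remote MCP server could expose to agents as a tool. The MCP wiring itself (tool registration and transport) is omitted, and the document shape, index name, and field names are illustrative assumptions rather than a prescribed schema.

import { Collection, Document } from "mongodb";

// Illustrative product document, with variant attributes nested in one place:
// {
//   sku: "HP-200-BLK",
//   name: "Noise-canceling headphones",
//   price: { amount: 179, currency: "USD" },
//   attributes: { batteryHours: 35, color: "black" },
//   inventory: { inStock: true, quantity: 42 }
// }

// A "search_products"-style lookup an MCP tool handler could call.
export async function searchProducts(
  products: Collection<Document>,
  query: string,
  maxPriceUsd?: number
) {
  const priceFilter: Document = maxPriceUsd
    ? { "price.amount": { $lte: maxPriceUsd }, "price.currency": "USD" }
    : {};
  return products
    .aggregate([
      // Assumes an Atlas Search index named "default" over the catalog fields.
      { $search: { index: "default", text: { query, path: ["name", "attributes.color"] } } },
      { $match: priceFilter },
      { $limit: 5 },
      { $project: { _id: 0, sku: 1, name: 1, price: 1, "inventory.inStock": 1 } },
    ])
    .toArray();
}

// Example wiring, assuming the connection string lives in MONGODB_URI:
// const client = await new MongoClient(process.env.MONGODB_URI!).connect();
// const hits = await searchProducts(
//   client.db("store").collection("products"),
//   "noise-canceling headphones",
//   200
// );

Wrapped in an MCP tool definition and served over the protocol's remote transport, a function like this is what lets an agent answer a prompt such as "best-value noise-canceling headphones under $200" directly from your live catalog.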

September 9, 2025
Artificial Intelligence

MongoDB Engineering: Expanding Our Presence in Greater Toronto

Toronto has long been recognized as one of North America's fastest-growing tech hubs , boasting a diverse, world-class talent pool and a vibrant startup culture. As MongoDB continues to expand globally, Toronto stands out as a strategic location to drive engineering excellence, foster innovation, and cultivate a collaborative culture. We're currently hiring for three key product areas in the greater Toronto area: Identity and Access Management (IAM), Atlas Stream Processing, and Atlas Search. At MongoDB, our engineers are empowered to solve complex problems, take ownership of their work, and collaborate with world-class colleagues to build the future of data. As we scale our presence in Toronto, we aim to create an environment where local engineers can grow their careers, work on cutting-edge technology, and have a meaningful impact on the products that enable organizations around the globe to build the applications of today and tomorrow. Why Toronto? "Toronto is well known as one of the largest tech hubs in North America. We're constantly looking to attract the most talented engineers to work with, and are really excited about expanding into Toronto, which has previously been untapped," said Kevin Rosendahl , Director of Engineering for Atlas Search and Vector Search. The decision to invest in Toronto is strategic. According to Tim Sedgwick , Vice President of Engineering for Atlas Stream Processing and App Services, "It enables us to increase engineering capacity responsibly, access high-velocity teams, and establish an innovation hub that mirrors our company’s values. We’re building a long-term hub here, and we want top engineers shaping that foundation with us." Meet the teams Identity and Access Management (IAM) The IAM team at MongoDB is responsible for managing customer identities and access to MongoDB products. "If we're doing our job well, we're making you safe and secure and not getting in your way," said Harry Wolff , Director of Engineering for Atlas IAM. "We own login, registration, SSO for other teams within MongoDB, and provide features like customer federation so large companies can securely log in with their own credentials." IAM is becoming a key differentiator in MongoDB's ability to land major enterprise customers. "We're going from a means to an end to a concrete dependency that unblocks major enterprise deals," Wolff said. "Our security bar is growing higher, and the work we do actively contributes to signing major contracts. I find that really exciting." The new team in Toronto will focus on building a new enterprise-grade information architecture. "Right now, one company could have 50-plus organizations in Atlas. We're building an umbrella layer to consolidate resources, configure access at scale, and give customers greater auditability and control," said Wolff. "We're building brand new functionality that enterprise customers are asking for." Wolff also emphasized career development and growth: "I joined MongoDB as a senior UI engineer, helped start the IAM team, and now I’m a Director. The company invests in its people, and this Toronto team will have the opportunity to grow alongside the product and make their mark." Atlas Stream Processing Atlas Stream Processing enables developers to continuously process streams of data using the MongoDB aggregation framework. It simplifies the creation of event-driven applications by eliminating the need for specialized infrastructure, allowing developers to stay within the MongoDB ecosystem. 
"Stream Processing is core to powering modern, event-driven applications and delivering value from streaming data," said Tim Sedgwick. "Our goal is to meet developers where they are and make it easy to build with MongoDB." The product has been generally available for just over a year, and there is a lot of exciting work on the horizon. "Some of the things we're focusing on this year include new sources and sinks, like Kinesis and Apache Iceberg, user-defined functions (UDFs), and distributed processing," Sedgwick said. "We're still early in the product lifecycle, and we're constantly learning from customers to deliver immediate impact." The Stream Processing team is around 50 people, distributed across the U.S., and now expanding into Toronto. "It's a strategic growth lever for us. We're creating a long-term innovation hub here," Sedgwick said. "Toronto engineers will be shaping the foundation of this product." Career growth is deeply embedded in the team culture. "I was the founding lead engineer in Austin for Atlas App Services. That experience of helping grow a new engineering hub was invaluable in my career. Now I lead engineering for Stream Processing," Sedgwick shared. "Joining MongoDB in Toronto could be a similar launchpad for someone else's journey." Atlas Search Atlas Search and Atlas Vector Search provide developers with built-in, relevance-based retrieval capabilities in the MongoDB database. This eliminates the need to sync data with external search engines, allowing teams to focus on building their applications. "Search at MongoDB is a fascinating place to be right now," said Kevin Rosendahl. "We're providing cutting-edge capabilities that power AI applications, large-scale systems, and we're making those tools more accessible across all MongoDB deployments." The team is distributed across major U.S. tech hubs and is now expanding into Toronto. Rosendahl explained, "We look for engineers excited about collaborating on complex, large-scale systems. The goal is to make powerful tools simple and intuitive for developers." A major focus in the coming year is on integrating capabilities from Voyage AI , a recent MongoDB acquisition. "We're bringing intelligent, AI-powered search out of the box," Rosendahl said. "And we're making sure these tools are available for developers everywhere, whether they use our managed service or deploy MongoDB on their own." Rosendahl’s own growth at MongoDB reflects the opportunities available: "I started as an individual contributor helping launch Atlas Search. I becamea lead engineer, then a staff engineer on a research team, and now I'm a Director of Engineering. MongoDB has supported my career development every step of the way." "Our engineers prioritize working together to build the right thing. That creates a culture that values collaboration, communication, and low ego," he added. "We're always looking for the next generation of leaders." Why join MongoDB? Joining MongoDB's engineering organization means becoming part of a culture rooted in trust, innovation, and impact. Our engineers are encouraged to take initiative, pursue curiosity, and help shape the future of software development. "We always look for culture adds—people who make us better, not just the same," said Harry Wolff. "Diversity of perspective and opinions is important at MongoDB ." 
Whether you're passionate about redefining access control, building intelligent data pipelines, or scaling AI-powered search, MongoDB offers the opportunity to work on industry-defining products alongside some of the most talented and driven people in tech. "MongoDB can be a defining moment in your career - through the unique set of challenges you’ll solve and the amazing people you’ll work with," said Sedgwick. "Above all, what makes MongoDB great is the people." Learn more about #LifeAtMongoDB and join us in building the future of data— become part of our talent community today . Visit our careers page to check out our open roles.

September 4, 2025
Culture
