MongoDB Blog
Announcements, updates, news, and more
New Research Reveals Overcoming Legacy Tech Issues Key to AI Success
This guest post comes from IDC’s Dr. William Lee, Senior Research Director, Service Provider and Core Infrastructure Research. MongoDB commissioned IDC to explore the connection between legacy infrastructure, data challenges, and AI across Asia Pacific, and today we’re happy to share that work. For more, see the full MongoDB-sponsored IDC InfoBrief, Modernizing Legacy: Winning in the Age of AI, Doc #AP242555-IB, April 2026.
Plugging the Gap in Automotive Data Interoperability
Imagine if every electric vehicle (EV) came with its own dedicated charging connector unique to its brand or model. Similar to the early days with mobile phones, charging operators would need to support a range of incompatible plugs—leaving drivers to wonder whether they could charge at a given station. Managing this disparity would quickly become impractical, slowing the ecosystem’s growth.
MongoDB as the Mandate Ledger for Agentic Commerce: Supporting A2A, AP2 & UCP
Agentic commerce is here! Retailers and technologists are faced with the task of creating new architectures to support trustworthy, secure, and auditable agentic commerce, and the tech sector has moved quickly to meet this challenge with a new wave of agentic protocols: following the launch of the Agent to Agent Protocol (A2A) in April 2025, Google launched the Agents Payments Protocol (AP2) in September 2025, followed by the Unified Commerce Protocol (UCP) in January 2026.
Improved Multitenancy Support in Vector Search: Introducing Flat Indexes
The future of AI is personal. The more accustomed to AI tools users are, the more they want their experience of working with them to be personalized and agentic. Whether it is an AI assistant recalling your past conversations, a legal tool reviewing a specific company's contracts, or a personal knowledge base searching through your private documents, these applications all rely on one core capability: providing "memory" specific to a single user or business.
LoanOptions.ai Scales AI-driven Finance Broking With MongoDB Atlas
Finance brokerage processes are known for being clunky and complex. The industry has traditionally been paper-heavy and manually intensive, with opaque and confusing credit policies, and it often takes weeks for borrowers to get an answer on whether their loan is approved.
Introducing MongoDB Agent Skills and Plugins for Coding Agents
Software engineering is evolving into agentic engineering. According to the Stack Overflow Developer Survey 2025, 84% of respondents use or plan to use AI tools in their development, up from 76% the previous year. At this rate, the tooling needs to keep pace. Last year, we introduced the MongoDB MCP Server to give agents the connectivity they need to interact with MongoDB, helping them generate context-aware code. But connectivity was only the start. Agents are generalists by design, and they don't inherently know the best practices and design patterns that real-world production systems demand.

Today, we're addressing this by introducing official MongoDB Agent Skills: structured instructions, best practices, and resources that agents can discover and apply to generate more reliable code across the full development lifecycle, from schema design and performance optimization to implementing advanced capabilities like AI retrieval. To bring this directly into the tools you use, we're also launching plugins for Claude Code, Cursor, Gemini CLI, and VS Code, combining the MongoDB MCP Server and Agent Skills in a single, ready-to-use package.

Turning coding agents into MongoDB experts
Coding agents are great at producing working code, but they still make common mistakes in production systems, often defaulting to relational thinking that doesn't translate well to MongoDB, such as:
- Over-normalizing schemas, ignoring MongoDB's document-oriented strengths.
- Underusing compound indexes, causing performance bottlenecks at scale.
- Misusing indexes and search indexes, overlooking the consistency trade-offs of high-performance full-text search.
Because these pitfalls mirror common human errors, they are naturally reflected in agent outputs. MongoDB Agent Skills address this by providing expert guidance to agents, like schema design heuristics, indexing strategies, query patterns, and operational safeguards, enabling agents to ship more reliable, more consistent code faster.
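To make the first two pitfalls concrete, here is a minimal illustrative sketch (collection and field names are hypothetical, not from the Agent Skills themselves) contrasting a relationally normalized layout with an embedded document, and a compound index ordered by MongoDB's equality-sort-range (ESR) guideline. The PyMongo call is shown in a comment because it requires a live cluster:

```python
# Over-normalized (relational habit): separate collections that force
# $lookup joins for data that is always read together.
order_normalized = {"_id": 1, "customer_id": 42}
order_items = [{"order_id": 1, "sku": "A-100", "qty": 2}]

# Document-oriented: embed the line items inside the order, so a single
# read returns everything the application needs.
order_embedded = {
    "_id": 1,
    "customer_id": 42,
    "items": [{"sku": "A-100", "qty": 2}],  # one read, no join
}

# Compound index following the equality-sort-range (ESR) guideline:
# equality match on customer_id first, then the sort key.
# With PyMongo against a real cluster this would be:
#   collection.create_index([("customer_id", 1), ("created_at", -1)])
compound_index = [("customer_id", 1), ("created_at", -1)]
```

A query such as "the latest orders for customer 42" can then be served entirely from the index, instead of scanning and sorting in memory.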
Agent Skills were introduced by Anthropic as an open standard and have since been adopted by leading AI development tools, including Claude Code, Cursor, Codex, and more. This initial release covers the full application development lifecycle on MongoDB, from connection management and schema design to guidance on implementing advanced capabilities. We will continue to update and expand our skills library based on user needs.

Figure 1. MongoDB Agent Skills.

Scaling agentic engineering with MongoDB
As organizations embrace agentic software engineering, existing processes and workflows must be reimagined. The MongoDB MCP Server and MongoDB Agent Skills are built for this shift and work best together, giving builders and agents the tools to move fast without sacrificing guardrails or control.

The MongoDB MCP Server serves as the connectivity layer for your MongoDB deployments. It manages authentication and defines exactly what agents can access and do. Combined with MongoDB's native authorization, it ensures agents operate with only the permissions they need, while giving teams governance through configurable controls like disabling specific tools.

Agent Skills ensure agents follow best practices from the start, reducing architectural risk, accelerating implementation, and raising the baseline quality of all agent-generated code. While some skills can be used independently, others work in conjunction with the MongoDB MCP Server for workflows that require it. To simplify setup, the MCP Server and skills are now packaged together as plugins and extensions for Claude Code, Cursor, Gemini CLI, and VS Code, bringing these capabilities directly into your preferred tools.

Figure 2. MongoDB for Claude plugin in action.

We also encourage you to build your own skills as your agentic workflows mature.
Whether enforcing internal naming conventions, custom data modeling patterns, or team-specific workflows, skills give you a practical way to codify institutional knowledge and ensure every agent and every developer works from the same playbook.

How to get started
Whether you're using Claude Code, Cursor, Gemini CLI, or other AI development tools, you can install the MongoDB MCP Server and Agent Skills in seconds. For example, in Claude Code, install the plugin that bundles both:

/plugin marketplace add mongodb/agent-skills
/plugin install mongodb@mongodb-plugins

For Cursor, Gemini CLI, and VS Code extensions, refer to their respective documentation. You can also install the skills for most coding agents using the Vercel Skills CLI (requires Node.js):

npx skills add mongodb/agent-skills

If you prefer, you can manually clone the GitHub repository and copy the skills into the appropriate folder for your agent. Similarly, to install the MongoDB MCP Server, use the following command:

npx mongodb-mcp-server@latest setup

Agentic engineering is changing how teams work, and it is changing fast. Agents need context and guidance to meet the standards of real-world production applications. With the official MongoDB Agent Skills and plugins, builders can move faster with confidence, and organizations can adopt coding agents knowing that MongoDB best practices are embedded directly into every workflow.

Next steps
Ship faster, more reliable apps on MongoDB with Agent Skills. Install for Claude Code, Cursor, Gemini CLI, and VS Code!
Zomato Cuts $11M in Support Costs With MongoDB-Powered AI Platform
With more than 25 million active monthly users—and hundreds of millions of food delivery orders annually—Indian-born Zomato is the world's second-largest food delivery company. At the heart of the business’s success is Zomato’s ability to seamlessly scale, manage complex data, and build innovative AI-powered applications at pace.
The Modern End-to-End Digital Lending Journey Powered by MongoDB and Agentic AI
Traditional lending systems rely on disconnected legacy applications that were never designed for real-time data, automation, or digital-first customer experiences. Today, customers expect instant decisions, seamless digital experiences, and immediate transparency, while lenders must manage rising risk, regulatory pressure, and data complexity. Modern digital lending platforms are transforming this reality by unifying origination, decisioning, funding, and servicing into a single, intelligent workflow. In this article, we break down the end-to-end digital lending lifecycle and show how data-driven architectures are redefining how loans are created, approved, funded, and managed instantaneously.
Observability and OpenTelemetry: Introducing MongoDB Atlas Log Integration
In high-stakes enterprise environments, outages do not wait for business hours, and neither do IT and network operators. A latency spike hits the dashboard, and metrics signal that the database is under pressure. The cause? Indeterminate. Meanwhile, the business impact is immediate: orders fail to process, customers can't access accounts, transactions stall, and critical records become temporarily unavailable. Every minute of uncertainty translates into lost revenue, frustrated users, and escalating pressure.

Teams often fall back on a familiar, yet time-consuming, ritual: logging into their data platform, exporting large log files, extracting compressed archives, and manually searching through thousands of lines of entries to identify the issue. What should be a quick diagnosis becomes a manual, context-switching investigation. By the time the problematic query, configuration issue, or audit event is identified, users have already experienced the disruption, and the business has absorbed the cost.

MongoDB believes the database should be the heartbeat of a digital business. So we're introducing a new log integration that brings MongoDB Atlas system and audit logs directly into external observability and storage platforms. This enhancement helps bridge the gap between metrics and meaning when it matters most.

Flexible log delivery for modern observability workflows
Now database operators, DevOps pros, and IT operations teams alike can send MongoDB system and audit logs (mongod, mongos, and audit) directly to the tools they already rely on: Datadog, Splunk, Google Cloud Storage, Azure Blob Storage, or Amazon S3. Beyond native integrations, MongoDB supports sending logs via OpenTelemetry (OTel), the open-source standard for collecting and transmitting telemetry data. This enables customers to export MongoDB logs to any observability or logging backend that supports OTel.
By using a vendor-neutral, standards-based protocol, MongoDB fits seamlessly into modern observability architectures. This eliminates lock-in and preserves flexibility as tooling strategies evolve.

Enabling real-time clarity
Modern enterprises generate rich system logs essential for debugging and compliance. However, when these logs are siloed, operational inefficiencies grow. Manual log access introduces friction, delays resolution, and creates a visibility gap between metrics and logs. MongoDB's new log integration transforms that experience with:
- Accelerated troubleshooting: Send logs in near real time to observability platforms like Datadog, Splunk, or OpenTelemetry-compatible backends, enabling teams to quickly identify issues and reduce the manual operational steps that slow incident resolution.
- Unified telemetry: Correlate MongoDB logs with application traces and infrastructure metrics in existing observability platforms, helping teams quickly understand how database behavior impacts overall system performance.
- Simplified compliance: Automatically route audit logs to secure long-term storage such as Amazon S3, helping organizations meet regulatory and audit requirements without manual log management.

Figure 1. Atlas Log Integration configuration options for delivering MongoDB logs to observability and storage platforms.

Real-world use cases
How does this look in practice for modern application, operations, and engineering teams?

The criticality of observability
As applications scale, the database becomes the most critical layer of an organization's technology stack. Missing or siloed visibility leads to costly downtime and fragmented decision-making.

This log integration is available for dedicated M10+ clusters. An external sink can be configured in minutes:
1. Navigate to the Project Integrations page in the MongoDB Atlas UI.
2. Select the intended destination: Datadog, Splunk, Google Cloud, Microsoft Azure, Amazon S3, or any OTel log endpoint.
3. Enter the required credentials and select the desired logs to send: mongod, mongos, or audit.

Note: Atlas Search logs are also currently available via private preview.

Figure 2. MongoDB Atlas logs integrated into an OpenTelemetry observability pipeline.

One observability strategy, built to scale
For teams that need fast, MongoDB-centric visibility, MongoDB Atlas continues to offer powerful native tools like Query Insights and the Query Profiler. These capabilities are designed to surface what is happening inside a user's clusters with minimal friction. However, as organizations scale, database insights cannot live in isolation. MongoDB Atlas's log integration extends observability systematically to the data plane, enabling MongoDB logs to flow into the observability platforms teams already use across engineering, security, IT operations, and compliance. With native integrations and an OpenTelemetry-compatible endpoint, teams can route logs wherever they are needed, supporting rapid troubleshooting, stronger auditability, and confident scaling without blind spots.
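Once delivered, mongod entries arrive as the structured JSON lines that mongod has emitted since MongoDB 4.4, which makes downstream filtering straightforward. As a minimal sketch, here is how a backend-side script might pick out slow-query entries; the sample line is abridged and its values are illustrative, not real Atlas output:

```python
import json

# Abridged mongod structured log line (one JSON object per line since 4.4).
# Field values here are illustrative.
sample = (
    '{"t":{"$date":"2026-01-15T09:30:00.123+00:00"},"s":"I","c":"COMMAND",'
    '"ctx":"conn42","msg":"Slow query","attr":{"durationMillis":1500}}'
)

def slow_queries(lines, threshold_ms=100):
    """Yield (connection context, duration in ms) for slow-query entries."""
    for line in lines:
        entry = json.loads(line)
        duration = entry.get("attr", {}).get("durationMillis")
        if entry.get("msg") == "Slow query" and duration and duration > threshold_ms:
            yield entry["ctx"], duration

hits = list(slow_queries([sample]))  # [("conn42", 1500)]
```

In practice an observability platform would apply the same kind of filter declaratively, but the structured format is what makes queries like this reliable compared to grepping free-form text logs.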
Atlas Stream Processing Now Supports Apache Avro With Schema Registry
MongoDB Atlas Stream Processing now supports Apache Avro serialization when integrated with the Confluent Schema Registry, removing key barriers that have made migrating streaming workloads difficult. You no longer have to choose between the flexibility of MongoDB and the performance of binary serialization. Whether you’re building real-time fraud detection, monitoring IoT sensor grids, or synchronizing microservices, MongoDB Atlas Stream Processing provides the tools to do it with confidence and at scale.
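As background on why binary serialization performs well: the Avro specification encodes int and long values with variable-length zig-zag encoding, so small magnitudes occupy a single byte instead of a fixed-width field. A minimal sketch of that encoding follows; it is an illustration of the Avro wire format, not Atlas Stream Processing code:

```python
def zigzag_encode(n: int) -> bytes:
    """Avro-style zig-zag + varint encoding of a signed 64-bit value.

    Zig-zag maps 0, -1, 1, -2, ... to 0, 1, 2, 3, ... so small negative
    numbers also encode compactly; the varint then emits 7 bits per byte.
    """
    z = (n << 1) ^ (n >> 63)  # zig-zag map (Python's >> sign-extends)
    out = bytearray()
    while True:
        byte = z & 0x7F
        z >>= 7
        if z:
            out.append(byte | 0x80)  # set continuation bit
        else:
            out.append(byte)
            return bytes(out)

def zigzag_decode(data: bytes) -> int:
    """Inverse of zigzag_encode: varint decode, then undo the zig-zag map."""
    z, shift = 0, 0
    for b in data:
        z |= (b & 0x7F) << shift
        shift += 7
        if not (b & 0x80):
            break
    return (z >> 1) ^ -(z & 1)
```

For example, `zigzag_encode(-1)` is the single byte `b'\x01'`, where a fixed 64-bit integer would spend eight bytes; that compactness, multiplied across every field of every event, is a large part of Avro's throughput advantage over JSON on high-volume streams.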