MongoDB Blog
Announcements, updates, news, and more
Unlocking Literacy: Ojje’s Journey With MongoDB
In the rapidly evolving landscape of education technology, one startup is making waves with a bold mission to revolutionize how young minds learn to read. Ojje is redefining literacy education by combating one of the most pressing issues in education today—reading proficiency. To do so, Ojje leverages groundbreaking technology to ensure every child can access the world of stories, at their own pace, in their own language. That transformative change is powered by a strategic partnership with MongoDB . Meet Ojje: A vision beyond words From electric cars to diabetes apps, Adrian Chernoff has been at the forefront of breakthrough innovations. Now, as the Founder and CEO of Ojje , he's channeling his passion for invention and entrepreneurship into something deeply personal and universally important—literacy. At its core, Ojje is an adaptive literacy learning platform that offers stories in 15 different reading levels, available in both English and Spanish. Grounded in the science of reading, it features elements like read-aloud functionality and dyslexia-friendly fonts to engage every learner. Ojje is not just a tool—it’s a gateway to personalized literacy education. Ojje's mission is to reach every learner by providing materials that are leveled, accessible, and engaging. By doing so, Ojje aims to vastly improve reading outcomes across K-12 education. Solving a literacy crisis with innovative solutions With literacy rates in the U.S. alarmingly low—almost 70% of low-income fourth grade students cannot read at a basic level according to the National Literacy Institute— Ojje's mission couldn't be more crucial. Chernoff and his team developed their platform in response to teachers' complaints about the stark lack of appropriate reading materials available to students. Schools needed a tool that could effortlessly cater to varying reading abilities within a single classroom. Ojje fills this gap by offering a dynamic platform that adapts to individual students’ needs, allowing educators to personalize instruction. The potential to genuinely connect with every student is realized through Ojje’s innovative use of technology. Powered by MongoDB At the root of every great tech innovation is an infrastructure that allows it to flourish. For Ojje, MongoDB is that foundation. As a startup, speed and adaptability are vital, and MongoDB’s flexible document model provides just that. It allows the Ojje team to launch rapidly, scale efficiently, and to handle a variety of data structures seamlessly—all without the cumbersome need for rigid schemas. “MongoDB handles everything from structured data to student performance tracking, without unnecessary overhead,” Chernoff said. “The platform scales with our needs, and the built-in monitoring tools give our team confidence as usage grows.” Why MongoDB? For Ojje, it was about the flexibility to handle educational content, ensure secure data handling for students, and to offer scalability for thousands of classrooms. MongoDB proved to be the perfect fit, offering a balance of adaptability and comprehensive data management. Working with MongoDB also offered Ojje access to the MongoDB for Startups program, providing essential Atlas credits, valuable technical resources, and access to our vast network of partners. This support played a crucial role during Ojje’s developmental stages and early launch, helping to position the company for successful growth and innovation. What’s next for Ojje? 
With an eye towards broadening their impact, the Ojje team plans to expand its library to include STEM materials and engaging biographies, alongside enhancing existing content. Additionally, Ojje will introduce tools for educators to track each reader’s progress in real time, further personalizing instruction. “We believe every student deserves the chance to love reading—and every teacher deserves tools that make that possible,” Chernoff said. “That’s why we’re building Ojje: To make literacy more accessible, engaging, and joyful. When students can learn to read and read to learn, it transforms not only their K–12 experience but their entire future.” In an exciting development, Ojje will soon unveil Ojje at Home. This initiative aims to extend literacy support beyond the classroom, providing families with valuable resources to join their children on the journey to literacy. Building a future where every child reads Ojje's combination of strategic foresight, cutting-edge technology, and genuine passion for educational impact make it a standout player in the education sector. By partnering with MongoDB, the company has created a robust, adaptive platform that not only meets the demands of today’s classrooms but is poised to address future literacy challenges. As the digital landscape continues to evolve, so must our methods of teaching and learning. Ojje is leading the charge, ensuring that every child has the opportunity to love reading and reap the lifelong benefits it brings. Interested in MongoDB but not sure where to start? Check out our quick start guides for detailed instructions on deploying and using MongoDB.
MongoDB Atlas is Now Available as a Microsoft Azure Native Integration
Since 2019, MongoDB and Microsoft Azure have striven to make it easy for enterprises to launch cutting-edge, modern applications. Key to this effort—and to enabling organizations everywhere to make an impact with AI—has been our work integrating MongoDB Atlas with the Microsoft Intelligent Data Platform. Our aim is to give developers a streamlined, fully integrated experience that they’ll love to use. So I’m very happy to announce the public preview of MongoDB Atlas as an Azure Native Integration (ANI). This latest step in MongoDB’s collaboration with Microsoft means that enterprise customers will be able to easily create and manage MongoDB Atlas organizations while also consolidating billing for Atlas within the Azure console, empowering them to interact with MongoDB Atlas as if it were a first-party service from Azure. I am also pleased to announce MongoDB Atlas on Azure Service Connector, one of several new integrations set to follow directly from the MongoDB Atlas as an Azure Native Integration announcement. Azure Service Connector makes it easy for developers to securely connect Azure compute services to backing services like databases, now including MongoDB Atlas. MongoDB’s mission has always been to empower our customers to move fast with data. With MongoDB Atlas as a native integrated service to Azure, we’re unlocking new possibilities for organizations to harness real-time insights, scale globally, and to accelerate their AI-driven roadmaps—all while reducing operational overhead. With Azure’s robust ecosystem of AI and analytics tools, teams can build and innovate with greater confidence, ultimately transforming how they serve their customers and shaping the future of software. "Integrating MongoDB Atlas as a Microsoft Azure Native Integration marks a significant milestone in our partnership with MongoDB. This integration empowers our customers to seamlessly manage their MongoDB Atlas resources within the Azure ecosystem, including unified billing and robust security features,” said Sandy Gupta, Vice President, Global Software Companies Ecosystem, Microsoft. “By simplifying operations and reducing technical complexity, we are enabling organizations to innovate faster and deliver exceptional value to their customers." Why this matters: Accelerated development & seamless operations This streamlined approach reduces technical and organizational complexity, with organizations benefiting from integrated billing, consolidated support, and simplified deployment. Connecting a database platform to external services typically requires juggling multiple portals, credentials, and security configurations. Starting today, with MongoDB Atlas as an Azure Native Integration, organizations can: Create and manage Atlas organizations directly within Azure, including the Azure Portal UI and CLI/SDK/ARM. Enjoy consolidated billing for both Azure and MongoDB Atlas. Access Azure’s AI services, data analytics, and more—all while harnessing the flexible, scalable power of MongoDB Atlas. It’s worth dwelling for a minute on the simplified onboarding and billing component of ANI, one of the biggest benefits of this integration for customers. As an Azure Native Integration, users can create their MongoDB Atlas organization and select their company billing plan directly from Azure, automatically applying the Azure billing plan to the Atlas Organization. 
This is made possible by leveraging Azure's comprehensive suite of billing and cost management tools, providing enterprises with enhanced control and visibility over their expenditures. Benefits of using MongoDB Atlas and Microsoft Azure together This latest MongoDB Atlas integration on Azure builds on a strong foundation of technical collaboration. Together, MongoDB Atlas on Azure already delivers a powerful set of integrations that offer customers and development teams a wide range of benefits, including: Unified workloads: MongoDB Atlas offers a single platform that supports a range of workloads, from transactional, time series, and search, to real-time analytics. With native integration on Azure, teams can quickly build across a wide variety of data-driven use cases. This can range from e-commerce transactions to generative AI applications, all without any re-architecting. Streamline AI integration: Accelerate machine learning (ML) workflows and generative AI projects with minimal configuration. Organizations can connect to Azure AI Foundry, Azure OpenAI Service, Microsoft Fabric, or Azure Databricks for advanced analytics, and MongoDB Atlas automatically scales in response to dynamic workloads. End-to-end security and compliance: MongoDB Atlas integrates with Microsoft Entra ID (formerly Azure AD), Azure Key Vault, and Azure private link for secure single sign-on, encryption key management, and private networking, respectively. With Microsoft Purview, organizations can meet stringent governance and compliance requirements, and teams remain agile without sacrificing enterprise-grade security. Scalability and global footprint: Azure’s extensive regional coverage enables organizations to deploy MongoDB Atlas in 40+ Azure regions worldwide. This ensures data remains close to users for low-latency, high-performance applications. How to deploy MongoDB Atlas as an Azure Native Integration 1. Search for MongoDB Atlas in the Azure Portal and the Azure Marketplace. 2. Create a MongoDB Atlas Organization and choose an Azure billing plan. That’s it! You’ve successfully created an Atlas Organization. From your new Atlas Organization, you can start taking advantage of other Azure services already integrated into MongoDB Atlas: Configure security and network settings using existing Azure Virtual Networks and Azure Private Link, as required. Begin building AI capabilities into applications by connecting to Azure AI Foundry, Azure Databricks, or Microsoft Fabric. Get started with deploying MongoDB Atlas as an Azure Native Integration through our quick start guide .
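For developers connecting from Azure compute once an Atlas organization and cluster exist, the application-side code looks like any other MongoDB connection. The minimal Python sketch below assumes the Atlas connection string is stored as a secret in Azure Key Vault; the vault URL, secret name, and database names are illustrative placeholders, not part of the integration itself.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from pymongo import MongoClient

# Assumed names: replace with your own Key Vault URL and secret name.
VAULT_URL = "https://my-app-vault.vault.azure.net"
SECRET_NAME = "atlas-connection-string"

# DefaultAzureCredential picks up a managed identity on Azure compute,
# or your local Azure CLI login during development.
credential = DefaultAzureCredential()
secrets = SecretClient(vault_url=VAULT_URL, credential=credential)

# Fetch the Atlas URI from Key Vault and connect with the standard driver.
atlas_uri = secrets.get_secret(SECRET_NAME).value
client = MongoClient(atlas_uri)

# Verify connectivity; from here the application uses Atlas like any MongoDB deployment.
client.admin.command("ping")
print(client.list_database_names())
```

Azure Service Connector is designed to take care of this kind of wiring between Azure compute services and MongoDB Atlas automatically; the sketch simply shows what the application code looks like once the connection details are in place.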
OrderOnline: AI Improves Conversion Rate by 56% with MongoDB
Established by Ordivo Group in 2018, OrderOnline has quickly become a driving force behind Indonesia’s thriving social commerce market. OrderOnline offers an end-to-end solution for organizations and individuals selling on social platforms like Facebook Marketplace, typically through social ads, landing pages, and storefronts. OrderOnline built its social commerce platform on the MongoDB Community Edition, and later migrated to MongoDB Atlas in 2022. The platform provides everything from managing orders to handling logistics for companies and individuals selling on social platforms. It addresses common social commerce pain points, such as complex logistics, failed deliveries, and unmanageable order processing due to scale. Speaking at MongoDB.local Jakarta 2024, Wafda Mufti, Vice President of Technology for Ordivo Group, explained how his slogan—“Simple Input, High Accuracy”—drove OrderOnline to become one of Indonesia’s leading social commerce companies. “We have sellers using storefronts, landing pages, and checkout forms. Thanks to MongoDB's flexibility, we can manage these unpredictable business processes. We even store our front-end structure in MongoDB,” said Mufti. “Thanks to MongoDB, we can ensure we have the highest quality of data.” Mufti also shared how the company is using MongoDB Atlas Search and MongoDB Atlas Vector Search to power innovative search and AI use cases.

Scaling social commerce with MongoDB Atlas

Five years after its launch, OrderOnline had grown to 40,000 users and was handling 1.5 million transactions each month. This fast growth led to challenges, particularly around managing data at scale and ensuring high success rates for sellers. Most of OrderOnline’s users drive orders through a wide range of sources, including social ads, landing pages, and storefronts. Many of OrderOnline’s orders are handled via WhatsApp through Click to WhatsApp Ads (CTWA). Initially, managing orders via platforms like WhatsApp was feasible. However, as social commerce became more popular, the volume of orders increased and quickly became overwhelming. Furthermore, for large sellers who do not handle their own products, OrderOnline had to manage order packing and shipping, as well as returns. “We were overwhelmed with orders, but we wanted to manage our SLAs,” said Mufti. “We wanted to ensure products were well-delivered.” MongoDB Atlas’s flexibility has enabled OrderOnline to manage unpredictable business processes and to efficiently handle the various complex tasks associated with order management and logistics. Because MongoDB Atlas is designed for fast iteration, it enables OrderOnline to swiftly adapt its platform in response to changing business needs and user demands. MongoDB Atlas also supports high scalability, which empowers OrderOnline to manage a growing user base and increasing transaction volumes without compromising performance. Additionally, MongoDB Atlas's reliability under high transactional loads ensures that OrderOnline can maintain quick response times—a core part of its SLA. This is critical for maintaining the agility needed in the dynamic world of social commerce. “We have a monitoring system that triggers alarms if response times fall below one second,” noted Mufti. Another critical SLA that OrderOnline tracks is the delivery success rate. Previously, deliveries were only successful 94% of the time.
Using MongoDB Atlas, OrderOnline built OExpress, a service that sellers can use to customize the number of delivery attempts based on specific service agreements. An upper limit of five delivery attempts is also mandated. OExpress closely tracks delivery attempt data, which ensures packages are delivered and minimizes returns and damages. “Thanks to MongoDB, we have achieved a success rate of 98.4%,” said Mufti. “We can manage multiple attempts to deliver to the customers, so sellers don’t have to worry about dealing with delivery issues anymore when using a marketplace.” Beyond deliveries, OrderOnline identified seamless search and customer support integrations as key operations that MongoDB could enhance.

AI and search: conversion rates jump by 56%

As OrderOnline’s business grew, scalability created specific challenges with CTWAs. In particular, OrderOnline’s platform struggled to manage and make sense of the growing volume of inconsistent data types it was receiving, such as locations, postal codes, and product details—accurate input of data is vital to ensuring orders are processed and delivered. “People want [to be able to input] freeform text. They want things to be simple and easy, and not be restricted by rigid formats,” said Mufti. “But we still have to ensure data accuracy.” One of the standout features that helped OrderOnline improve search accuracy and management is MongoDB Atlas Search. Fuzzy search in MongoDB Atlas Search can handle typos when searching for districts. For example, if a user mistypes “Surabya,” Atlas Search will still fetch results for “Surabaya”. Furthermore, synonyms in MongoDB Atlas Search can handle shortened names for provinces and districts in Indonesia. For example, “Jabar” for Jawa Barat or “Jateng” for Jawa Tengah. Acronyms are also handled. “Because there’s AI in the background, there’s no need to manually input zip codes, for example. Our engine can search for it,” said Mufti. “Someone clicks, then places an order, fills out the form, and it goes straight into our order management system, which supports fuzzy search.” As OrderOnline grew, it also needed to scale customer support to 24/7 availability and fast response times. MongoDB Atlas Vector Search supported the development of a seamless and user-friendly interface through the creation of an AI chatbot. The chatbot makes it easier for sellers to manage customer interactions, check stock availability, and calculate shipping costs. “If the ad contains a WhatsApp link, it will be directly managed by the chatbot. The chatbot even checks shipping costs, compares prices, and shows how much it would cost if you purchased five items,” explained Mufti. “The AI handles requests for photos, checks stock availability, and much more. And once a deal is closed, it goes directly into our order management system.” Before the creation of the AI chatbot with MongoDB Atlas Vector Search, the WhatsApp conversion rate was 50%: out of 100 interactions, 50 would successfully close the deal. With the implementation of AI, this rate has increased to 78%. Building on these successes, OrderOnline is now looking at further business and geographic expansion supported by MongoDB’s global reach, with the aim of helping more sellers throughout Indonesia make the best of social commerce. Visit the MongoDB Atlas Learning Hub to boost your MongoDB skills. To learn more about MongoDB Atlas Search, visit our product page. Get started with Atlas Vector Search today through our quick start guide.
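To make the fuzzy matching and synonym handling described above concrete, here is a minimal sketch of an Atlas Search query using the Python driver. The cluster, collection, field, and index names are hypothetical rather than OrderOnline's actual schema, and the two operators are combined in one compound query purely for illustration; fuzzy matching requires a search index on the queried fields, and the synonym expansion requires a synonym mapping (for example, mapping "Jabar" to "Jawa Barat") defined on that index.

```python
from pymongo import MongoClient

# Hypothetical cluster and collection of Indonesian districts.
districts = MongoClient(
    "mongodb+srv://<user>:<password>@cluster.example.mongodb.net"
)["geo"]["districts"]

pipeline = [
    {"$search": {
        "index": "district_search",  # assumed Atlas Search index name
        "compound": {
            "should": [
                # Tolerate typos such as "Surabya" -> "Surabaya".
                {"text": {"query": "Surabya", "path": "name",
                          "fuzzy": {"maxEdits": 1}}},
                # Expand shortened region names via a synonym mapping on the index.
                {"text": {"query": "Jabar", "path": "province",
                          "synonyms": "region_synonyms"}},
            ]
        },
    }},
    {"$limit": 5},
    {"$project": {"_id": 0, "name": 1, "province": 1,
                  "score": {"$meta": "searchScore"}}},
]

for doc in districts.aggregate(pipeline):
    print(doc)
```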
How MongoDB and Google Cloud Power the Future of In-Car Assistants
The automotive industry is evolving fast: electrification, the rise of autonomous driving, and advanced safety systems are reshaping vehicles from the inside out. But innovation isn’t just happening to the drivetrain. Drivers (and passengers) now expect more intelligent, intuitive, and personalized experiences whenever they get into a car. That’s where things get tricky. While modern cars are packed with features, many of them are complex to use. Voice assistants were supposed to simplify things, but most still only handle basic tasks, like setting navigation or changing music. As consumers’ expectations of technology grow, so does pressure on automakers. Standing out in a competitive market, accelerating time to market, and managing rising development costs—all while delivering seamless digital experiences—is no small task. The good news? Drivers are ready for something better. According to a SoundHoundAI study , 79% of drivers in Europe would use voice assistants powered by generative AI. And 83% of those planning to buy a car in the next 12 months say they’d choose a model with AI features over one without. Gen AI is transforming voice assistants from simple command tools into dynamic copilots—able to answer questions, offer insights, and adapt to each user. At CES 2025, we saw major players like BMW, Honda, and HARMAN pushing the boundaries of AI-driven car assistants. To truly make these experiences personalized, you need the right data infrastructure. Real-time signals from the car, user preferences, and access to unstructured content like manuals and FAQs are essential for building truly intelligent systems. By combining gen AI with powerful data infrastructure, we can create more responsive, smarter in-car assistants. With flexible, scalable data access and built-in vector search, MongoDB Atlas is an ideal solution. Together with partners like Google Cloud, MongoDB is helping automotive companies innovate faster and deliver better in-car experiences. MongoDB as the data layer behind smarter assistants Building intelligent in-car assistants isn't just about having cutting-edge AI models—it’s about what feeds them. A flexible, scalable data platform is the foundation. To deliver real-time insights, personalize interactions, and evolve with new vehicle features, automakers need a data layer that can keep up. MongoDB gives developers the speed and simplicity they need to innovate. Its flexible document model lets teams store data the way applications use it—without rigid schemas or complex joins. That means faster development, fewer dependencies, and less architectural friction. Built-in capabilities like time series, full-text search, and real-time sync mean fewer moving parts and faster time to market. And because MongoDB Atlas is built for scale, availability, and security, automakers get the enterprise-grade reliability they need. Toyota Connected , for example, relies on MongoDB Atlas to power its Safety Connect platform across millions of vehicles, delivering real-time emergency support with 99.99% availability. But what really sets MongoDB apart for gen AI use cases is the way it handles data. AI workloads thrive on diverse, often unstructured inputs—text, metadata, contextual signals, vector embeddings. MongoDB’s document model handles all of it, side by side, in a single, unified platform. That’s why companies like Cognigy use MongoDB to power leading conversational AI platforms that manage hundreds of queries per second across multiple channels and data types. 
With Atlas Vector Search , development teams in the automotive industry can bring semantic search to unstructured data like manuals, support docs, or historical interactions. And by keeping operational, metadata, and vector data together, MongoDB makes it easier to deploy and scale gen AI apps that go beyond analytics and actually transform in-car experiences. MongoDB is already widely adopted across the automotive industry, powering innovation from the factory floor to the finish line . With its ability to scale and adapt to complex, evolving needs, MongoDB is helping automakers accelerate digital transformation and deliver next-gen in-car experiences. Architecture that drives intelligence at scale To bring generative AI into the driver’s seat, we designed an architecture that shows how these systems can work together in the real world. At the core, we combined the power of MongoDB Atlas with Google Cloud’s AI capabilities to build a seamless, scalable solution. Google Cloud powers speech recognition and language understanding, while MongoDB provides the data layer with Atlas Database and Atlas Vector Search . MongoDB has also worked with PowerSync to keep vehicle data in sync across cloud and edge environments. Imagine you're driving, and a red light pops up on your dashboard. You’re not sure what it means, so you ask the in-car assistant, “What is this red light on my dashboard?” The assistant transcribes your question, checks the real-time vehicle signals to identify the issue, and fetches relevant guidance from your car’s manual. It tells you what the warning means, whether it’s urgent, and what steps you should take. If it’s something that needs attention, it can suggest adding a service stop to your route. Or maybe switch your dashboard view to show more details. All of this happens through a natural voice interaction—no menus, no guesswork. Figure 1. A gen AI in-car assistant in action. Under the hood, this flow brings together several key technologies. Google Cloud’s Speech-to-Text and Text-to-Speech APIs handle the conversation. Document AI breaks the car manual into smaller, searchable chunks. Vertex AI generates text embeddings and powers the large language model. All of this connects to MongoDB Atlas, where Atlas Vector Search retrieves the most relevant content. Vehicle signals are kept up to date using PowerSync, which enables real-time, bidirectional data sync. And, by using the Vehicle Signal Specification (VSS) from COVESA, we’re following a widely adopted standard that makes it easy to expand and integrate with more systems down the road. Figure 2. Reference architecture overview. This is just one example of how flexible, future-ready architecture can unlock powerful, intuitive in-car experiences. Reimagining the driver experience Smarter in-car assistants start with smarter architectures. As generative AI becomes more capable, the real differentiator is how well it connects to the right data—securely, in real time, and at scale. With MongoDB Atlas, automakers can accelerate innovation, simplify architecture complexity, and cut development costs to deliver more intuitive, helpful experiences. It’s not just about adding features—it’s about making them work better together, so drivers get real value from the technology built into their cars. Learn how to power end-to-end value chain optimization with AI/ML, advanced analytics, and real-time data processing for innovative automotive applications. Visit our manufacturing and automotive web page. 
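For the retrieval step of this flow, the sketch below shows how the most relevant manual chunks might be fetched with Atlas Vector Search, assuming the chunks and their embeddings are already stored. The project, model, cluster, collection, and index names are illustrative assumptions rather than the reference implementation; in the architecture above the embeddings come from Vertex AI.

```python
import vertexai
from vertexai.language_models import TextEmbeddingModel
from pymongo import MongoClient

# Illustrative names only; adjust project, cluster, and index to your setup.
vertexai.init(project="my-gcp-project", location="us-central1")
model = TextEmbeddingModel.from_pretrained("text-embedding-004")  # assumed embedding model

chunks = MongoClient(
    "mongodb+srv://<user>:<password>@cluster.example.mongodb.net"
)["assistant"]["manual_chunks"]

question = "What does the red warning light on my dashboard mean?"
# The query embedding must have the same dimensions as the stored chunk embeddings.
query_vector = model.get_embeddings([question])[0].values

results = chunks.aggregate([
    {"$vectorSearch": {
        "index": "manual_chunks_index",  # assumed Atlas Vector Search index
        "path": "embedding",             # field holding each chunk's embedding
        "queryVector": query_vector,
        "numCandidates": 100,
        "limit": 3,
    }},
    {"$project": {"_id": 0, "section": 1, "text": 1,
                  "score": {"$meta": "vectorSearchScore"}}},
])

# The top chunks are then passed to the LLM as grounding context for the spoken answer.
for doc in results:
    print(f'{doc["score"]:.3f}  {doc["section"]}')
```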
Want to get hands-on experience? Explore our GitHub repository for an in-depth guide on implementing this solution.
Capgemini & MongoDB: Smarter AI and Data for Business
AI is reshaping the way enterprises operate, but one fundamental challenge still exists: Most applications were not built with AI in mind. Traditional enterprise systems are designed for transactions, not intelligent decision-making, making it difficult to integrate AI at scale. To bridge this gap, MongoDB and Capgemini are enabling businesses to modernize their infrastructure, unify data platforms, and power AI-driven applications. This blog explores the trends driving the AI revolution and the role that Capgemini and MongoDB play in powering AI solutions. The Challenge: Outdated infrastructure is slowing AI innovation In talking to many customers across industries, we have heard the following key challenges in adopting AI: Data fragmentation: Organizations have long struggled with siloed data, where operational and analytical systems exist separately, making it difficult to unify data for AI-driven insights. In fact, according to the Workday Global survey , 59% of C-suite executives said their organizations' data is somewhat or completely siloed, which results in inefficiencies and lost opportunities. Moreover, AI workloads such as retrieval-augmented generation (RAG), semantic search , and recommendation engines require vector databases, yet most traditional data architectures fail to support these new AI-driven capabilities. Lack of AI-ready data infrastructure: The lack of AI-ready data infrastructure forces developers to work with multiple disconnected systems, adding complexity to the development process. Instead of seamlessly integrating AI models, developers often have to manually sync data, join query results across multiple platforms, and ensure consistency between structured and unstructured data sources. This not only slows down AI adoption but also significantly increases the operational burden. The solution: AI-Ready data infrastructure with MongoDB and Capgemini Together, MongoDB and Capgemini provide enterprises with the end-to-end capabilities needed to modernize their data infrastructure and harness AI's full potential. MongoDB provides a flexible document model that allows businesses to store and query structured, semi-structured, and unstructured data seamlessly, a critical need for AI-powered applications. Its vector search capabilities enable semantic search, recommendation engines, RAG, and anomaly detection, eliminating the need for complex data pipelines while reducing latency and operational overhead. Furthermore, MongoDB’s distributed and serverless architecture ensures scalability, allowing businesses to deploy real-time AI workloads like chatbots, intelligent search, and predictive analytics with the agility and efficiency needed to stay competitive. Capgemini plays a crucial role in this transformation by leveraging AI-powered automation and migration frameworks to help enterprises restructure applications, optimize data workflows, and transition to AI-ready architectures like MongoDB. Using generative AI, Capgemini enables organizations to analyze existing systems, define data migration scripts, and seamlessly integrate AI-driven capabilities into their operations. Real-world use cases Let's explore impactful real-world use cases where MongoDB and Capgemini have collaborated to drive cutting-edge AI projects. AI-powered field operations for a global energy company: Workers in hazardous environments, such as oil rigs, previously had to complete complex 75-field forms, which slowed down operations and increased safety risks. 
To streamline this process, the company implemented a conversational AI interface, allowing workers to interact with the system using natural language instead of manual form-filling. This AI-driven solution has been adopted by 120,000+ field workers, significantly reducing administrative workload, improving efficiency, and enhancing safety in high-risk conditions. AI-assisted anomaly detection in the automotive industry: Manual vehicle inspections often led to delays in diagnostics and high maintenance costs, making it difficult to detect mechanical issues early. To address this, an automotive company implemented AI-powered engine sound analysis, which used vector embeddings to identify anomalies and predict potential failures before they occurred. This proactive approach has reduced breakdowns, optimized maintenance scheduling, and improved overall vehicle reliability, ensuring cost savings and enhanced operational efficiency. Making insurance more efficient: GenYoda, an AI-driven solution developed by Capgemini, is revolutionizing the insurance industry by enhancing the efficiency of professionals through advanced data analysis. By harnessing the power of MongoDB Atlas Vector Search, GenYoda processes vast amounts of customer information including policy statements, premiums, claims histories, and health records to provide actionable insights. This comprehensive analysis enables insurance professionals to swiftly evaluate underwriters' reports, construct detailed health summaries, and optimize customer interactions, thereby improving contact center performance. Remarkably, GenYoda can ingest 100,000 documents within a few hours and deliver responses to user queries in just two to three seconds, matching the performance of leading AI models. The tangible benefits of this solution are evident; for instance, one insurer reported a 15% boost in productivity, a 25% acceleration in report generation—leading to faster decision-making—and a 10% reduction in manual efforts associated with PDF searches, culminating in enhanced operational efficiency. Conclusion As AI becomes operational, real-time, and mission-critical for enterprises, businesses must modernize their data infrastructure and integrate AI-driven capabilities into their core applications. With MongoDB and Capgemini, enterprises can move beyond legacy limitations, unify their data, and power the next generation of AI applications. For more, watch this TechCrunch Disrupt session by Steve Jones (EVP, Data-Driven Business & Gen AI at Capgemini) and Will Shulman (former VP of Product at MongoDB) to learn about more real world use cases. And discover how Capgemini and MongoDB are driving innovation with AI and data solutions.
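The engine-sound example above can be sketched as a nearest-neighbour lookup over stored embeddings. Everything here (the collection, index, similarity threshold, and the source of the audio embedding) is an illustrative assumption, not Capgemini's actual implementation.

```python
from pymongo import MongoClient

# Hypothetical collection of embeddings computed from recordings of healthy engines.
sounds = MongoClient(
    "mongodb+srv://<user>:<password>@cluster.example.mongodb.net"
)["maintenance"]["engine_sounds"]

def looks_anomalous(new_embedding: list[float], threshold: float = 0.85) -> bool:
    """Flag a recording whose closest healthy reference is not similar enough.

    `new_embedding` would come from whatever audio-embedding model the team uses;
    the 0.85 cut-off is an arbitrary placeholder to be tuned on labelled data.
    """
    nearest = list(sounds.aggregate([
        {"$vectorSearch": {
            "index": "engine_sound_index",  # assumed Atlas Vector Search index
            "path": "embedding",
            "queryVector": new_embedding,
            "numCandidates": 200,
            "limit": 1,
        }},
        {"$project": {"_id": 0, "score": {"$meta": "vectorSearchScore"}}},
    ]))
    # If even the best match scores below the threshold, the sound is unusual.
    return not nearest or nearest[0]["score"] < threshold
```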
Reimagining Investment Portfolio Management with Agentic AI
Risk management in capital markets is becoming increasingly complex for investment portfolio managers. The need to process vast amounts of data—from real-time market data to unstructured social media data—demands a level of flexibility and scalability that traditional systems struggle to keep up with. AI agents—a type of artificial intelligence that can operate autonomously and take actions based on goals and real-world interactions—are set to transform how investment portfolios are managed. According to Gartner, 33% of enterprise software applications will include agentic AI by 2028, up from less than 1% in 2024, and at least 15% of day-to-day work decisions will be made autonomously through AI agents.1 MongoDB empowers AI agents to transform the landscape of investment portfolio management. By combining large language models (LLMs), retrieval-augmented generation (RAG), and MongoDB Atlas Vector Search, AI agents can analyze vast financial datasets, detect patterns, and adapt in real time to changing conditions. This advanced intelligence elevates decision-making and empowers portfolio managers to enhance portfolio performance, manage market risks more effectively, and perform precise asset impact analysis.

Intelligent investment portfolio management

Investment portfolio management is the process of selecting, balancing, and monitoring a mix of financial assets—such as stocks, bonds, commodities, and derivatives—to achieve a higher return on investment (ROI) while managing risk effectively and proactively. It involves thoughtful asset allocation, diversification to mitigate market volatility, and continuous monitoring of market conditions and the performance of underlying assets to stay aligned with investment objectives. To stay relevant today, investment portfolio management requires the integration of diverse unstructured alternative data like financial news, social media sentiment, and macroeconomic indicators, alongside structured market data such as price movements, trading volumes, index levels, spreads, and historical execution records. This complex data integration presents a new level of sophistication in portfolio analytics, as outlined in Figure 1. It requires a flexible, scalable, unified data platform that can efficiently store, retrieve, and manage such diverse datasets and pave the way for building next-gen portfolio management solutions.

Figure 1. Investment portfolio analysis

Incorporating MongoDB’s flexible schema accelerates data ingestion across various data sources, such as real-time market feeds, historical performance records, and risk metrics. New portfolio management solutions enriched with alternative data support more intelligent decision-making and proactive market risk mitigation. This paradigm shift realizes deeper insights, enhances alpha generation, and refines asset reallocation with greater precision, underscoring the critical role of data in intelligent portfolio management.

How MongoDB unlocks AI-powered portfolio management

AI-powered portfolio asset allocation has become a desirable characteristic of modern investment strategies. By leveraging AI-based portfolio analysis, portfolio managers gain access to advanced tools that provide insights tailored to specific financial objectives and risk tolerances.
This approach optimizes portfolio construction by recommending an alternate mix of assets—ranging from equities and bonds to ETFs and emerging opportunities—while continuously assessing evolving market conditions. Figure 2 illustrates a proposed workflow for AI-powered investment portfolio management that brings diverse market data, including stock prices, the volatility index (VIX), and macroeconomic indicators such as GDP, interest rates, and the unemployment rate, into an AI analysis layer to generate actionable and more intelligent insights.

Figure 2. AI-powered investment portfolio management

MongoDB’s versatile document model unlocks a more intuitive way to store and retrieve structured, semi-structured, and unstructured data, aligned with the way developers structure objects inside their applications. In capital markets, time series are often used to store time-based trading data and market data. MongoDB time series collections are optimal for analyzing data over time; they are designed to efficiently ingest large volumes of market data with high performance and dynamic scalability. Discovering insights and patterns from MongoDB time series collections is easier and more efficient thanks to faster underlying ingestion and retrieval mechanisms. By taking advantage of MongoDB Atlas Charts’ business intelligence dashboards and evaluating advanced AI-generated investment insights, portfolio managers gain access to sophisticated capabilities that integrate high-dimensional insights derived from diverse datasets, revealing new patterns that can lead to enhanced decision-making for alpha generation and higher portfolio performance.

MongoDB Atlas Vector Search plays a critical role in the analysis of market news sentiment by enabling context-aware retrieval of related news articles. Traditional keyword-based searches often fail to capture semantic relationships between news stories, while vector search, powered by embeddings, allows for a more contextual understanding of how different articles relate to a stock’s sentiment.

Storing news as vectors: When stock-related news is ingested, each article is vectorized into a high-dimensional numerical representation using an embedding model. These embeddings encapsulate the meaning and context of the text, rather than just individual words. The raw news articles are embedded and stored in MongoDB as vectors.

Finding related news: Vector search is used to find news articles based on similarity, even if they don’t contain the exact same stock information. This helps in identifying patterns and trends across multiple news articles based on contextual similarity.

Enhancing sentiment calculation: Instead of relying on a single article’s sentiment, a final sentiment score is aggregated from multiple related news sources with similar and relevant content. This prevents an individual outlier article from influencing the result and provides a more holistic view of market news sentiment.
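A minimal sketch of that retrieval-and-aggregation step is shown below, with all cluster, collection, index, and field names assumed for illustration; each news document is presumed to carry the article text, a per-article sentiment score, and an embedding produced at ingestion time.

```python
from pymongo import MongoClient

# Hypothetical news collection populated by the ingestion pipeline.
news = MongoClient(
    "mongodb+srv://<user>:<password>@cluster.example.mongodb.net"
)["markets"]["news"]

def aggregate_sentiment(article_embedding: list[float], ticker: str) -> float | None:
    """Average the sentiment of the most semantically similar articles for a ticker."""
    pipeline = [
        {"$vectorSearch": {
            "index": "news_embedding_index",  # assumed Atlas Vector Search index
            "path": "embedding",
            "queryVector": article_embedding,
            "filter": {"ticker": ticker},     # pre-filter on a field indexed for filtering
            "numCandidates": 200,
            "limit": 10,
        }},
        # Blend the scores of related articles so a single outlier cannot dominate.
        {"$group": {"_id": None, "avg_sentiment": {"$avg": "$sentiment"}}},
    ]
    result = list(news.aggregate(pipeline))
    return result[0]["avg_sentiment"] if result else None
```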
Agentic AI foundation

Agentic AI incorporates an orchestrator layer that manages task execution in workflows. AI agents can operate either fully autonomously or semi-autonomously with a human in the loop (HITL). AI agents are equipped with advanced tools, models, memory, and data storage. Memory leverages both long- and short-term contextual data for informed decision-making and continuity across interactions. Tools and models enable the AI agents to decompose tasks into steps and execute them cohesively. Data storage and retrieval are pivotal to AI agent effectiveness and can be advanced by embedding and vector search capabilities.

Figure 3. Agentic AI foundation

AI agents’ key characteristics:

Autonomy: The ability to make decisions dynamically based on the situation and to execute tasks with minimal human intervention.

Chain of thought: The ability to perform step-by-step reasoning and break complex problems into smaller logical steps for better judgement and decision-making.

Context awareness: AI agents continuously adapt their actions to changing conditions in their environment.

Learning: AI agents improve their performance over time by adapting and refining their behavior.

Intelligent investment portfolio management with AI agents

AI agents are positioned to revolutionize portfolio management by shifting from rule-based to adaptive, context-aware, AI-powered decision-making. AI-enabled portfolio management applications continuously learn, adapt, and optimize investment strategies more proactively and effectively. The future isn’t about AI replacing portfolio managers, but rather humans and AI working together to create more intelligent, adaptive, and risk-aware portfolios. Portfolio managers who leverage AI gain a competitive edge and deeper insights that significantly enhance portfolio performance. The solution, illustrated in Figure 4 below, includes a data ingestion application, three AI agents, and a market insight application that work in harmony to create a more intelligent, insights-driven approach to portfolio management.

Data ingestion application

The data ingestion application runs continuously, captures various market data, and stores it in time series or standard collections in MongoDB; a minimal storage sketch follows the agent descriptions below.

Market data: Collects and processes real-time market data, including prices, volumes, trade activity, and the volatility index.

Market news: Captures and extracts market and stock-related news. News data is vectorized and stored in MongoDB.

Market indicators: Retrieves key macroeconomic and financial indicators, such as GDP, interest rates, and the unemployment rate.

AI agents

This solution includes three AI agents. The market analysis agent and the market news agent have AI analytical workflows: they run on a daily schedule in a fully automated fashion, producing the expected output and storing it in MongoDB. The market assistant agent has a more dynamic workflow and is designed to play the role of an assistant to a portfolio manager. It works through prompt engineering and agentic decision-making, and it can respond to questions about asset reallocation and market risks based on current market conditions, bringing new AI-powered insights to portfolio managers.

Market analysis agent: Analyzes market trends, volatility, and patterns to generate insights related to the risk of portfolio assets.

Market news agent: Assesses news sentiment for each asset by analyzing news that can directly or indirectly impact portfolio performance. This agent is powered by MongoDB Atlas Vector Search.

Market assistant agent: On demand and through a prompt, answers the portfolio manager’s questions about market trends, risk exposure, and portfolio allocation using the data sources and insights that the other agents create.
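Here is the minimal storage sketch mentioned above: an example of how the data ingestion application might persist market ticks in a MongoDB time series collection. The database, field, and granularity choices are illustrative.

```python
from datetime import datetime, timezone
from pymongo import MongoClient
from pymongo.errors import CollectionInvalid

db = MongoClient("mongodb+srv://<user>:<password>@cluster.example.mongodb.net")["markets"]

# Create the time series collection once: "timestamp" is the time field and
# "symbol" is the metadata field used to group measurements.
try:
    db.create_collection(
        "ticks",
        timeseries={"timeField": "timestamp", "metaField": "symbol", "granularity": "seconds"},
    )
except CollectionInvalid:
    pass  # the collection already exists from a previous run

# The ingestion job appends measurements as they arrive from the market feed.
db["ticks"].insert_one({
    "timestamp": datetime.now(timezone.utc),
    "symbol": "ACME",
    "price": 101.42,
    "volume": 1800,
})
```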
Market insight application

The market insight application is a visualization layer that provides portfolio managers with charts, dashboards, and reports: a series of actionable investment insights built from the outputs created by the AI agents. This information is generated automatically on a predetermined daily schedule and presented to portfolio managers.

Figure 4. Investment portfolio management powered by MongoDB AI agents

AI agents enable portfolio managers to take an intelligent, risk-based approach by analyzing the impact of market conditions on the portfolio and its investment goals. The AI agents capitalize on MongoDB’s powerful capabilities, including the aggregation framework and vector search, combined with embedding and generative AI models, to perform intelligent analysis and deliver insightful portfolio recommendations.

Next steps

According to Deloitte, by 2027, AI-driven investment tools will become the primary source of advice for retail investors, with AI-powered investment management solutions projected to grow to around 80% by 2028.2 By leveraging AI agents and MongoDB, financial institutions can unlock the full potential of AI-driven portfolio management and obtain advanced insights that allow them to stay ahead of market shifts, optimize investment strategies, and manage risk with greater confidence. MongoDB lays a strong foundation for the agentic AI journey and the implementation of next-gen investment portfolio management solutions. To learn more about how MongoDB can power AI innovation, check out these additional resources:

Transforming capital markets with MongoDB | Solutions page
Launching an agentic RAG chatbot with MongoDB and Dataworkz | Solutions page
Demystifying AI Agents: A Guide for Beginners
7 Practical Design Patterns for Agentic Systems

1. Sun, D., “Capitalize on the AI Agent Opportunity,” Gartner, February 27, 2025.
2. “AI, wealth management and trust: Could machines replace human advisors?,” World Economic Forum, March 17, 2025.
MongoDB 8.0, Predefined Roles Now Available on DigitalOcean
I’m pleased to announce that MongoDB 8.0 is now available on DigitalOcean Managed MongoDB, bringing enhanced performance, scalability, and security to DigitalOcean’s fully managed MongoDB service. This update improves query efficiency, expands encryption capabilities, and optimizes scaling for large workloads. Additionally, DigitalOcean Managed MongoDB now includes role-based access control (RBAC) with predefined roles, making it easier to manage access control, enhance security, and streamline database administration across MongoDB clusters on DigitalOcean. DigitalOcean is one of MongoDB’s premier Certified by MongoDB DBaaS partners, and since launching our partnership in 2021, developer productivity has been the core focus of MongoDB and DigitalOcean’s work together. These new enhancements to DigitalOcean Managed MongoDB are a testament to the importance of enabling developers, startups, and small and medium-sized businesses to rapidly build, deploy, and scale applications to accelerate innovation and increase productivity and agility.

What’s new in MongoDB 8.0?

MongoDB 8.0 features several upgrades designed to enhance its performance, security, and ease of use. Whether you’re managing high-throughput applications or looking for better query optimization, these improvements make DigitalOcean Managed MongoDB even more powerful:

Higher throughput and improved replication performance: Dozens of architectural optimizations in MongoDB 8.0 have improved query and replication speed across the board.

Better time series handling: Store and manage time series data more efficiently, helping to enable higher throughput with lower resource usage and costs.

Expanded Queryable Encryption: MongoDB 8.0 adds range queries to Queryable Encryption, enabling new use cases for secure data operations. With encrypted searches that don’t expose sensitive data, MongoDB 8.0 enhances both privacy and compliance.

Greater performance control: Set default maximum execution times for queries and persist query settings after restarts, providing more predictable database performance.

MongoDB 8.0 features 36% better read throughput, 59% faster bulk writes, 200% faster time series aggregations, and new sharding capabilities that distribute data across shards up to 50 times faster—making MongoDB 8.0 the most secure, durable, available, and performant version of MongoDB yet. Learn more about MongoDB 8.0 on our release page.

Benefits of RBAC for DigitalOcean Managed MongoDB

Managing database access across organizations can be a challenge, especially as teams grow and security requirements become more complex. Without a structured approach, organizations risk unauthorized access, operational inefficiencies, and compliance gaps. With RBAC now available in their MongoDB environments, DigitalOcean Managed MongoDB users can avoid these risks and enforce clear, predefined access policies, helping to ensure secure, efficient, and scalable database management. Here’s how RBAC can benefit your business:

Stronger data protection: Keep your sensitive information secure by ensuring that only authorized users have access, reducing the risk of data breaches and strengthening overall security.

Less manual work, fewer errors: Predefined roles make it easier to manage user access, cutting down on time-consuming manual tasks and minimizing the risk of mistakes.

Easier compliance management: Stay ahead of industry regulations with structured access controls that simplify audits and reporting, giving you peace of mind.
Lower costs & reduced risk: Automating access management reduces administrative overhead and helps prevent costly security breaches.

Seamless scalability: As your business grows, easily adjust user permissions to match evolving team structures and operational needs.

Simplified access control: Manage database access efficiently by assigning roles at scale, making administration more intuitive and governance more effective.

DigitalOcean Managed MongoDB: Better than ever

With the introduction of MongoDB 8.0 and RBAC, DigitalOcean Managed MongoDB is now more powerful, secure, and efficient than ever. Whether you’re scaling workloads, optimizing queries, or strengthening security, these updates empower you to manage your MongoDB clusters with greater confidence and ease. Get started today and take full advantage of these cutting-edge enhancements in DigitalOcean’s Managed MongoDB! To create a new cluster with MongoDB 8.0, or to upgrade your existing cluster through the DigitalOcean Control Panel or API, check out the DigitalOcean site. Read more about these new features in DigitalOcean's blog about MongoDB 8.0 and RBAC, or simply try DigitalOcean Managed MongoDB by getting started here!
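As a generic illustration of what predefined roles look like in practice, the sketch below creates two users with MongoDB's built-in roles using the Python driver. The connection string, user names, and databases are placeholders; on DigitalOcean Managed MongoDB, database users and roles can also be managed through the DigitalOcean Control Panel or API.

```python
from pymongo import MongoClient

# Connect as an administrative user; the URI and credentials are placeholders.
admin_db = MongoClient("mongodb+srv://<admin>:<password>@your-cluster.example.com")["admin"]

# A reporting user that can only read the analytics database.
admin_db.command(
    "createUser", "reporting_user",
    pwd="change-me",
    roles=[{"role": "read", "db": "analytics"}],
)

# An application user with read/write access to its own database only.
admin_db.command(
    "createUser", "orders_app",
    pwd="change-me",
    roles=[{"role": "readWrite", "db": "orders"}],
)
```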
Ubuy Scales E-Commerce Globally and Unlocks AI With MongoDB
In today’s digital era, global e-commerce presents a major growth opportunity. This is particularly acute for businesses looking to expand beyond their local markets. While some companies thrive by serving domestic customers, others capitalize on cross-border e-commerce to reach a wider audience. Ubuy , a Kuwait-based e-commerce company, is tapping into this opportunity. Operating in over 180 countries, Ubuy enables customers worldwide to purchase products that may not be available in their local markets. The Ubuy App, which is also available to users on both iOS and Android, supports over 60 languages internationally and is a popular way to access Ubuy’s platform. Ubuy simplifies logistics, customs, and shipping to create a seamless shopping experience. It acts as a bridge between customers and international sellers. Unlike traditional marketplaces, Ubuy provides end-to-end services, from sourcing products and performing quality checks to handling shipping and customs. This ensures that products, even those newly launched in international markets, are accessible to buyers globally with minimal hassle. Founded in 2012, Ubuy initially focused on the Gulf Cooperation Council region. Having identified a gap in international products’ availability there, it used the next three years to expand its services. Today, Ubuy offers a diverse catalog of over 300 million products, and customers worldwide can access products from nine warehouses strategically located in the United States, the United Kingdom, China (including in Hong Kong), Turkey, Korea, Japan, Germany, and Kuwait. Scaling such a vast operation represented a significant technological challenge. MongoDB Atlas proved critical in enabling Ubuy to scale its operations and address specific search performance and inventory management issues. Overcoming search and scalability challenges Before adopting MongoDB Atlas, Ubuy relied on MySQL to manage product data and search functions. However, this model’s limitations led to performance bottlenecks - it couldn’t handle large-scale search operations, lacked high availability, and struggled to manage complex search queries from customers across different markets. Slow query responses, averaging as much as 4–5 seconds per search, impacted the user experience, making it critical for Ubuy to identify a more scalable and performant solution. Ubuy migrated to MongoDB Atlas and implemented both MongoDB Atlas Search and MongoDB Atlas Vector Search to overcome these hurdles. By using these products, Ubuy significantly improved search efficiency, reducing response times to milliseconds. The company can now ensure high search relevancy, enabling users to find products more accurately and quickly. Migrating a large platform to MongoDB Atlas At Ubuy’s scale, the migration to MongoDB Atlas required careful planning. In March 2023, the team conducted a proof of concept to test MongoDB Atlas’s capabilities in handling its vast inventory. A month later, the migration was complete: Ubuy had transitioned from MySQL to a fully managed MongoDB Atlas environment. The transition was seamless, with no downtime. The MongoDB team provided ongoing guidance to help Ubuy optimize search filters and facilitate a smooth integration with its existing e-commerce systems. The result was an improved customer experience through faster and more relevant search results. 
Ubuy chose MongoDB Atlas for three key reasons:

Scalability: MongoDB Atlas provides the ability to handle massive data loads efficiently, enabling smooth search performance even during peak traffic.

High availability: As a fully managed cloud database, MongoDB Atlas provides resilience and reduces downtime.

AI-powered search: The use of MongoDB Atlas Search improves Ubuy’s product discovery experience, helping customers find the right products without seeing unnecessary results. Additionally, MongoDB Atlas Vector Search provides semantic search capabilities. This enables more intuitive product discovery based on intent rather than merely on keywords, enhancing customer satisfaction.

Using AI-powered enhancements to drive customer engagement

Beyond improving search performance, Ubuy has been enhancing its customers’ shopping experience through AI. Ubuy integrated AI-powered search and recommendation systems with MongoDB Atlas’s vector database capabilities. This enabled a transition from simple keyword-based searches to a more intuitive, intent-driven discovery experience. For example, when a user searches for a specific keyword, like “Yamaha guitar,” the AI-enhanced product page now provides structured information on this product’s suitability for beginners, professionals, and trainers. This improves user experience and enhances SEO visibility, driving organic traffic to Ubuy’s platform. “With MongoDB Atlas Search and Atlas Vector Search, we are able to deliver personalized product recommendations in real-time, making it easier for customers to find what they need faster than ever before,” said Mr. Omprakash Swami, Head of IT at Ubuy.

Achieving response speed and business growth

Since implementing MongoDB Atlas and AI-driven enhancements, Ubuy has seen remarkable improvements:

Search response time reduced from 4–5 seconds to milliseconds
Over 150 million search queries handled annually with improved relevancy
Higher engagement on product pages due to AI-enriched content
Ability to scale inventory beyond 300 million products with zero performance concerns

“Moving to MongoDB Atlas and being able to use features such as Atlas Vector Search have been a game changer,” said Swami. “The ability to handle massive search queries in milliseconds while maintaining high relevancy has dramatically improved our customer experience and business operations. The flexibility of MongoDB Atlas has not only improved our search performance but also set the stage for AI-powered innovations that were previously impossible with our relational database setup.”

Enhancing the future of e-commerce

Looking ahead, Ubuy aims to optimize search by consolidating inventory visibility across multiple stores. The goal is to enable users to search across all warehouses from a single interface, delivering even greater convenience. Ubuy’s transformation showcases how employing MongoDB Atlas, along with its fully integrated search capabilities and AI-driven insights, can significantly enhance global e-commerce operations. By addressing scalability and search relevance challenges, the company has positioned itself as a leader in cross-border e-commerce. With a relentless focus on innovation, Ubuy is set to redefine how consumers access international products. Together, Ubuy and MongoDB are helping make shopping across borders effortless and efficient. Visit our product page to learn more about MongoDB Atlas Search. Check out our Atlas Vector Search Quick Start Guide to get started with Vector Search today.
Boost your MongoDB skills with our Atlas Learning Hub .
Teach & Learn with MongoDB: Professor Chanda Raj Kumar
Welcome to the second edition of our series highlighting how educators and students worldwide are using MongoDB to transform learning. In this post, we chat with Professor Chanda Raj Kumar of KL University Hyderabad. The MongoDB for Educators program provides free resources like curriculum materials, MongoDB Atlas credits, certifications, and access to a global community of more than 700 universities—helping educators teach practical database skills and inspire future tech talent. Applied learning: Using MongoDB in real-world teaching Chanda Raj Kumar , Assistant Professor at KLEF Deemed to be University, Hyderabad, India, is a MongoDB Educator and Leader of the MongoDB User Group—Hyderabad. With ten years of teaching experience, he empowers students to gain hands-on experience with MongoDB in their projects. Thanks to his mentorship, during last semester’s Skill Week, 80% of his students earned MongoDB certifications, preparing them for careers in tech. His dedication earned him the 2024 Distinguished Mentor Award from MongoDB. His story shows how educators can use MongoDB to inspire students and prepare them for careers in tech. Tell us about your educational and professional journey and what initially sparked your interest in databases and MongoDB. My educational journey consists of an undergraduate degree from Kakatiya University. Following that, I pursued an M.Tech from Osmania University, where I gained immense knowledge in the landscape of computer science, which aided in laying a strong foundation for my technical expertise. Currently, I am pursuing a PhD from Annamalai University, focusing my research on machine learning. Additionally, qualifying exams like UGC NET and TSET have further strengthened my understanding of databases and why they are a core aspect of developing an application. Over the past ten years, I have gained extensive experience in academia and industry, and I currently serve as an Assistant Professor at KL University, Hyderabad. My interest in databases stems from their universal presence in almost every application. Early on, when I first dabbled into the world of databases, I was intrigued by how efficient storage mechanisms severely impact the speed and accuracy of data retrieval and other operations that will be performed on data through our application. While working with relational databases, I encountered challenges related to fixed schemas—certain data insertions were not feasible due to strict structural constraints or the unavailability of data types corresponding to spatial and vectorial data. This led me to delve into MongoDB, where the flexible JSON-based document structure provided a more scalable and dynamic approach to data management, along with MongoDB Atlas conforming to the rapidly evolving cloud computing of today's time. What courses related to databases and MongoDB are you currently teaching? At my university, I teach database-related courses across different levels. As a core course, I teach Database Management Systems (DBMS), covering database fundamentals and operations. I also handle Python Full Stack, MERN Stack, and Java Full Stack Development, integrating MongoDB with modern frameworks. Additionally, I conduct MongoDB certification courses, helping students gain industry-standard knowledge in database technologies. What motivated you to incorporate MongoDB into your curriculum? My journey with databases began when I realized the challenges of relational databases like SQL, with their rigid schema and complex queries. 
This led me to explore MongoDB, which offers a more flexible, user-friendly approach to data management. I actively advocate for adding MongoDB to the college curriculum to prepare students for the growing demand for NoSQL technologies. By teaching MongoDB alongside relational databases, I aim to help students build practical skills to design and manage modern, dynamic applications. You have successfully built an active student community around MongoDB on your campus. Can you share some insights into how you achieved this and the impact it’s had on students? Building an active student community around MongoDB on campus has been not only an exciting journey but a very enlightening one as well. I concentrated on a step-by-step teaching approach, beginning with the basics and slowly working up to more complex topics. This helped students build a strong foundation while feeling confident about what they were learning. One of the main ways I involved students was by incorporating MongoDB into different courses, where they could work on hands-on projects that required using the database. I also encouraged students to earn certifications like Developer and DBA, which gave them valuable credentials and formal recognition of their MongoDB skills. Furthermore, I arranged group discussions where students brainstormed, solved problems together, and stayed actively engaged in their learning. On top of that, I held week-long training sessions each semester, called “Skill Weeks,” to keep everyone up to date on ongoing MongoDB advancements while also teaching newcomers. How do you design your course content to integrate MongoDB in a way that engages students and ensures practical learning experiences? I often begin by building a strong foundation, going over fundamental concepts such as document-oriented storage, collections, indexing, and CRUD operations to ensure students grasp the essentials. Once a solid base has been established, I introduce advanced concepts like aggregation pipelines, indexing, query optimization techniques, and sharding, while placing the utmost emphasis on hands-on learning with real datasets to further strengthen understanding. I also incorporate real-world projects where students design and build complete applications that integrate MongoDB in the backend, simulating industry use cases to sharpen their problem-solving in a professional environment. As for the certification component, I include model quizzes, practice tests, and assignments to evaluate their knowledge and ensure they are job-ready with a validated skill set. How has MongoDB supported you in enhancing your teaching methodologies and upskilling your students? The curated learning paths and comprehensive resources in MongoDB Academia, such as PowerPoint presentations for educators, have been the biggest support to my teaching methods. The platform offers a wide variety of materials, covering basic to advanced concepts, often accompanied by visual aids that make complex ideas easier to grasp. The learning paths also provide sets of practice questions that reinforce students’ understanding. Moreover, the availability of the Atlas free cluster allows students to experiment with real-world database operations at no cost, providing practical experience. These resources from MongoDB have significantly reshaped my pedagogy to better accommodate practical elements.
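To make the fundamentals Professor Kumar mentions concrete, here is a minimal classroom-style sketch in Python with PyMongo. It is purely illustrative and not taken from his actual course materials: the "school" database, the "students" collection, and the connection string are hypothetical, and it assumes a local MongoDB instance or an Atlas free cluster.

```python
# A minimal classroom-style sketch (hypothetical data, not actual course material).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # or your Atlas connection string
db = client["school"]
students = db["students"]

# Flexible schema: documents in the same collection can differ in shape.
students.insert_many([
    {"name": "Asha", "year": 2, "skills": ["python", "mongodb"]},
    {"name": "Ravi", "year": 3, "skills": ["java"], "certifications": ["MongoDB Developer"]},
])

# CRUD basics: update one document, then read it back.
students.update_one({"name": "Asha"}, {"$push": {"skills": "aggregation"}})
print(students.find_one({"name": "Asha"}))

# An index to support common queries, then a simple aggregation pipeline.
students.create_index("year")
per_year = students.aggregate([
    {"$group": {"_id": "$year", "count": {"$sum": 1}}},
    {"$sort": {"_id": 1}},
])
print(list(per_year))
```

The same pattern of inserting differently shaped documents, creating an index, and running a small aggregation pipeline extends naturally into the more advanced indexing and query-optimization exercises described above.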
Have you conducted any projects or studies on students' experiences with MongoDB? If so, what key insights have you discovered, and how can they benefit other educators? Through surveys, Q&A sessions, and project reviews, I have identified students' strengths and weaknesses in working with MongoDB. Many students find the document-oriented model intuitive and appreciate the flexibility of schema design, but often struggle with optimizing queries, indexing strategies, and understanding aggregation pipelines. These insights have helped me refine and iterate on my teaching style by focusing more on demonstrations, interactive exercises, and explanations targeted at complex topics. Other educators can benefit from these findings by incorporating regular feedback sessions and adapting their teaching methods to address similar gaps. Could you share a memorable experience or success story of a project from your time teaching MongoDB that stands out to you? One of the most memorable experiences from my time teaching MongoDB was during Skill Week, where 80% of my students earned MongoDB certifications. The structured pedagogy I implemented—combining hands-on learning, real-world projects, and guided problem-solving—played a crucial role in their success. This success was recognized when I received an award last semester for my contributions to MongoDB education, underscoring the impact of my teaching approach. Seeing students excel, gain industry-recognized skills, and confidently apply MongoDB in their careers has been incredibly rewarding for me. How has your role as a MongoDB Educator impacted your professional growth and the growth of the student community at your university? I have been able to demonstrate the power of non-relational databases, breaking down the initial stigma around NoSQL databases and helping students see the advantages of flexible, scalable data models. This journey has also helped me establish myself as a subject matter expert, allowing me to lead discussions on advanced database concepts and real-world applications. As a MongoDB User Group (MUG) leader, I have built a global network, collaborating with educators, developers, and industry professionals. Additionally, conducting mentoring workshops at other colleges has strengthened my leadership skills while expanding MongoDB awareness beyond my institution. Most importantly, this role has provided students with direct industry exposure, which I believe plays a pivotal role in the growth of their careers. What advice would you give to educators who are considering integrating MongoDB into their courses to ensure a successful and impactful learning experience for students? My advice is to build upon students’ pre-existing knowledge while gradually introducing the shift to NoSQL concepts. Since most students start with relational databases, it’s important to first highlight the key differences between SQL and NoSQL, and to explain when to use each. Given that students are generally inclined toward SQL (as it’s often the first database they work with), introducing MongoDB as a schema-less, document-oriented database makes the transition smoother. Once the basics are covered, progressing to advanced topics like data modeling, aggregation pipelines, and indexing ensures students gain a deeper understanding of database optimization and performance tuning.
By adopting this structured approach, educators can provide a comprehensive, real-world learning experience that prepares students for industry use cases. To learn more, apply to the MongoDB for Educators program and explore free resources for educators crafted by MongoDB experts to equip learners with in-demand database skills and knowledge.
Announcing the MongoDB MCP Server
Today, MongoDB is pleased to announce the MongoDB Model Context Protocol (MCP) Server in public preview. The MongoDB MCP Server enables AI-powered development by connecting MongoDB deployments—whether they’re on MongoDB Atlas, MongoDB Community Edition, or MongoDB Enterprise Advanced—to MCP-supported clients like Windsurf, Cursor, GitHub Copilot in Visual Studio Code, and Anthropic’s Claude. Using MCP as the two-way communication protocol, the MongoDB MCP Server makes it easy to interact with your data using natural language and to perform database operations with your favorite agentic AI tools, assistants, and platforms. Originally introduced by Anthropic, the Model Context Protocol has been gaining traction as an open standard for connecting AI agents and diverse data systems. The growing popularity of MCP comes at a pivotal moment, as LLMs and agentic AI are reshaping how we build and interact with applications. MCP unlocks new levels of integrated functionality, ensuring that the LLMs behind agentic workflows have access to the most recent and contextually relevant information. And it makes it easier than ever for developers to take advantage of the fast-growing and fast-changing ecosystem of AI technologies. The MongoDB MCP Server: Connecting to the broader AI ecosystem The MongoDB MCP Server enables developer tools that include MCP clients to interact directly with a MongoDB database, handling a range of administrative tasks, such as managing cluster resources, as well as data-related operations like querying and indexing. Figure 1. Overview of MongoDB MCP Server integration with MCP components. Forget separate tools, custom integrations, and manual querying. With the MongoDB MCP Server, developers can leverage the intelligence of LLMs to perform crucial database tasks directly within their development environments, with access to the most recent and contextually relevant data. The MongoDB MCP Server enables: Effortless data exploration: Ask your AI to "show the schema of the 'users' collection" or "find the most active users in the collection." Streamlined database management: Use natural language to perform database administration tasks like "create a new database user with read-only access" or "list the current network access rules." Context-aware code generation: Describe the data you need, and let your AI generate the MongoDB queries and even the application code to interact with it. AI-powered software development with Windsurf and MongoDB To make it easier for developers everywhere to use the MongoDB MCP Server right away, we've made it available out of the box in Windsurf, an AI code editor used by over a million developers and counting. Developers building with MongoDB can leverage Windsurf's agentic AI capabilities to streamline their workflows and accelerate application development. “MongoDB is aligned with Windsurf’s mission of empowering everyone to continuously dream bigger,” said Rohan Phadte, Product Engineer at Windsurf. “Through our integration with the MongoDB MCP Server, we’re helping innovators to create, transform, and disrupt industries with software in this new age of development. Developers can get started today by accessing the MongoDB MCP Server through our official server templates, and take advantage of the combined power of Windsurf and MongoDB for building their next project.” Figure 2. Windsurf MCP server templates.
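To give a rough sense of what sits behind a natural-language request like "show the schema of the 'users' collection," here is an illustrative Python/PyMongo sketch of the kind of driver operations such a request ultimately resolves to. It is not the MongoDB MCP Server's actual implementation; the connection string, database name, and sample size are assumptions.

```python
# Illustrative only: the sort of driver calls an agent's tool call might map to.
# This is NOT the MongoDB MCP Server's implementation.
from collections import Counter
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed connection string
db = client["app"]                                  # hypothetical database name

# "List my collections"
print(db.list_collection_names())

# "Show the schema of the 'users' collection": sample documents and tally
# which fields appear and with what BSON-mapped Python types.
field_types = Counter()
for doc in db["users"].aggregate([{"$sample": {"size": 100}}]):
    for field, value in doc.items():
        field_types[(field, type(value).__name__)] += 1

for (field, type_name), count in field_types.most_common():
    print(f"{field}: {type_name} (seen in {count} sampled documents)")
```

With the MCP Server, an AI client performs this kind of inspection for you in response to a natural-language prompt, so you can stay in your editor and in plain language.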
The MongoDB MCP Server in action Check out the videos below to see how to use the MongoDB MCP Server with popular tools like Claude, Visual Studio Code, and Windsurf. Using the MongoDB MCP Server for data exploration With an AI agent capable of directly accessing and exploring your database, guided by natural language prompts, you can minimize context switching and stay in the flow of your work. Using the MongoDB MCP Server for database management The MongoDB MCP Server enables AI agents to interact directly with MongoDB Atlas or self-managed MongoDB databases, making it easier to automate manual tasks around cluster and user management. Using the MongoDB MCP Server for code generation Using LLMs and code agents has become a core part of developers’ workflows. Providing context, such as schemas and data structures, enables more accurate code generation, reducing hallucinations and enhancing agent capabilities. The future of software development is agentic The MongoDB MCP Server is a step forward in MongoDB’s mission to empower developers with advanced technologies to effortlessly bring bold ideas to life. By providing an official MCP server release, we’re meeting developers in the workflows and tools they rely on to build the future on MongoDB. As MCP adoption continues to gain momentum, we’ll continue to actively listen to developer feedback and prioritize enhancements to our MCP implementation. If you have input on the MongoDB MCP Server, please create an issue on GitHub. And to stay abreast of the latest news and releases from MongoDB, make sure you check out the MongoDB blog. Check out the MongoDB MCP Server on GitHub and give it a try—see how it can accelerate your development workflow!
Multi-Agentic Systems in Industry with XMPro & MongoDB Atlas
In 2025, agentic AI applications are no longer pet projects—companies around the world are investing in software to incorporate AI agents into their business workflows. The most common uses of AI agents are assisting with research analysis and writing code. LangChain’s recent survey of over 1,000 professionals across multiple industries showed that over 51% have already deployed agents in production, with 60% using agents for research and summarization tasks. However, leveraging AI agents for tasks more complex than research and summarization—and implementing them in industrial environments like manufacturing—presents certain challenges. For example, introducing new technology into established companies typically means a brownfield deployment: newly installed hardware and software must coexist with legacy IT systems. And while it is easy to run an AI agent in a sandboxed environment, it is harder to integrate agents with machines and Operational Technology (OT) systems that speak industrial protocols like Modbus, PROFINET, and BACnet, owing to existing legacy infrastructure and accumulated tech debt. To ensure governance and security in industrial environments, data security policies, regulatory compliance, and governance models are essential. Agent profiles with defined goals, rules, responsibilities, and constraints must be established before agents are deployed. Additionally, addressing real-world constraints—like LLM latency—and strategically selecting use cases and database providers can enhance AI agent effectiveness and optimize response times. What’s more, the successful implementation of AI agents in industrial environments requires a number of foundational elements, including: Flexible data storage and scalability: An agent requires different types of data to function, such as an agent profile, short-term memory, and long-term memory. Industrial AI agents require even more types of data, such as time series data from sensors and PLCs. They need efficient, scalable data storage that adapts to the dynamic needs of the environment (see the sketch below). Continuous monitoring and analysis: An agent deployed in a manufacturing environment requires real-time observability of the ever-changing data generated by the factory. It also needs to keep humans in the loop for any critical decisions that might affect production. High availability: Industrial environments demand near-zero downtime, making system resilience and failover capabilities essential. XMPro joins forces with MongoDB To address these challenges, we are pleased to announce XMPro’s partnership with MongoDB. XMPro offers APEX AI, a low-code control room for creating and managing advanced AI agents for industrial applications. To ensure seamless control over these autonomous agents, XMPro APEX serves as the command center for configuring, monitoring, and orchestrating agent activities, empowering operators to remain in control. Figure 1. XMPro APEX AI platform working with MongoDB Atlas. APEX AI, combined with MongoDB Atlas and MongoDB Atlas Vector Search, addresses a variety of challenges faced by developers when building AI agents for industrial environments. XMPro complements this by seamlessly integrating with industrial equipment such as SCADA systems, PLCs, IoT sensors, and ERPs, enabling continuous monitoring of operations. This integration ensures real-time data acquisition, contextualization, and advanced analytics, transforming raw data into actionable insights.
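As a rough illustration of the "flexible data storage" requirement above, the Python sketch below shows how an agent profile, agent memories, sensor time series, and a vector search retrieval might be stored and queried in MongoDB Atlas. Everything here is hypothetical: the collection names, fields, vector index name, and embedding size are illustrative assumptions, not XMPro APEX AI's actual schema. The time series collection requires MongoDB 5.0 or later, and $vectorSearch requires an Atlas Vector Search index created separately.

```python
# Hypothetical sketch of industrial agent data storage; not XMPro's actual schema.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")  # placeholder Atlas URI
db = client["industrial_agents"]

# Agent profile: goals, rules, and constraints defined before deployment.
db["agent_profiles"].insert_one({
    "agent_id": "pump-monitor-01",
    "goal": "Detect anomalous vibration on pump P-101",
    "rules": ["escalate to a human for any shutdown decision"],
    "constraints": {"max_actions_per_hour": 12},
})

# Short- and long-term memory in one flexible collection: conversational logs,
# state transitions, and telemetry summaries coexist without a schema redesign.
db["agent_memory"].insert_one({
    "agent_id": "pump-monitor-01",
    "kind": "state_transition",
    "from": "monitoring",
    "to": "alerting",
    "at": datetime.now(timezone.utc),
})

# Time series collection for high-volume readings from sensors and PLCs.
if "sensor_readings" not in db.list_collection_names():
    db.create_collection(
        "sensor_readings",
        timeseries={"timeField": "ts", "metaField": "sensor", "granularity": "seconds"},
    )
db["sensor_readings"].insert_one({
    "ts": datetime.now(timezone.utc),
    "sensor": {"line": "A", "asset": "P-101", "type": "vibration"},
    "value": 4.2,
})

# RAG-style retrieval with Atlas Vector Search (assumes a vector index named
# "memory_vectors" on an "embedding" field; embeddings are produced elsewhere).
hits = db["agent_memory"].aggregate([
    {"$vectorSearch": {
        "index": "memory_vectors",
        "path": "embedding",
        "queryVector": [0.01] * 1536,   # placeholder embedding
        "numCandidates": 100,
        "limit": 5,
    }}
])
print(list(hits))
```

Because the document model imposes no fixed schema, new memory types can be added later without a migration, which is the property that makes this storage pattern attractive for fast-evolving agent workloads.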
XMPro’s capabilities include condition monitoring, predictive maintenance, anomaly detection, and process optimization, which help reduce downtime and improve operational efficiency while maintaining compliance and safety standards. XMPro’s industrial AI agents rely on memory persistence for contextual decision-making, and MongoDB Atlas acts as the database for storing and retrieving agent memories. Using a flexible document database for agentic memories lets agents store different types of data, such as conversational logs, state transitions, and telemetry data, without requiring a schema redesign. MongoDB Atlas Vector Search empowers APEX AI agents with a retrieval-augmented generation (RAG) tool, which helps reduce LLM hallucinations. This integration allows agents to access and retrieve verified data, grounding their responses. Having the database and vector search together in MongoDB Atlas also helps reduce agent latency and speeds up development. In an industrial setting, APEX AI-enabled multi-agent systems work together: these context-aware agents can operate in tandem, retrieving relevant knowledge stored in MongoDB Atlas to enable meaningful collaboration and better decision-making. XMPro APEX AI also leverages MongoDB Atlas’s robust security and high availability to ensure that agents can securely access and use data in real time; features such as role-based access controls, network isolation, and encryption in transit and at rest are key reasons why this agent-based AI solution is well suited to securing industrial production environments. MongoDB’s high availability and horizontal scalability ensure seamless data access as organizations scale up their APEX AI deployments. Unlocking the future of AI in industrial automation XMPro APEX AI and MongoDB Atlas are a winning combination that paves the way for a new era of industrial automation. By tackling the core challenges of deploying AI agents in industrial environments, we’re enabling organizations to deploy robust, intelligent, and autonomous industrial AI agents at scale. To learn more about MongoDB’s role in the manufacturing industry, please visit our manufacturing and automotive webpage. Ready to boost your MongoDB skills? Head over to our MongoDB Atlas Learning Hub to start learning today.
VPBank Builds OpenAPI Platform With MongoDB
Open banking is the practice of banks sharing some of their financial data and services with developers and third-party financial service providers through APIs. Open banking has accelerated the digitization of the financial services and banking industries. It also fosters innovation and enhances customer experience by enabling customer-centric, personalized services. MongoDB has been at the forefront of this revolution. Specifically, MongoDB helps financial institutions worldwide take advantage of OpenAPI. This open standard enables an organization’s applications, software, and digital platforms to connect and exchange data with third-party services efficiently and securely. An example is VPBank. One of Vietnam’s largest private banks, it serves over 30 million customers. In 2020, VPBank was the first Vietnamese bank to adopt MongoDB Atlas for OpenAPI. Working with MongoDB, VPBank moved to a microservices architecture, which supported the creation of its own OpenAPI platform and set a new standard for digital banking in Vietnam. Speaking at MongoDB Day in Vietnam in November 2024, Anh K. Pham, Head of Database Services and Operations for VPBank, shared how MongoDB set the bank up for success with open banking. Migrating from the relational model to the document model Before working with MongoDB, VPBank ran on SQL-based relational databases. The COVID pandemic and the rise of models such as open banking in the early 2020s mandated rapid digitization of banking operations and services. VPBank realized it needed to build the next generation of intelligent banking services to remain competitive. This was not feasible with traditional relational database management systems and the SQL model. VPBank’s primary goal was to harness the power of data and to manage unstructured data more efficiently. This meant switching to an agile architecture based on microservices. “When I was introduced to NoSQL, it made sense,” said Pham. “Data is not always structured. There’s a bunch of different data points here and there, and you can’t make anything of it. But it has to be stored somewhere, it has to be read, and it has to be fed into your applications.” MongoDB Atlas was hosted on Amazon Web Services (AWS) as part of VPBank’s cloud transformation journey. The bank chose MongoDB Atlas for its ability to handle multiple workload types, which had been inadequately supported by its relational databases. These workloads include time series data, event data, real-time analytics, notifications, and big data (like transaction histories, catalog data, and JSON data). Powering 220 microservices with flexibility, scalability, and performance VPBank’s OpenAPI platform consists of over 220 microservices and processes more than 100 million transactions per month. By supporting these transactions, MongoDB is ultimately helping VPBank enhance customer experiences and streamline operations. By using MongoDB Atlas, VPBank can better unlock the power of its data to quickly build data-driven applications and services on its microservices architecture. It experienced three substantial benefits by using MongoDB: Flexibility: MongoDB Atlas empowers VPBank to handle complex data, conduct rapid development and iteration, and facilitate efficient API development with BSON. Scalability: MongoDB enables dynamic scaling to handle increasing workloads. Additionally, horizontal scaling distributes data across multiple servers to handle high volumes, spikes in transactions, and API requests.
Performance: MongoDB Atlas’s performance capabilities enable VPBank to manage large volumes of data in real time, even under demanding throughput and latency requirements. “We have flexibility; we have scalability; we have performance. Those are the main things we want to look at when we’re talking about banking. I need to be flexible. I need to be scalable. I need my performance to be high, because I want my customers to not wait and see if their money is going to go through or not,” said Anh K. Pham, Head of Database Services and Operations, VPBank. Using OpenShift Container Platform (OCP), VPBank deployed a microservices architecture to run its Open Banking services. “Choosing MongoDB as the modern database was the best choice since it can handle multiple types of data workloads with the performance we needed,” said Pham. Looking to the future VPBank plans to continue its cloud transformation journey. “We’re continuing to migrate our applications from on-premises into the cloud, and we’re continuing to modernize our applications as well,” said Pham. “That means that maybe those other databases that we used to have might be turning into MongoDB databases.” VPBank is also looking to MongoDB to support its AI-driven future: “We really want to focus on AI and data analytics, pulling information from all our customers’ transactions,” explained Pham. “We want to ensure that what we build caters to our 30-plus million customers.” Visit our MongoDB Atlas Learning Hub to boost your MongoDB skills. To learn more about MongoDB for financial services, visit our solutions page.
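As a footnote on the mechanics behind the flexibility and scalability points above, here is a purely illustrative Python sketch; it is not VPBank's actual data model or cluster configuration. The "bank" database, the document fields, the connection string, and the hashed shard key are assumptions, and the sharding commands require a sharded deployment reached through mongos (for example, an Atlas sharded cluster).

```python
# Illustrative only; not VPBank's data model or cluster setup.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")  # placeholder URI
db = client["bank"]

# Flexibility: transaction records, events, and notifications with different
# shapes can live side by side, serialized as BSON by the driver.
db["transactions"].insert_one({
    "accountId": "A-1029",
    "type": "transfer",
    "amountVND": 2_500_000,
    "status": "completed",
    "createdAt": datetime.now(timezone.utc),
    "metadata": {"channel": "mobile", "apiVersion": "v3"},  # nested, optional fields
})

# Scalability: distribute the collection across shards on a hashed key so
# transaction volume and API traffic spread evenly across servers.
db["transactions"].create_index([("accountId", "hashed")])
client.admin.command("enableSharding", "bank")
client.admin.command("shardCollection", "bank.transactions", key={"accountId": "hashed"})
```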