

OrderOnline: AI Improves Conversion Rate by 56% with MongoDB

Established by Ordivo Group in 2018, OrderOnline has quickly become a driving force behind Indonesia’s thriving social commerce market. OrderOnline offers an end-to-end solution for organizations and individuals selling on social platforms like Facebook Marketplace, typically through social ads, landing pages, and storefronts. OrderOnline built its social commerce platform on MongoDB Community Edition, and later migrated to MongoDB Atlas in 2022. The platform provides everything from managing orders to handling logistics for companies and individuals selling on social platforms. It addresses common social commerce pain points, such as complex logistics, failed deliveries, and unmanageable order processing due to scale. Speaking at MongoDB.local Jakarta 2024, Wafda Mufti, Vice President of Technology for Ordivo Group, explained how his slogan—“Simple Input, High Accuracy”—drove OrderOnline to become one of Indonesia’s leading social commerce companies. “We have sellers using storefronts, landing pages, and checkout forms. Thanks to MongoDB's flexibility, we can manage these unpredictable business processes. We even store our front-end structure in MongoDB,” said Mufti. “Thanks to MongoDB, we can ensure we have the highest quality of data.” Mufti also shared how the company is using MongoDB Atlas Search and MongoDB Atlas Vector Search to power innovative search and AI use cases.

Scaling social commerce with MongoDB Atlas

Five years after its launch, OrderOnline had grown to 40,000 users and was handling 1.5 million transactions each month. This fast growth led to challenges, particularly around managing data at scale and ensuring high success rates for sellers. Most of OrderOnline’s users drive orders from a wide range of sources, including social ads, landing pages, and storefronts. Many of OrderOnline’s orders are handled via WhatsApp through Click to WhatsApp Ads (CTWA). Initially, managing orders via platforms like WhatsApp was feasible.
However, as social commerce became more popular, the volume of orders increased, which quickly became overwhelming. Furthermore, for large sellers who do not handle their own products, OrderOnline had to manage order packing and shipping, as well as returns. “We were overwhelmed with orders, but we wanted to manage our SLAs,” said Mufti. “We wanted to ensure products were well-delivered.” MongoDB Atlas’s flexibility has enabled OrderOnline to manage unpredictable business processes, and to efficiently handle various complex tasks associated with order management and logistics. Because MongoDB Atlas is designed for fast iteration, it enables OrderOnline to swiftly adapt its platform in response to changing business needs and user demands. MongoDB Atlas also supports high scalability. This empowers OrderOnline to manage a growing user base and increasing transaction volumes without compromising performance. Additionally, MongoDB Atlas’s reliability under high transactional loads ensures that OrderOnline can maintain quick response times—a core part of their SLA. This is critical for maintaining the agility needed in the dynamic world of social commerce. “We have a monitoring system that triggers alarms if response times fall below one second,” noted Mufti. Another critical SLA that OrderOnline tracks is the delivery success rate. Previously, deliveries were only successful 94% of the time. Using MongoDB Atlas, OrderOnline built OExpress, a service that sellers can use to customize the number of delivery attempts based on specific service agreements. An upper limit of five delivery attempts is also enforced. OExpress closely tracks delivery attempt data. This ensures packages are delivered and minimizes returns and damages. “Thanks to MongoDB, we have achieved a success rate of 98.4%,” said Mufti.
“We can manage multiple attempts to deliver to the customers, so sellers don’t have to worry about dealing with delivery issues anymore when using a marketplace.” Beyond deliveries, OrderOnline identified seamless search and customer support integrations as key operations that MongoDB could enhance.

AI and search: conversion rates jump by 56%

As OrderOnline’s business grew, scalability created specific challenges with CTWAs. In particular, OrderOnline’s platform struggled to manage and make sense of the growing volume of inconsistent data types it was receiving, such as location, postal codes, and product details—accurate input of data is vital to ensuring orders are being processed and delivered. “People want [to be able to input] freeform text. They want things to be simple and easy, and not be restricted by rigid formats,” said Mufti. “But we still have to ensure data accuracy.” One of the standout features that helped OrderOnline improve search accuracy and management is MongoDB Atlas Search. Fuzzy search in MongoDB Atlas Search can handle typos when searching for districts. For example, if a user types the misspelling “Surabya,” Atlas Search will still fetch results for “Surabaya.” Furthermore, synonyms in MongoDB Atlas Search can handle shortened names for provinces and districts in Indonesia—for example, “Jabar” for Jawa Barat or “Jateng” for Jawa Tengah. Acronyms are also handled. “Because there’s AI in the background, there’s no need to manually input zip codes, for example. Our engine can search for it,” said Mufti. “Someone clicks, then places an order, fills out the form, and it goes straight into our order management system, which supports fuzzy search.” As OrderOnline grew, it also needed to scale customer support with 24/7 availability and fast response times. MongoDB Atlas Vector Search supported the development of a seamless and user-friendly interface with the creation of an AI chatbot.
This chatbot lets sellers easily manage customer interactions, check stock availability, and calculate shipping costs. “If the ad contains a WhatsApp link, it will be directly managed by the chatbot. The chatbot even checks shipping costs, compares prices, and shows how much it would cost if you purchased five items,” explained Mufti. “The AI handles requests for photos, checks stock availability, and much more. And once a deal is closed, it goes directly into our order management system.” Before the creation of the AI chatbot with MongoDB Atlas Vector Search, the WhatsApp conversion rate was 50%: out of 100 interactions, 50 would successfully close the deal. With the implementation of AI, this rate has increased to 78%. Building on these successes, OrderOnline is now looking at further business and geographic expansion supported by MongoDB’s global reach, with the aim of helping more sellers throughout Indonesia make the most of social commerce. Visit the MongoDB Atlas Learning Hub to boost your MongoDB skills. To learn more about MongoDB Atlas Search, visit our product page. Get started with Atlas Vector Search today through our quick start guide.
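As an illustration of the fuzzy matching described above, the snippet below sketches what an Atlas Search aggregation pipeline for district lookup could look like in Python. The index name, field name, and edit distance are assumptions for this example, not details of OrderOnline's implementation; synonym handling ("Jabar" → "Jawa Barat") would additionally require a synonym mapping defined on the search index.

```python
def build_district_search(query: str, index: str = "district_search") -> list:
    """Build an aggregation pipeline that uses Atlas Search fuzzy matching.

    The index name and the "district" field are illustrative placeholders.
    """
    return [
        {
            "$search": {
                "index": index,
                "text": {
                    "query": query,
                    "path": "district",
                    # Tolerate up to two character edits, so a typo such as
                    # "Surabya" can still match "Surabaya".
                    "fuzzy": {"maxEdits": 2},
                },
            }
        },
        {"$limit": 5},
    ]

pipeline = build_district_search("Surabya")
```

With PyMongo, this pipeline would be executed via `db.districts.aggregate(pipeline)` against a collection that has a corresponding Atlas Search index defined.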

May 13, 2025

How MongoDB and Google Cloud Power the Future of In-Car Assistants

The automotive industry is evolving fast: electrification, the rise of autonomous driving, and advanced safety systems are reshaping vehicles from the inside out. But innovation isn’t just happening to the drivetrain. Drivers (and passengers) now expect more intelligent, intuitive, and personalized experiences whenever they get into a car. That’s where things get tricky. While modern cars are packed with features, many of them are complex to use. Voice assistants were supposed to simplify things, but most still only handle basic tasks, like setting navigation or changing music. As consumers’ expectations of technology grow, so does pressure on automakers. Standing out in a competitive market, accelerating time to market, and managing rising development costs—all while delivering seamless digital experiences—is no small task. The good news? Drivers are ready for something better. According to a SoundHoundAI study, 79% of drivers in Europe would use voice assistants powered by generative AI. And 83% of those planning to buy a car in the next 12 months say they’d choose a model with AI features over one without. Gen AI is transforming voice assistants from simple command tools into dynamic copilots—able to answer questions, offer insights, and adapt to each user. At CES 2025, we saw major players like BMW, Honda, and HARMAN pushing the boundaries of AI-driven car assistants. To truly make these experiences personalized, you need the right data infrastructure. Real-time signals from the car, user preferences, and access to unstructured content like manuals and FAQs are essential for building truly intelligent systems. By combining gen AI with powerful data infrastructure, we can create more responsive, smarter in-car assistants. With flexible, scalable data access and built-in vector search, MongoDB Atlas is an ideal solution. Together with partners like Google Cloud, MongoDB is helping automotive companies innovate faster and deliver better in-car experiences.
MongoDB as the data layer behind smarter assistants

Building intelligent in-car assistants isn't just about having cutting-edge AI models—it’s about what feeds them. A flexible, scalable data platform is the foundation. To deliver real-time insights, personalize interactions, and evolve with new vehicle features, automakers need a data layer that can keep up. MongoDB gives developers the speed and simplicity they need to innovate. Its flexible document model lets teams store data the way applications use it—without rigid schemas or complex joins. That means faster development, fewer dependencies, and less architectural friction. Built-in capabilities like time series, full-text search, and real-time sync mean fewer moving parts and faster time to market. And because MongoDB Atlas is built for scale, availability, and security, automakers get the enterprise-grade reliability they need. Toyota Connected, for example, relies on MongoDB Atlas to power its Safety Connect platform across millions of vehicles, delivering real-time emergency support with 99.99% availability. But what really sets MongoDB apart for gen AI use cases is the way it handles data. AI workloads thrive on diverse, often unstructured inputs—text, metadata, contextual signals, vector embeddings. MongoDB’s document model handles all of it, side by side, in a single, unified platform. That’s why companies like Cognigy use MongoDB to power leading conversational AI platforms that manage hundreds of queries per second across multiple channels and data types. With Atlas Vector Search, development teams in the automotive industry can bring semantic search to unstructured data like manuals, support docs, or historical interactions. And by keeping operational, metadata, and vector data together, MongoDB makes it easier to deploy and scale gen AI apps that go beyond analytics and actually transform in-car experiences.
MongoDB is already widely adopted across the automotive industry, powering innovation from the factory floor to the finish line. With its ability to scale and adapt to complex, evolving needs, MongoDB is helping automakers accelerate digital transformation and deliver next-gen in-car experiences.

Architecture that drives intelligence at scale

To bring generative AI into the driver’s seat, we designed an architecture that shows how these systems can work together in the real world. At the core, we combined the power of MongoDB Atlas with Google Cloud’s AI capabilities to build a seamless, scalable solution. Google Cloud powers speech recognition and language understanding, while MongoDB provides the data layer with Atlas Database and Atlas Vector Search. MongoDB has also worked with PowerSync to keep vehicle data in sync across cloud and edge environments. Imagine you're driving, and a red light pops up on your dashboard. You’re not sure what it means, so you ask the in-car assistant, “What is this red light on my dashboard?” The assistant transcribes your question, checks the real-time vehicle signals to identify the issue, and fetches relevant guidance from your car’s manual. It tells you what the warning means, whether it’s urgent, and what steps you should take. If it’s something that needs attention, it can suggest adding a service stop to your route. Or maybe switch your dashboard view to show more details. All of this happens through a natural voice interaction—no menus, no guesswork.

Figure 1. A gen AI in-car assistant in action.

Under the hood, this flow brings together several key technologies. Google Cloud’s Speech-to-Text and Text-to-Speech APIs handle the conversation. Document AI breaks the car manual into smaller, searchable chunks. Vertex AI generates text embeddings and powers the large language model. All of this connects to MongoDB Atlas, where Atlas Vector Search retrieves the most relevant content.
Vehicle signals are kept up to date using PowerSync, which enables real-time, bidirectional data sync. And, by using the Vehicle Signal Specification (VSS) from COVESA, we’re following a widely adopted standard that makes it easy to expand and integrate with more systems down the road.

Figure 2. Reference architecture overview.

This is just one example of how flexible, future-ready architecture can unlock powerful, intuitive in-car experiences.

Reimagining the driver experience

Smarter in-car assistants start with smarter architectures. As generative AI becomes more capable, the real differentiator is how well it connects to the right data—securely, in real time, and at scale. With MongoDB Atlas, automakers can accelerate innovation, simplify their architecture, and cut development costs to deliver more intuitive, helpful experiences. It’s not just about adding features—it’s about making them work better together, so drivers get real value from the technology built into their cars. Learn how to power end-to-end value chain optimization with AI/ML, advanced analytics, and real-time data processing for innovative automotive applications. Visit our manufacturing and automotive web page. Want to get hands-on experience? Explore our GitHub repository for an in-depth guide on implementing this solution.
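The retrieval step in the flow above—finding the manual chunks most similar to the embedded driver question—can be sketched as an Atlas Vector Search stage. The index name, field names, and oversampling factor below are assumptions for illustration, not the values used in the reference architecture.

```python
def build_manual_retrieval(query_vector: list, top_k: int = 3) -> list:
    """Sketch of a $vectorSearch pipeline that retrieves manual chunks.

    "manual_chunks_index" and the "embedding"/"text"/"section" fields are
    illustrative names; a real deployment would use its own schema.
    """
    return [
        {
            "$vectorSearch": {
                "index": "manual_chunks_index",
                "path": "embedding",
                "queryVector": query_vector,
                # Oversample candidates for better recall before ranking.
                "numCandidates": top_k * 20,
                "limit": top_k,
            }
        },
        {
            "$project": {
                "text": 1,
                "section": 1,
                "score": {"$meta": "vectorSearchScore"},
            }
        },
    ]
```

The `top_k` chunks returned by this pipeline would then be passed, along with the live vehicle signals, as context to the LLM that drafts the spoken answer.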

May 13, 2025

Introducing Automated Risk Analysis in Relational Migrator

When planning a complex home renovation, homeowners often turn to a team of experts to evaluate the project. Architects sketch out designs, structural engineers assess a house’s structure and foundation, and contractors estimate renovation costs and timelines. This process can take weeks or even months before construction begins, consuming valuable time and precious resources. The same is true for database migration projects. Solution architects planning a migration from legacy relational databases to modern platforms like MongoDB rely heavily on manual assessments by expert teams. These assessments—which involve analyzing database schemas to identify potential risks—can drain resources and delay progress. That’s where the new Pre-Migration Analysis feature in MongoDB Relational Migrator comes in. First, it uses advanced algorithms to automate much of the pre-migration evaluation process by analyzing a database’s schema. It then provides a detailed, customized report that highlights inconsistencies, flags potential issues like incompatible data types, and recommends actionable steps to ensure a successful migration to MongoDB. The report is dynamic, allowing you to refine it by marking items as completed or triaged. By providing a clear roadmap, this feature empowers you to plan and execute migrations with confidence while saving time and minimizing risk.

Why use Pre-Migration Analysis?

There are a number of benefits associated with using the new Pre-Migration Analysis tool, from faster migrations to the convenience of reporting. Here are some of the areas Pre-Migration Analysis can help with:

Minimized disruption: Migration planning often feels overwhelming because it risks diverting focus from your core business operations. A precise, detailed analysis upfront helps you avoid unnecessary disruptions by providing recommendations for success, saving your team from spinning its wheels and wasting resources.
Accurate resource allocation: Without a proper assessment, it’s hard to gauge the time, skills, and budget needed for the migration. Automating the evaluation process can give you a head start and allow you to allocate resources more effectively.

Faster time to value: An automated assessment accelerates the process and decision-making, letting you kick-start the migration sooner and advance your application modernization initiative.

Reduced technical debt: Without a precise evaluation, migrations can introduce inefficiencies or unresolved issues. By analyzing your ecosystem upfront, you ensure a smooth transition with fewer hiccups and better long-term stability.

Stronger business case: Having a detailed, shareable assessment report in hand makes it easier to justify the migration to stakeholders, showing the effort is well-planned, the risks are understood, and the potential ROI is worth the investment.

How Pre-Migration Analysis works

The Pre-Migration Analysis tool connects to your relational database and extracts the structure of tables, routines, and other components. It then applies automated rules to identify potential migration issues. These rules help flag areas that may need attention when transitioning to MongoDB. Each rule includes:

- Category: The type of issue (such as incompatible data types).
- Difficulty level: An estimate of the effort required to resolve the issue.
- Required action: Guidance on what actions are necessary, optional, or unnecessary. These actions are categorized into “Tasks,” which are necessary actions; “Risks,” or optional actions that may have some risk associated with them; and “Notes,” or how Relational Migrator will handle the migration.

Using these rules, the tool generates a detailed migration risk assessment report, complete with actionable recommendations to help ensure a successful migration.

Figure 1. Pre-migration impact analysis

The impact analysis shows all objects that require action before performing the migration. Additionally, the tool provides a “traffic light” migration confidence level, indicating the overall readiness of the migration. The migration confidence level is based on the types of identified issues and their complexity, giving you a clear indication of how prepared you are for the migration.

Figure 2. Pre-migration analysis summary

The pre-migration analysis summary gives an overview of the compatibility of your project and how many objects need attention before migrating. You can learn more about how Pre-Migration Analysis works in our documentation. For a demo of Pre-Migration Analysis in action, check out this video from the MongoDB product team. Pre-Migration Analysis is now in Public Preview. Download Relational Migrator now to check it out!
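To make the rule structure above concrete, here is a minimal sketch of how findings and a traffic-light roll-up could be modeled. This is purely illustrative—it is not Relational Migrator's actual rule engine or scoring logic, and the thresholds are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    category: str    # e.g. "Incompatible data type"
    difficulty: str  # "Low", "Medium", or "High"
    action: str      # "Task" (required), "Risk" (optional), or "Note" (informational)

def confidence_level(findings: list) -> str:
    """Roll findings up into a hypothetical traffic-light confidence level.

    Red: at least one required task that is hard to resolve.
    Yellow: remaining tasks or risks need review before migrating.
    Green: only informational notes (or nothing) remain.
    """
    if any(f.action == "Task" and f.difficulty == "High" for f in findings):
        return "red"
    if any(f.action in ("Task", "Risk") for f in findings):
        return "yellow"
    return "green"
```

Marking items as completed or triaged, as the dynamic report allows, would correspond to removing findings from the list and recomputing the level.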

May 12, 2025

People Who Ship: Building Centralized AI Tooling

Welcome to People Who Ship! In this new video and blog series, we'll be bringing you behind-the-scenes stories and hard-won insights from developers building and shipping production-grade AI applications using MongoDB. In each month's episode, your host—myself, Senior AI Developer Advocate at MongoDB—will chat with developers from both inside and outside MongoDB about their projects, tools, and lessons learned along the way. Are you a developer? Great! This is the place for you; People Who Ship is by developers, for developers. And if you're not (yet) a developer, that's great too! Stick around to learn how your favorite applications are built. In this episode, John Ziegler, Engineering Lead on MongoDB's internal generative AI (gen AI) tooling team, shares technical decisions made and practical lessons learned while developing a centralized infrastructure called Central RAG (RAG = retrieval-augmented generation), which enables teams at MongoDB to rapidly build RAG-based chatbots and copilots for diverse use cases.

John’s top three insights

During our conversation, John shared a number of insights learned during the Central RAG project. Here are the top three:

1. Enforce access controls across all operations

Maintaining data sensitivity and privacy is a key requirement when building enterprise-grade AI applications. This is especially important when curating data sources and building centralized infrastructure that teams and applications across the organization can use. In the context of Central RAG, for example, users should only be able to select or link data sources that they have access to as knowledge sources for their LLM applications. Even at query time, the LLM should only pull information that the querying user has access to as context to answer the user's query. Access controls are typically enforced by an authentication service using access control lists (ACLs) that define the relationships between users and resources.
In Central RAG, this is managed by Credal’s permissions service. You can check out this article that shows you how to build an authentication layer using Credal’s permissions service and other tools like OpenFGA.

2. Anchor your evaluations in the problem you are trying to solve

Evaluation is a critical aspect of shipping software, including LLM applications. It is not a one-and-done process—each time you change any component of the system, you need to ensure that it does not adversely impact the system's performance. The evaluation metrics depend on your application's specific use cases. For Central RAG, which aims to help teams securely access relevant and up-to-date data sources for building LLM applications, the team incorporates the following checks as integration and end-to-end tests in its CI/CD pipeline:

- Ensure access controls are enforced when adding data sources.
- Ensure access controls are enforced when retrieving information from data sources.
- Ensure that data retention policies are respected, so that removed data sources are no longer retrieved or referenced downstream.
- Use LLM-as-a-judge to evaluate response quality across various use cases with a curated dataset of question-answer pairs.

If you would like to learn more about evaluating LLM applications, we have a detailed tutorial with code.

3. Educate your users on what’s possible and what’s not

User education is critical yet often overlooked when deploying software. This is especially true for this new generation of AI applications, where explaining best practices and setting clear expectations can prevent data security issues and user frustration. For Central RAG, teams must review the acceptable use policies, legal guidelines, and documentation on available data sources and appropriate use cases before gaining access to the platform.
These materials also highlight scenarios to avoid, such as connecting sensitive data sources, and provide guidance on prompting best practices to ensure users can effectively leverage the platform within its intended boundaries.

John’s AI tool recommendations

The backbone of Central RAG is a tool called Credal. Credal provides a platform for teams to quickly create AI applications on top of their data. As maintainers of Central RAG, Credal allows John’s team to create a curated list of data sources for teams to choose from and to manage applications created by different teams. Teams can choose from the curated list or connect custom data sources via connectors, select from an exhaustive list of large language models (LLMs), configure system prompts, and deploy their applications to platforms like Slack directly from the Credal UI or via its API.

Surprising and delighting users

Overall, John describes his team’s goal with Central RAG as “making it stunningly easy for teams to build RAG applications that surprise and delight people.” We see several organizations adopting this central RAG model both to democratize the development of AI applications and to reduce their teams' time to impact. If you are working on similar problems and want to learn about how MongoDB can help, submit a request to speak with one of our specialists. If you would like to explore on your own, check out our self-paced AI Learning Hub and our gen AI examples GitHub repository.
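As a general illustration of John's first insight—enforcing access controls at query time—a retrieval pipeline can pre-filter candidate documents by the querying user's groups so the LLM only ever sees context the user is entitled to. Central RAG itself delegates this to Credal's permissions service; the index and field names below are assumptions for the sketch, not Central RAG's actual schema.

```python
def build_acl_vector_search(query_vector: list, user_groups: list,
                            top_k: int = 5) -> list:
    """Sketch of an ACL-aware $vectorSearch pipeline.

    Each knowledge-base document is assumed to carry an "allowed_groups"
    field (indexed as a filter field); retrieval is restricted to documents
    whose allowed_groups overlap the querying user's groups.
    """
    return [
        {
            "$vectorSearch": {
                "index": "kb_index",
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": top_k * 20,
                "limit": top_k,
                # Pre-filter: only documents this user may read are
                # candidates, so restricted content never reaches the LLM.
                "filter": {"allowed_groups": {"$in": user_groups}},
            }
        }
    ]
```

Because the filter is applied before similarity ranking, a user with no matching groups simply retrieves nothing, which is the safe failure mode for an enterprise RAG system.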

May 12, 2025

Capgemini & MongoDB: Smarter AI and Data for Business

AI is reshaping the way enterprises operate, but one fundamental challenge still exists: Most applications were not built with AI in mind. Traditional enterprise systems are designed for transactions, not intelligent decision-making, making it difficult to integrate AI at scale. To bridge this gap, MongoDB and Capgemini are enabling businesses to modernize their infrastructure, unify data platforms, and power AI-driven applications. This blog explores the trends driving the AI revolution and the role that Capgemini and MongoDB play in powering AI solutions.

The challenge: Outdated infrastructure is slowing AI innovation

In talking to many customers across industries, we have heard the following key challenges in adopting AI:

Data fragmentation: Organizations have long struggled with siloed data, where operational and analytical systems exist separately, making it difficult to unify data for AI-driven insights. In fact, according to the Workday Global survey, 59% of C-suite executives said their organizations' data is somewhat or completely siloed, which results in inefficiencies and lost opportunities. Moreover, AI workloads such as retrieval-augmented generation (RAG), semantic search, and recommendation engines require vector databases, yet most traditional data architectures fail to support these new AI-driven capabilities.

Lack of AI-ready data infrastructure: Without AI-ready data infrastructure, developers are forced to work with multiple disconnected systems, adding complexity to the development process. Instead of seamlessly integrating AI models, developers often have to manually sync data, join query results across multiple platforms, and ensure consistency between structured and unstructured data sources. This not only slows down AI adoption but also significantly increases the operational burden.
The solution: AI-ready data infrastructure with MongoDB and Capgemini

Together, MongoDB and Capgemini provide enterprises with the end-to-end capabilities needed to modernize their data infrastructure and harness AI's full potential. MongoDB provides a flexible document model that allows businesses to store and query structured, semi-structured, and unstructured data seamlessly—a critical need for AI-powered applications. Its vector search capabilities enable semantic search, recommendation engines, RAG, and anomaly detection, eliminating the need for complex data pipelines while reducing latency and operational overhead. Furthermore, MongoDB’s distributed and serverless architecture ensures scalability, allowing businesses to deploy real-time AI workloads like chatbots, intelligent search, and predictive analytics with the agility and efficiency needed to stay competitive. Capgemini plays a crucial role in this transformation by leveraging AI-powered automation and migration frameworks to help enterprises restructure applications, optimize data workflows, and transition to AI-ready architectures built on MongoDB. Using generative AI, Capgemini enables organizations to analyze existing systems, define data migration scripts, and seamlessly integrate AI-driven capabilities into their operations.

Real-world use cases

Let's explore impactful real-world use cases where MongoDB and Capgemini have collaborated to drive cutting-edge AI projects.

AI-powered field operations for a global energy company: Workers in hazardous environments, such as oil rigs, previously had to complete complex 75-field forms, which slowed down operations and increased safety risks. To streamline this process, the company implemented a conversational AI interface, allowing workers to interact with the system using natural language instead of manual form-filling.
This AI-driven solution has been adopted by 120,000+ field workers, significantly reducing administrative workload, improving efficiency, and enhancing safety in high-risk conditions. AI-assisted anomaly detection in the automotive industry: Manual vehicle inspections often led to delays in diagnostics and high maintenance costs, making it difficult to detect mechanical issues early. To address this, an automotive company implemented AI-powered engine sound analysis, which used vector embeddings to identify anomalies and predict potential failures before they occurred. This proactive approach has reduced breakdowns, optimized maintenance scheduling, and improved overall vehicle reliability, ensuring cost savings and enhanced operational efficiency. Making insurance more efficient: GenYoda, an AI-driven solution developed by Capgemini, is revolutionizing the insurance industry by enhancing the efficiency of professionals through advanced data analysis. By harnessing the power of MongoDB Atlas Vector Search, GenYoda processes vast amounts of customer information including policy statements, premiums, claims histories, and health records to provide actionable insights. This comprehensive analysis enables insurance professionals to swiftly evaluate underwriters' reports, construct detailed health summaries, and optimize customer interactions, thereby improving contact center performance. Remarkably, GenYoda can ingest 100,000 documents within a few hours and deliver responses to user queries in just two to three seconds, matching the performance of leading AI models. The tangible benefits of this solution are evident; for instance, one insurer reported a 15% boost in productivity, a 25% acceleration in report generation—leading to faster decision-making—and a 10% reduction in manual efforts associated with PDF searches, culminating in enhanced operational efficiency. 
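A minimal sketch of the embedding-based anomaly check described in the automotive use case above: compare a new engine-sound embedding against known-healthy reference embeddings and flag it when the best similarity falls below a threshold. The threshold, vectors, and function names are illustrative assumptions, not details of the actual Capgemini solution.

```python
import math

def cosine_similarity(a: list, b: list) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_anomalous(sound_embedding: list, healthy_embeddings: list,
                 threshold: float = 0.8) -> bool:
    """Flag an engine-sound embedding as anomalous when its best match
    against healthy reference embeddings is below the threshold.

    The 0.8 threshold is an arbitrary example value; in practice it would
    be tuned against labeled recordings.
    """
    best = max(cosine_similarity(sound_embedding, ref)
               for ref in healthy_embeddings)
    return best < threshold
```

In a production system the reference embeddings would live in a vector index (such as MongoDB Atlas Vector Search) rather than in memory, but the decision rule is the same: low similarity to every healthy example is treated as a potential fault.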
Conclusion

As AI becomes operational, real-time, and mission-critical for enterprises, businesses must modernize their data infrastructure and integrate AI-driven capabilities into their core applications. With MongoDB and Capgemini, enterprises can move beyond legacy limitations, unify their data, and power the next generation of AI applications. For more, watch this TechCrunch Disrupt session by Steve Jones (EVP, Data-Driven Business & Gen AI at Capgemini) and Will Shulman (former VP of Product at MongoDB) to learn about more real-world use cases. And discover how Capgemini and MongoDB are driving innovation with AI and data solutions.

May 8, 2025

Reimagining Investment Portfolio Management with Agentic AI

Risk management in capital markets is becoming increasingly complex for investment portfolio managers. The need to process vast amounts of data—from real-time market data to unstructured social media data—demands a level of flexibility and scalability that traditional systems struggle to keep up with. AI agents—a type of artificial intelligence that can operate autonomously and take actions based on goals and real-world interactions—are set to transform how investment portfolios are managed. According to Gartner, 33% of enterprise software applications will include agentic AI by 2028, up from less than 1% in 2024, and at least 15% of day-to-day work decisions will be made autonomously through AI agents. 1 MongoDB empowers AI agents to effectively transform the landscape of investment portfolio management. By leveraging the combination of large language models (LLMs), retrieval-augmented generation (RAG), and MongoDB Atlas Vector Search, AI agents can analyze vast financial datasets, detect patterns, and adapt dynamically to changing conditions in real time. This advanced intelligence elevates decision-making and empowers portfolio managers to enhance portfolio performance, manage market risks more effectively, and perform precise asset impact analysis.

Intelligent investment portfolio management

Investment portfolio management is the process of selecting, balancing, and monitoring a mix of financial assets—such as stocks, bonds, commodities, and derivatives—to achieve a higher return on investment (ROI) while managing risk effectively and proactively. It involves thoughtful asset allocation, diversification to mitigate market volatility, and continuous monitoring of market conditions and the performance of underlying assets to stay aligned with investment objectives.
To stay relevant today, investment portfolio management requires the integration of diverse unstructured alternative data, like financial news, social media sentiment, and macroeconomic indicators, alongside structured market data such as price movements, trading volumes, indices, spreads, and historical execution records. This complex data integration introduces a new level of sophistication in portfolio analytics, as outlined in Figure 1. It requires a flexible, scalable, unified data platform that can efficiently store, retrieve, and manage such diverse datasets, paving the way for next-gen portfolio management solutions. Figure 1. Investment portfolio analysis Incorporating MongoDB’s flexible schema accelerates data ingestion across various data sources—such as real-time market feeds, historical performance records, and risk metrics. Portfolio management solutions enriched with alternative data support more intelligent decision-making and proactive market risk mitigation. This paradigm shift yields deeper insights, enhances alpha generation, and refines asset reallocation with greater precision, underscoring the critical role of data in intelligent portfolio management. How MongoDB unlocks AI-powered portfolio management AI-powered portfolio asset allocation has become a desirable characteristic of modern investment strategies. By leveraging AI-based portfolio analysis, portfolio managers gain access to advanced tools that provide insights tailored to specific financial objectives and risk tolerances. This approach optimizes portfolio construction by recommending an alternate mix of assets—ranging from equities and bonds to ETFs and emerging opportunities—while continuously assessing evolving market conditions. 
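As a miniature sketch of the flexible-schema ingestion described above, the toy snippet below keeps structured market data and unstructured alternative data side by side as documents with different shapes. All field names and values here are invented for illustration; in a real deployment these would be inserted into a MongoDB collection with `insert_many` and queried with `find`.

```python
# Toy documents: structured feeds, alternative data, and risk metrics
# coexist in one collection without a fixed schema.
docs = [
    {"symbol": "ACME", "kind": "tick",
     "price": 102.35, "volume": 1_250_000},          # structured market feed
    {"symbol": "ACME", "kind": "news",
     "headline": "ACME beats earnings expectations",
     "sentiment": 0.72},                             # unstructured alternative data
    {"symbol": "ACME", "kind": "risk",
     "var_95": 0.031},                               # risk metric
]

def find_by_symbol(collection, symbol):
    """In-memory stand-in for collection.find({'symbol': symbol})."""
    return [d for d in collection if d["symbol"] == symbol]

# One query spans all three document shapes with no schema migration.
results = find_by_symbol(docs, "ACME")
print(len(results))  # 3
```

The point of the sketch is that adding a new data source (say, an ESG score) is just another document shape, not a schema change.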
Figure 2 illustrates a proposed workflow for AI-powered investment portfolio management that brings diverse market data, including stock prices, the volatility index (VIX), and macroeconomic indicators such as GDP, interest rates, and the unemployment rate, into an AI analysis layer to generate actionable, more intelligent insights. Figure 2. AI-powered investment portfolio management MongoDB’s versatile document model unlocks a more intuitive way to store and retrieve structured, semi-structured, and unstructured data, aligned with the way developers structure objects inside their applications. In capital markets, time series are often used to store time-based trading and market data. MongoDB time series collections are optimal for analyzing data over time; they are designed to efficiently ingest large volumes of market data with high performance and dynamic scalability. Discovering insights and patterns from MongoDB time series collections is easier and more efficient thanks to fast underlying ingestion and retrieval mechanisms. By taking advantage of MongoDB Atlas Charts' business intelligence dashboards and evaluating advanced AI-generated investment insights, portfolio managers gain sophisticated capabilities that integrate high-dimensional insights derived from diverse datasets, revealing new patterns that can lead to better decision-making for alpha generation and higher portfolio performance. MongoDB Atlas Vector Search plays a critical role in the analysis of market news sentiment by enabling context-aware retrieval of related news articles. Traditional keyword-based searches often fail to capture semantic relationships between news stories, while vector search, powered by embeddings, allows for a more contextual understanding of how different articles relate to a stock's sentiment. 
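Under the hood, a vector search like this ranks documents by embedding similarity. The toy sketch below shows the core idea with tiny hypothetical three-dimensional embeddings standing in for real model output; in production, Atlas's `$vectorSearch` aggregation stage performs the nearest-neighbor retrieval at scale over an indexed collection, and the sentiment scores would come from an upstream model.

```python
import math

# Hypothetical news articles with stand-in embeddings and sentiment scores.
news = [
    {"title": "Chipmaker raises guidance",   "vec": [0.9, 0.1, 0.0], "sentiment":  0.8},
    {"title": "Semiconductor demand surges", "vec": [0.8, 0.2, 0.1], "sentiment":  0.6},
    {"title": "Airline cancels flights",     "vec": [0.0, 0.9, 0.3], "sentiment": -0.5},
]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def related_sentiment(query_vec, k=2):
    """Rank articles by similarity to the query embedding, then average
    sentiment over the top-k matches: the 'aggregated sentiment' idea."""
    ranked = sorted(news, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return sum(d["sentiment"] for d in ranked[:k]) / k

# A query embedding near the semiconductor stories yields their combined
# sentiment rather than a single article's score.
score = related_sentiment([0.85, 0.15, 0.05])
print(round(score, 2))  # 0.7
```

Averaging over the top-k neighbors is what keeps one outlier headline from dominating the final sentiment score.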
Storing news as vectors: When stock-related news is ingested, each article is vectorized into a high-dimensional numerical representation using an embedding model. These embeddings encapsulate the meaning and context of the text, rather than just individual words. The raw news articles are embedded and stored in MongoDB as vectors. Finding related news: Vector search finds news articles based on similarity, even if they don’t contain the exact same stock information. This helps in identifying patterns and trends across multiple news articles based on contextual similarity. Enhancing sentiment calculation: Instead of relying on a single article's sentiment, a final sentiment score is aggregated from multiple related news sources with similar, relevant content. This prevents an individual outlier article from skewing the result and provides a more holistic view of market news sentiment. Agentic AI foundation Agentic AI incorporates an orchestrator layer that manages task execution in workflows. AI agents can operate either fully autonomously or semi-autonomously with a human-in-the-loop (HITL). AI agents are equipped with advanced tools, models, memory, and data storage. Memory leverages both long- and short-term contextual data for informed decision-making and continuity across interactions. Tools and models enable AI agents to decompose tasks into steps and execute them cohesively. Data storage and retrieval are pivotal to AI agent effectiveness and can be advanced by embedding and vector search capabilities. Figure 3. Agentic AI foundation AI agents’ key characteristics: Autonomy: The ability to make decisions dynamically based on the situation and to execute tasks with minimal human intervention. Chain of thought: The ability to perform step-by-step reasoning, breaking complex problems into smaller logical steps for better judgment and decision-making. 
Context awareness: AI agents continuously adapt their actions based on changing environmental conditions. Learning: AI agents improve their performance over time by adapting and enhancing their behavior. Intelligent investment portfolio management with AI agents AI agents are positioned to revolutionize portfolio management by shifting from rule-based to adaptive, context-aware, AI-powered decision-making. AI-enabled portfolio management applications continuously learn, adapt, and optimize investment strategies more proactively and effectively. The future isn’t about AI replacing portfolio managers, but rather about humans and AI working together to create more intelligent, adaptive, and risk-aware portfolios. Portfolio managers who leverage AI gain a competitive edge and deeper insights that can significantly enhance portfolio performance. The solution, illustrated in Figure 4 below, includes a data ingestion application, three AI agents, and a market insight application that work in harmony to create a more intelligent, insights-driven approach to portfolio management. Data ingestion application The data ingestion application runs continuously, captures various market data, and stores it in time series or standard collections in MongoDB. Market data: Collects and processes real-time market data, including prices, volumes, trade activity, and the volatility index. Market news: Captures and extracts market and stock-related news. News data is vectorized and stored in MongoDB. Market indicators: Retrieves key macroeconomic and financial indicators, such as GDP, interest rates, and the unemployment rate. AI agents In this solution, there are three AI agents. The market analysis agent and the market news agent have AI analytical workflows. They run on a daily schedule in a fully automated fashion, producing the expected output and storing it in MongoDB. The market assistant agent has a more dynamic workflow and is designed to act as an assistant to a portfolio manager. 
It works based on prompt engineering and agentic decision-making. The market assistant agent can respond to questions about asset reallocation and market risks based on current market conditions, bringing new AI-powered insights to portfolio managers. Market analysis agent: Analyzes market trends, volatility, and patterns to generate insights related to the risk of portfolio assets. Market news agent: Assesses news sentiment for each asset by analyzing news that can directly or indirectly impact portfolio performance. This agent is powered by MongoDB vector search. Market assistant agent: On demand and through a prompt, answers portfolio managers’ questions about market trends, risk exposure, and portfolio allocation by using the data sources and insights that the other agents create. Market insight application The market insight application is a visualization layer that provides portfolio managers with charts, dashboards, and reports: a series of actionable investment insights built from the outputs of the AI agents. This information is generated automatically on a predetermined daily schedule and presented to portfolio managers. Figure 4. Investment portfolio management powered by MongoDB AI agents AI agents enable portfolio managers to take an intelligent, risk-based approach by analyzing the impact of market conditions on the portfolio and its investment goals. The AI agents capitalize on MongoDB’s powerful capabilities, including the aggregation framework and vector search, combined with embedding and generative AI models, to perform intelligent analysis and deliver insightful portfolio recommendations. Next steps According to Deloitte, by 2027, AI-driven investment tools will become the primary source of advice for retail investors, with adoption of AI-powered investment management solutions projected to grow to around 80% by 2028. 
2 By leveraging AI agents and MongoDB, financial institutions can unlock the full potential of AI-driven portfolio management to obtain advanced insights that allow them to stay ahead of market shifts, optimize investment strategies, and manage risk with greater confidence. MongoDB lays a strong foundation for the agentic AI journey and the implementation of next-gen investment portfolio management solutions. To learn more about how MongoDB can power AI innovation, check out these additional resources: Transforming capital markets with MongoDB | Solutions page Launching an agentic RAG chatbot with MongoDB and Dataworkz | Solutions page Demystifying AI Agents: A Guide for Beginners 7 Practical Design Patterns for Agentic Systems 1 Sun, D., “Capitalize on the AI Agent Opportunity,” Gartner, February 27, 2025. 2 AI, wealth management and trust: Could machines replace human advisors? , World Economic Forum, March 17, 2025.

May 7, 2025

MongoDB 8.0, Predefined Roles Now Available on DigitalOcean

I’m pleased to announce that MongoDB 8.0 is now available on DigitalOcean Managed MongoDB, bringing enhanced performance, scalability, and security to DigitalOcean’s fully managed MongoDB service. This update improves query efficiency, expands encryption capabilities, and optimizes scaling for large workloads. Additionally, DigitalOcean Managed MongoDB now includes role-based access control (RBAC) with predefined roles, making it easier to manage access control, enhance security, and streamline database administration across MongoDB clusters on DigitalOcean. DigitalOcean is one of MongoDB’s premier Certified by MongoDB DBaaS partners, and since launching our partnership in 2021, developer productivity has been its core focus. These new enhancements to DigitalOcean Managed MongoDB are a testament to the importance of enabling developers, startups, and small and medium-sized businesses to rapidly build, deploy, and scale applications to accelerate innovation and increase productivity and agility. What’s new in MongoDB 8.0? MongoDB 8.0 features several upgrades designed to enhance its performance, security, and ease of use. Whether you’re managing high-throughput applications or looking for better query optimization, these improvements make DigitalOcean Managed MongoDB even more powerful: Higher throughput and improved replication performance: Dozens of architectural optimizations in MongoDB 8.0 have improved query and replication speed across the board. Better time series handling: Store and manage time series data more efficiently, helping to enable higher throughput with lower resource usage and costs. Expanded Queryable Encryption: MongoDB 8.0 adds range queries to Queryable Encryption, enabling new use cases for secure data operations. With encrypted searches that don’t expose sensitive data, MongoDB 8.0 enhances both privacy and compliance. 
Greater performance control: Set default maximum execution times for queries and persist query settings after restarts, providing more predictable database performance. MongoDB 8.0 features 36% better read throughput, 59% faster bulk writes, 200% faster time series aggregations, and new sharding capabilities that distribute data across shards up to 50 times faster—making MongoDB 8.0 the most secure, durable, available, and performant version of MongoDB yet. Learn more about MongoDB 8.0 on our release page. Benefits of RBAC for DigitalOcean Managed MongoDB Managing database access across organizations can be a challenge, especially as teams grow and security requirements become more complex. Without a structured approach, organizations risk unauthorized access, operational inefficiencies, and compliance gaps. With RBAC now available in their MongoDB environments, DigitalOcean Managed MongoDB users can avoid these risks and enforce clear, predefined access policies, helping to ensure secure, efficient, and scalable database management. Here’s how RBAC can benefit your business: Stronger data protection: Keep your sensitive information secure by ensuring that only authorized users have access, reducing the risk of data breaches and strengthening overall security. Less manual work, fewer errors: Predefined roles make it easier to manage user access, cutting down on time-consuming manual tasks and minimizing the risk of mistakes. Easier compliance management: Stay ahead of industry regulations with structured access controls that simplify audits and reporting, giving you peace of mind. Lower costs & reduced risk: Automating access management reduces administrative overhead and helps prevent costly security breaches. Seamless scalability: As your business grows, easily adjust user permissions to match evolving team structures and operational needs. 
Simplified access control: Manage database access efficiently by assigning roles at scale, making administration more intuitive and governance more effective. DigitalOcean Managed MongoDB: Better than ever With the introduction of MongoDB 8.0 and RBAC, DigitalOcean Managed MongoDB is now more powerful, secure, and efficient than ever. Whether you’re scaling workloads, optimizing queries, or strengthening security, these updates empower you to manage your MongoDB clusters with greater confidence and ease. Get started today and take full advantage of these cutting-edge enhancements in DigitalOcean’s Managed MongoDB! To create a new cluster with MongoDB 8.0, or to upgrade your existing cluster through the DigitalOcean Control Panel or API, check out the DigitalOcean site . Read more about these new features in DigitalOcean's blog about MongoDB 8.0 and RBAC , or simply try DigitalOcean Managed MongoDB by getting started here !
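To make the predefined-roles idea concrete, MongoDB's `createUser` command takes a roles array that scopes each role to a database. The sketch below only assembles the command document (the user name, password, and database names are hypothetical); in practice it would be sent to a live cluster, for example with PyMongo's `db.command(...)`, and the password would come from a secrets manager rather than source code.

```python
def build_create_user(username, password, roles):
    """Assemble a MongoDB createUser command document that grants
    predefined roles (e.g. read, readWrite, dbAdmin), each scoped
    to a specific database."""
    return {
        "createUser": username,
        "pwd": password,
        "roles": [{"role": role, "db": db} for role, db in roles],
    }

# A reporting user limited to read-only access on an analytics database,
# plus readWrite on a staging database: least privilege by construction.
cmd = build_create_user(
    "report_user",
    "s3cret",  # hypothetical; never hard-code credentials in practice
    [("read", "analytics"), ("readWrite", "staging")],
)
print(cmd["roles"])
```

Because the roles array is explicit, auditing who can do what reduces to inspecting these documents, which is the compliance benefit described above.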

May 7, 2025

Ubuy Scales E-Commerce Globally and Unlocks AI With MongoDB

In today’s digital era, global e-commerce presents a major growth opportunity. This is particularly true for businesses looking to expand beyond their local markets. While some companies thrive by serving domestic customers, others capitalize on cross-border e-commerce to reach a wider audience. Ubuy , a Kuwait-based e-commerce company, is tapping into this opportunity. Operating in over 180 countries, Ubuy enables customers worldwide to purchase products that may not be available in their local markets. The Ubuy App, available on both iOS and Android, supports over 60 languages and is a popular way to access Ubuy’s platform. Ubuy simplifies logistics, customs, and shipping to create a seamless shopping experience. It acts as a bridge between customers and international sellers. Unlike traditional marketplaces, Ubuy provides end-to-end services, from sourcing products and performing quality checks to handling shipping and customs. This ensures that products, even those newly launched in international markets, are accessible to buyers globally with minimal hassle. Founded in 2012, Ubuy initially focused on the Gulf Cooperation Council region. Having identified a gap in the availability of international products there, it spent the next three years expanding its services. Today, Ubuy offers a diverse catalog of over 300 million products, and customers worldwide can access products from nine warehouses strategically located in the United States, the United Kingdom, China (including Hong Kong), Turkey, Korea, Japan, Germany, and Kuwait. Scaling such a vast operation represented a significant technological challenge. MongoDB Atlas proved critical in enabling Ubuy to scale its operations and address specific search performance and inventory management issues. Overcoming search and scalability challenges Before adopting MongoDB Atlas, Ubuy relied on MySQL to manage product data and search functions. 
However, this model’s limitations led to performance bottlenecks: it couldn’t handle large-scale search operations, lacked high availability, and struggled to manage complex search queries from customers across different markets. Slow query responses, averaging as much as 4–5 seconds per search, impacted the user experience, making it critical for Ubuy to identify a more scalable and performant solution. Ubuy migrated to MongoDB Atlas and implemented both MongoDB Atlas Search and MongoDB Atlas Vector Search to overcome these hurdles. By using these products, Ubuy significantly improved search efficiency, reducing response times to milliseconds. The company can now ensure high search relevancy, enabling users to find products more accurately and quickly. Migrating a large platform to MongoDB Atlas At Ubuy’s scale, the migration to MongoDB Atlas required careful planning. In March 2023, the team conducted a proof of concept to test MongoDB Atlas’s capabilities in handling its vast inventory. A month later, the migration was complete: Ubuy had transitioned from MySQL to a fully managed MongoDB Atlas environment. The transition was seamless, with no downtime. The MongoDB team provided ongoing guidance to help Ubuy optimize search filters and facilitate a smooth integration with its existing e-commerce systems. The result was an improved customer experience through faster and more relevant search results. Ubuy chose MongoDB Atlas for three key reasons: Scalability: MongoDB Atlas provides the ability to handle massive data loads efficiently, enabling smooth search performance even during peak traffic. High availability: As a fully managed cloud database, MongoDB Atlas provides resilience and reduces downtime. AI-powered search: MongoDB Atlas Search improves Ubuy’s product discovery experience, helping customers find the right products without wading through irrelevant results. Additionally, MongoDB Atlas Vector Search provides semantic search capabilities. 
This enables more intuitive product discovery based on intent rather than merely on keywords, enhancing customer satisfaction. Using AI-powered enhancements to drive customer engagement Beyond improving search performance, Ubuy has been enhancing its customers’ shopping experience through AI. Ubuy integrated AI-powered search and recommendation systems with MongoDB Atlas’s vector database capabilities. This enabled a transition from simple keyword-based searches to a more intuitive, intent-driven discovery experience. For example, when a user searches for a specific keyword, like “Yamaha guitar,” the AI-enhanced product page now provides structured information on this product’s suitability for beginners, professionals, and trainers. This improves user experience and enhances SEO visibility, driving organic traffic to Ubuy’s platform. “With MongoDB Atlas Search and Atlas Vector Search, we are able to deliver personalized product recommendations in real-time, making it easier for customers to find what they need faster than ever before,” said Mr. Omprakash Swami, Head of IT at Ubuy. Achieving response speed and business growth Since implementing MongoDB Atlas and AI-driven enhancements, Ubuy has seen remarkable improvements: Search response time reduced from 4–5 seconds to milliseconds Over 150 million search queries handled annually with improved relevancy Higher engagement on product pages due to AI-enriched content Ability to scale inventory beyond 300 million products with zero performance concerns “Moving to MongoDB Atlas and being able to use features such as Atlas Vector Search have been a game changer,” said Swami. “The ability to handle massive search queries in milliseconds while maintaining high relevancy has dramatically improved our customer experience and business operations. 
The flexibility of MongoDB Atlas has not only improved our search performance but also set the stage for AI-powered innovations that were previously impossible with our relational database setup.” Enhancing the future of e-commerce Looking ahead, Ubuy aims to optimize search by consolidating inventory visibility across multiple stores. The goal is to enable users to search across all warehouses from a single interface, delivering even greater convenience. Ubuy’s transformation showcases how employing MongoDB Atlas, along with its fully-integrated search capabilities and AI-driven insights, can significantly enhance global e-commerce operations. By addressing scalability and search relevance challenges, the company has positioned itself as a leader in cross-border e-commerce. With a relentless focus on innovation, Ubuy is set to redefine how consumers access international products. Together, Ubuy and MongoDB are helping make shopping across borders effortless and efficient. Visit our product page to learn more about MongoDB Atlas Search. Check out our Atlas Vector Search Quick Start Guide to get started with Vector Search today. Boost your MongoDB skills with our Atlas Learning Hub .
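As a rough sketch of the kind of product search described in this story, an Atlas Search request is expressed as a `$search` aggregation stage. The function below only assembles the pipeline; the index name `default` and the field paths `title` and `description` are hypothetical, and in practice the pipeline would be executed with `collection.aggregate(pipeline)` against a cluster that has a matching search index defined.

```python
def product_search_pipeline(query, limit=10):
    """Build an aggregation pipeline using Atlas Search's $search stage,
    with fuzzy text matching over a couple of product fields."""
    return [
        {"$search": {
            "index": "default",                    # hypothetical index name
            "text": {
                "query": query,
                "path": ["title", "description"],  # hypothetical fields
                "fuzzy": {"maxEdits": 1},          # tolerate small typos
            },
        }},
        {"$limit": limit},
        # Surface the relevance score alongside each product.
        {"$project": {"title": 1, "price": 1,
                      "score": {"$meta": "searchScore"}}},
    ]

pipeline = product_search_pipeline("yamaha guitar")
print(pipeline[0]["$search"]["text"]["query"])
```

The `fuzzy` option is one reason a search like "yamha guitar" can still return relevant results, which keyword matching in a relational setup would miss.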

May 6, 2025

MongoDB Atlas and Generative AI: The Next Evolution of Bank SinoPac's Digital Financial Services

Amid the rapid growth of fintech, Bank SinoPac (永豐銀行) has been actively applying big data analytics, AI, and blockchain technology to launch distinctive financial products and services, working to elevate the consumer experience and advance financial inclusion. The bank has won strong favor among younger customers and secured a firm position in an increasingly competitive financial market. Yang Wen-Yuan (楊文淵), Senior Project Deputy Manager in Bank SinoPac's Information Division, said: "The speed at which an enterprise adopts generative AI will determine its competitiveness in its industry. By pairing MongoDB Atlas with generative AI, we successfully delivered two major projects, SinoPac Cloud Chat! and the Investment Crystal Ball, which have improved our employees' productivity and given consumers better financial services." Generative AI sweeps the globe, making innovative services urgent Generative AI, which has shown its power across many fields, has recently become a key technology for enterprises developing innovative services. Facing hard-to-predict consumer demands and intensifying competition in the financial industry, Bank SinoPac actively explored how to use generative AI to develop innovative services and improve employee productivity. After adopting the enterprise edition of Azure OpenAI GPT-3.5 in the first quarter of 2023, the bank held an internal "Bank SinoPac Let's Chat: GAI proposal competition"; at the time, it had already launched a smart income-and-expense ledger built on MongoDB Enterprise Advanced. Upon learning that MongoDB Atlas supported Vector Search, the Information Division decided to adopt the service to drive the "SinoPac Cloud Chat!" and "Investment Crystal Ball" projects. Flexible data structures and high availability are key to better application services MongoDB Atlas became Bank SinoPac's database platform of choice first because of its flexible data structure, which lets the development team quickly implement and validate new ideas. Atlas also delivers Atlas Search and Vector Search in a single database platform, so the SinoPac Cloud development team can build its generative AI content plans in one place, while the operations team saves the time and staffing needed to maintain two separate databases. Fan-Chiang Chun-Hao (范姜峻浩), Project Deputy Manager in the Information Division, noted that a major advantage of cloud services is easy resource adjustment: the development team can quickly provision cloud resources to validate the feasibility of ideas, and once a service goes live it can auto-scale with load to handle sudden bursts of reads and writes. "Of course, the earlier success of the smart ledger project was also a main reason we went further with MongoDB Atlas." With MongoDB Atlas's help, SinoPac Cloud Chat! and the Investment Crystal Ball launched in March and July 2024, respectively. SinoPac Cloud Chat! offers copywriting generation, text summarization, and translation, helping employees work more efficiently, while its lookup services for regulations, procedures, business contacts, application forms, and documents save employees tedious search time. As of the first half of 2024, SinoPac Cloud Chat! had reached 100% coverage across the bank's departments, with 28,908 cumulative calls in August. The Orbit.AI Investment Crystal Ball service for Bank SinoPac customers classifies and filters the news most relevant to three major markets from a news database spanning more than 500 media platforms worldwide. Aided by generative AI, together with parameters such as time sequencing, news weighting, and the professional outlook of SinoPac Investment Advisory, it produces the most current key market information from the past five days, helping customers grasp trends quickly. In addition, Chang Chuan-Ming (張傳銘), Project Deputy Manager in the Information Division, said: "MongoDB Atlas Vector Search not only stores vectors but also searches vector data directly, serving as the foundation for retrieval-augmented generation (RAG), semantic search, recommendation engines, and dynamic personalization. Unified management under MongoDB was also key to our adoption." "In an era where consumer experience comes first, the financial industry is accelerating the launch of innovative services to deliver the best possible consumer experience. From the smart ledger in 2023 to SinoPac Cloud Chat! and the Investment Crystal Ball in 2024, we will keep using generative AI to launch innovative financial services, and MongoDB will remain an indispensable partner," said Yang Wen-Yuan, Senior Project Deputy Manager of Bank SinoPac's Information Division.

May 6, 2025

Teach & Learn with MongoDB: Professor Chanda Raj Kumar

Welcome to the second edition of our series highlighting how educators and students worldwide are using MongoDB to transform learning. In this post, we chat with Professor Chanda Raj Kumar of KL University Hyderabad. The MongoDB for Educators program provides free resources like curriculum materials, MongoDB Atlas credits, certifications, and access to a global community of more than 700 universities—helping educators teach practical database skills and inspire future tech talent. Applied learning: Using MongoDB in real-world teaching Chanda Raj Kumar , Assistant Professor at KLEF Deemed to be University, Hyderabad, India, is a MongoDB Educator and Leader of the MongoDB User Group—Hyderabad. With ten years of teaching experience, he empowers students to gain hands-on experience with MongoDB in their projects. Thanks to his mentorship, during last semester’s Skill Week, 80% of his students earned MongoDB certifications, preparing them for careers in tech. His dedication earned him the 2024 Distinguished Mentor Award from MongoDB. His story shows how educators can use MongoDB to inspire students and prepare them for careers in tech. Tell us about your educational and professional journey and what initially sparked your interest in databases and MongoDB. My educational journey consists of an undergraduate degree from Kakatiya University. Following that, I pursued an M.Tech from Osmania University, where I gained immense knowledge in the landscape of computer science, which aided in laying a strong foundation for my technical expertise. Currently, I am pursuing a PhD from Annamalai University, focusing my research on machine learning. Additionally, qualifying exams like UGC NET and TSET have further strengthened my understanding of databases and why they are a core aspect of developing an application. Over the past ten years, I have gained extensive experience in academia and industry, and I currently serve as an Assistant Professor at KL University, Hyderabad. 
My interest in databases stems from their universal presence in almost every application. Early on, when I first dabbled in the world of databases, I was intrigued by how efficient storage mechanisms directly affect the speed and accuracy of data retrieval and the other operations an application performs on data. While working with relational databases, I encountered challenges related to fixed schemas—certain data insertions were not feasible due to strict structural constraints or the unavailability of data types corresponding to spatial and vector data. This led me to delve into MongoDB, where the flexible JSON-based document structure provided a more scalable and dynamic approach to data management, with MongoDB Atlas fitting today's rapidly evolving cloud computing landscape. What courses related to databases and MongoDB are you currently teaching? At my university, I teach database-related courses across different levels. As a core course, I teach Database Management Systems (DBMS), covering database fundamentals and operations. I also handle Python Full Stack, MERN Stack, and Java Full Stack Development, integrating MongoDB with modern frameworks. Additionally, I conduct MongoDB certification courses, helping students gain industry-standard knowledge in database technologies. What motivated you to incorporate MongoDB into your curriculum? My journey with databases began when I realized the challenges of relational databases like SQL, with their rigid schemas and complex queries. This led me to explore MongoDB, which offers a more flexible, user-friendly approach to data management. I actively advocate for adding MongoDB to the college curriculum to prepare students for the growing demand for NoSQL technologies. By teaching MongoDB alongside relational databases, I aim to help students build practical skills to design and manage modern, dynamic applications. 
You have successfully built an active student community around MongoDB on your campus. Can you share some insights into how you achieved this and the impact it's had on students? Building an active student community around MongoDB on campus has been not only an exciting journey, but a very enlightening one as well. I concentrated on a step-by-step teaching approach, beginning with the basics and slowly working up to more complex topics. This helped students build a strong foundation while feeling confident about what they were learning. One of the main ways I involved students was by incorporating MongoDB into different courses, where they could work on hands-on projects that required using the database. I also encouraged students to earn certifications like Developer and DBA, which gave them valuable credentials and recognition of their MongoDB skills. Furthermore, I arranged group discussions where students brainstormed, solved problems together, and stayed actively engaged in their learning. On top of that, I held week-long special training sessions each semester, called “Skill Weeks,” to make sure everyone was aware of ongoing MongoDB advancements while also teaching newcomers. How do you design your course content to integrate MongoDB in a way that engages students and ensures practical learning experiences? I often begin by building a strong foundation, going over fundamental concepts such as document-oriented storage, collections, indexing, and CRUD operations to ensure students grasp the essentials. Once a solid base has been established, I introduce advanced concepts like aggregation pipelines, indexing, query optimization techniques, and sharding, while putting the utmost emphasis on hands-on learning with real datasets to further fortify understanding. 
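For readers unfamiliar with the aggregation pipelines mentioned above, here is a toy in-memory walk-through of the two stages students typically meet first, `$match` and `$group`. The classroom dataset and field names are invented for the exercise; in a real course the equivalent pipeline would run via `collection.aggregate`.

```python
# Invented classroom dataset: one document per exam result.
results = [
    {"course": "DBMS", "student": "asha",  "score": 88},
    {"course": "DBMS", "student": "ravi",  "score": 72},
    {"course": "MERN", "student": "asha",  "score": 95},
    {"course": "DBMS", "student": "meena", "score": 49},
]

# Stage 1 -- $match: keep passing results only,
# like {"$match": {"score": {"$gte": 50}}}.
passed = [r for r in results if r["score"] >= 50]

# Stage 2 -- $group: average score per course,
# like {"$group": {"_id": "$course", "avg": {"$avg": "$score"}}}.
by_course = {}
for r in passed:
    by_course.setdefault(r["course"], []).append(r["score"])
avg_by_course = {c: sum(s) / len(s) for c, s in by_course.items()}

print(avg_by_course)  # {'DBMS': 80.0, 'MERN': 95.0}
```

Showing the pure-Python equivalent next to each stage is one way to demystify the pipeline syntax before students run it against a real cluster.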
I also incorporate real-world projects where students design and build complete applications that integrate MongoDB in the backend, thereby simulating industry use cases to strengthen their problem-solving in a professional environment. As for the certification component, I include model quizzes, practice tests, and assignments to evaluate their knowledge and ensure they are job-ready with a validated skill set. How has MongoDB supported you in enhancing your teaching methodologies and upskilling your students? The curated learning paths and comprehensive resources of MongoDB Academia , such as PowerPoint presentations for educators, have best supported me and my teaching methods. The platform offers a wide variety of materials, covering basic to advanced concepts, often accompanied by visual aids that make complex concepts easier to grasp. The learning paths also provide a set of practice questions for students to reinforce their understanding. Moreover, the availability of the Atlas free cluster allows students to experiment with real-world database operations at no cost, providing practical experience. These resources offered by MongoDB have significantly reshaped my pedagogy to better accommodate practical elements. Have you conducted any projects or studies on students' experiences with MongoDB? If so, what key insights have you discovered, and how can they benefit other educators? Through surveys, Q&A sessions, and project reviews, I have identified students' strengths and weaknesses in working with MongoDB. Many students find the document-oriented model intuitive and appreciate the flexibility of schema design, but often struggle with optimizing queries, indexing strategies, and understanding aggregation pipelines. These insights have helped me refine and iterate on my teaching style by focusing more on demonstrations, interactive exercises, and explanations targeted at complex topics. 
Other educators can benefit from these conclusions I have arrived at by incorporating regular feedback sessions and adapting their teaching methods to address these gaps. Could you share a memorable experience or success story of a project from your time teaching MongoDB that stands out to you? One of the most memorable experiences from my time teaching MongoDB was during Skill Week, where 80% of my students earned MongoDB certifications. The structured pedagogy I implemented—combining hands-on learning, real-world projects, and guided problem-solving—played a crucial role in their success. This success was further recognized when I received an award last semester for my contributions to MongoDB education, underscoring the impact of my teaching approach. Seeing students excel, gain industry-recognized skills, and confidently apply MongoDB skills in their careers has been incredibly rewarding for me. How has your role as a MongoDB Educator impacted your professional growth and the growth of the student community at your university? I have been able to demonstrate the power of non-relational databases, breaking the initial stigma about NoSQL databases and helping students see the advantages of flexible, scalable data models. This journey has also helped me secure my position as a subject matter expert, allowing me to lead discussions on advanced database concepts and real-world applications. As a MongoDB User Group (MUG) leader, I have built a global network, collaborating with educators, developers, and industry professionals. Additionally, conducting mentoring workshops at other colleges has strengthened my leadership skills while expanding MongoDB awareness beyond the scope of my institution. Most importantly, this role has provided students with direct industry exposure, which I believe plays a pivotal role in the growth of their careers. 
What advice would you give to educators who are considering integrating MongoDB into their courses to ensure a successful and impactful learning experience for students?

My advice is to build on students' existing knowledge while gradually introducing NoSQL concepts. Since most students start with relational databases, it's important to first highlight the key differences between SQL and NoSQL and to explain when to use each. Given that students are generally more comfortable with SQL (often the first database they work with), introducing MongoDB as a schema-less, document-oriented database makes the transition smoother. Once the basics are covered, progressing to advanced topics like data modeling, aggregation pipelines, and indexing ensures students gain a deeper understanding of database optimization and performance tuning. By adopting this structured approach, educators can provide a comprehensive, real-world learning experience that prepares students for industry use cases.

To learn more, apply to the MongoDB for Educators program and explore free resources for educators crafted by MongoDB experts to prepare learners with in-demand database skills and knowledge.

May 5, 2025

Announcing the MongoDB MCP Server

Today, MongoDB is pleased to share the MongoDB Model Context Protocol (MCP) Server in public preview. The MongoDB MCP Server enables AI-powered development by connecting MongoDB deployments—whether they're on MongoDB Atlas, MongoDB Community Edition, or MongoDB Enterprise Advanced—to MCP-supported clients like Windsurf, Cursor, GitHub Copilot in Visual Studio Code, and Anthropic's Claude. Using MCP as the two-way communication protocol, the MongoDB MCP Server makes it easy to interact with your data using natural language and perform database operations with your favorite agentic AI tools, assistants, and platforms. Originally introduced by Anthropic, the Model Context Protocol has been gaining traction as an open standard for connecting AI agents and diverse data systems. The growing popularity of MCP comes at a pivotal moment as LLMs and agentic AI are reshaping how we build and interact with applications. MCP unlocks new levels of integrated functionality, ensuring that the LLMs behind agentic workflows have access to the most recent and contextually relevant information. And it makes it easier than ever for developers to take advantage of the fast-growing and fast-changing ecosystem of AI technologies.

The MongoDB MCP Server: Connecting to the broader AI ecosystem

The MongoDB MCP Server enables developer tools with MCP clients to interact directly with a MongoDB database and to handle a range of administrative tasks, such as managing cluster resources, as well as data-related operations like querying and indexing.

Figure 1. Overview of MongoDB MCP Server integration with MCP components.

Forget separate tools, custom integrations, and manual querying. With the MongoDB MCP Server, developers can leverage the intelligence of LLMs to perform crucial database tasks directly within their development environments, with access to the most recent and contextually relevant data.
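For context on how a client reaches the server: MCP clients are typically pointed at a server through a small JSON configuration entry. The snippet below is an illustrative sketch only; the exact keys, package name, and flags vary by client and server version, so check the MongoDB MCP Server README before relying on it.

```json
{
  "mcpServers": {
    "MongoDB": {
      "command": "npx",
      "args": [
        "-y",
        "mongodb-mcp-server",
        "--connectionString",
        "mongodb://localhost:27017/?directConnection=true"
      ]
    }
  }
}
```

With an entry like this in place, the client launches the server on demand and routes natural-language database requests through it.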
The MongoDB MCP Server enables:

- Effortless data exploration: Ask your AI to "show the schema of the 'users' collection" or "find the most active users in the collection."
- Streamlined database management: Use natural language to perform database administration tasks like "create a new database user with read-only access" or "list the current network access rules."
- Context-aware code generation: Describe the data you need, and let your AI generate the MongoDB queries and even the application code to interact with it.

AI-powered software development with Windsurf and MongoDB

To make it easier for developers everywhere to use the MongoDB MCP Server right away, we've made it available out of the box in Windsurf, an AI code editor used by over a million developers and counting. Developers building with MongoDB can leverage Windsurf's agentic AI capabilities to streamline their workflows and accelerate application development. "MongoDB is aligned with Windsurf's mission of empowering everyone to continuously dream bigger," said Rohan Phadte, Product Engineer at Windsurf. "Through our integration with the MongoDB MCP Server, we're helping innovators to create, transform, and disrupt industries with software in this new age of development. Developers can get started today by accessing the MongoDB MCP Server through our official server templates, and take advantage of the combined power of Windsurf and MongoDB for building their next project."

Figure 2. Windsurf MCP server templates.

The MongoDB MCP Server in action

Check out the videos below to see how to use the MongoDB MCP Server with popular tools like Claude, Visual Studio Code, and Windsurf.

Using the MongoDB MCP Server for data exploration

With an AI agent capable of directly accessing and exploring your database, guided by natural language prompts, you can minimize context switching and stay in the flow of your work.
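To make the data-exploration idea concrete, here is a hypothetical sketch of the kind of aggregation an agent might generate for a prompt like "find the most active users." All collection and field names are invented; a plain-Python rendering shows what the pipeline computes, so the sketch runs without a live database.

```python
from collections import defaultdict

# Invented sample documents standing in for an "events" collection.
events = [
    {"user": "ana", "type": "login"},
    {"user": "bob", "type": "login"},
    {"user": "ana", "type": "purchase"},
    {"user": "ana", "type": "login"},
]

# The pipeline an agent might emit for db.events.aggregate(...):
pipeline = [
    {"$group": {"_id": "$user", "count": {"$sum": 1}}},  # events per user
    {"$sort": {"count": -1}},                            # most active first
    {"$limit": 5},
]

# Plain-Python equivalent of what that pipeline computes:
counts = defaultdict(int)
for doc in events:
    counts[doc["user"]] += 1
most_active = sorted(
    ({"_id": u, "count": c} for u, c in counts.items()),
    key=lambda d: -d["count"],
)[:5]
print(most_active)  # [{'_id': 'ana', 'count': 3}, {'_id': 'bob', 'count': 1}]
```

The value of an MCP-aware agent is that it can inspect the real schema first and emit a pipeline like this against the actual collection, rather than guessing at field names.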
Using the MongoDB MCP Server for database management

The MongoDB MCP Server enables AI agents to interact directly with MongoDB Atlas or self-managed MongoDB databases, making it easier to automate manual tasks around cluster and user management.

Using the MongoDB MCP Server for code generation

Using LLMs and code agents has become a core part of developers' workflows. Providing context, such as schemas and data structures, enables more accurate code generation, reducing hallucinations and enhancing agent capabilities.

The future of software development is agentic

The MongoDB MCP Server is a step forward in MongoDB's mission to empower developers with advanced technologies to effortlessly bring bold ideas to life. By providing an official MCP server release, we're meeting developers in the workflows and tools they rely on to build the future on MongoDB. As MCP adoption continues to gain momentum, we'll continue to listen to developer feedback and prioritize enhancements to our MCP implementation. If you have input on the MongoDB MCP Server, please create an issue on GitHub. And to stay abreast of the latest news and releases from MongoDB, check out the MongoDB blog. Check out the MongoDB MCP Server on GitHub and give it a try: see how it can accelerate your development workflow!

May 1, 2025

The Best Solutions Architects Work at MongoDB

I originally wrote this article after five years of working at MongoDB. Seven years later, the post is still as accurate as the day it was published. It highlights the broad range of technology and sales skills required to succeed as a Solutions Architect (SA) at MongoDB, skills that led me to claim (and still claim) that you have to be the best of the best to be an SA here. However, MongoDB has changed, the technology landscape has changed, the Solutions Architect role has changed, and I have changed along with them. Much of this change has been pioneered by the SA organization. Before looking at how SAs pioneered some of MongoDB's evolution, let's look at some of the biggest changes at MongoDB since I wrote this article:

- MongoDB Atlas: Atlas has become the platform through which most organizations use MongoDB for production workloads.
- Unified database platform: MongoDB has transitioned from a database company to a full-featured unified database platform, expanding to include full-text and vector search, stream processing, and time series data, enabling a single platform and query language to support all the data operations needed to build modern applications.
- Generative AI: MongoDB has become the preferred way many organizations build AI applications and use AI to rapidly modernize legacy applications and eliminate technical debt.

In terms of technical knowledge, MongoDB SAs now need a detailed understanding of everything I listed in the original article, plus:

- A technical understanding of all three major cloud providers, cloud security, and how to securely deploy applications in Atlas. SAs are strongly encouraged to obtain or expand their cloud certifications during their MongoDB tenure.
- An understanding of how to build applications using all the capabilities of the unified database platform beyond the core database.
- An understanding of how to build AI applications using LLMs and surrounding GenAI technologies.
The MAAP reference architectures provide as much guidance to our customers as to new MongoDB SAs. It can be a lot, but it is also a great way to expand your technical expertise, grow your career, and keep your skills current. One perspective I failed to capture in the original article is how central the Solutions Consulting organization has been in driving the evolution of MongoDB. This is probably due to the unique position SAs occupy at the intersection of customer sales and delivery conversations, product marketing feedback, and technology. Consider MongoDB's support for generative AI, exemplified by the Modernization Factory and MAAP programs. The kernel of the idea that led to the Modernization Factory came from one of my colleagues. After some hard internal selling, the SA organization held a hackathon to brainstorm and conceptualize what types of application modernization were possible with GenAI. (For those curious, the hackathon was set up as North America versus Europe, with each region building four applications, and was judged by a cross-functional team that included many members of MongoDB's executive staff, among them our CEO, Dev Ittycheria. North America won.) The power of the ideas generated by the hackathon led to the first Modernization Factory projects, and those successes led to the expansion of the program. The genesis of the MAAP program followed a similar trajectory: internal hackathons expanded our understanding of how MongoDB could serve as the platform for GenAI applications and produced a set of compelling customer demos that have been used to show customers the value of MongoDB as the data platform for AI applications. This initial set of ideas and proofs of concept was then formalized by MongoDB's product and partner teams into MAAP.
The one thing that has always been constant at MongoDB is change, and I'm certain the Solutions Consulting organization will continue to play a role in pioneering it.

The best solutions architects work at MongoDB

Despite the bravado in the title, the purpose of this article is not to say that MongoDB Solutions Architects (SAs) are better than those working at other organizations. Rather, it argues that the unique challenges MongoDB SAs face imply that the successful ones are some of the best in the business. This assertion derives from the unique challenges of supporting both MongoDB customers and the MongoDB sales organization, and from the breadth and depth of skills and knowledge required to be successful. To see why, let's explore the role of an SA at MongoDB and the wide range of skills a Solutions Architect must master. A MongoDB SA (sometimes called a Sales Engineer in other organizations) is an engineer who supports the sales organization. The role is multi-faceted. A Solutions Architect must have:

- In-depth technical knowledge, to understand a customer's technical challenges and articulate how MongoDB addresses them
- Communication skills, to present technical concepts clearly and concisely while tactfully dealing with skeptics and those more familiar with other technologies
- Sales skills, to engage a prospect and learn their business challenges and the technical capabilities required to address them
- Design and troubleshooting skills, to help prospects design solutions to complex problems and get them back on track when things go wrong

The description above may make the MongoDB Solutions Architect role sound like similar roles elsewhere, but unique features of MongoDB (the product) and its competitive situation make this role extremely challenging. We will explore this in the sections below.
Technology

While the ease with which developers can adopt MongoDB has been a strength and a major factor in its success, MongoDB is a complex product. Presenting MongoDB, answering questions, brainstorming designs, and helping resolve problems require a wide range of knowledge, including:

- The MongoDB query language
- Application development with MongoDB's drivers in 10+ programming languages
- Single and multi-data-center architectures for high availability
- Tuning MongoDB to achieve the required level of performance, read consistency, and write durability
- Scaling MongoDB to manage terabytes of data and thousands of queries per second
- Estimating the size of a cluster (or the cloud deployment costs) required to meet application requirements
- Best practices for MongoDB schema design, and how to design the best MongoDB schema for a given application
- MongoDB Enterprise operations tools: Ops Manager, Compass, etc.
- Atlas: MongoDB's database-as-a-service offering
- MongoDB's various connectors: BI/Atlas SQL, Spark, and Kafka
- Migration strategies from RDBMSs (and other databases) to MongoDB, and Relational Migrator

This is a lot to know, and there is a lot of complexity. Beyond the core knowledge listed above, understanding MongoDB's internal workings is essential when designing applications with high performance and scalability requirements. Therefore, most Solutions Architects understand MongoDB's internal architecture, such as how the WiredTiger storage engine works or how a MongoDB cluster manages connections. To make the SA role even more challenging, organizations often choose MongoDB after failing with some other technology. (Maybe their RDBMS didn't scale, or it was too difficult to extend to handle new sources of data, or Hadoop processing did not meet real-time requirements, or some other NoSQL solution did not provide the required query expressiveness and secondary indexes.)
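One item from the knowledge list above, estimating cluster size, often starts as back-of-the-envelope arithmetic before any formal sizing exercise. The sketch below uses entirely hypothetical numbers and a deliberately simplified rule of thumb (indexes plus the hot working set should fit in cache); real sizing also weighs CPU, IOPS, replication, and growth.

```python
# Back-of-the-envelope cluster sizing sketch. All inputs are hypothetical.
doc_count = 500_000_000          # expected number of documents
avg_doc_bytes = 2 * 1024         # average BSON document size
index_bytes_per_doc = 200        # rough per-document index overhead
hot_fraction = 0.10              # share of data read frequently

data_bytes = doc_count * avg_doc_bytes
index_bytes = doc_count * index_bytes_per_doc

# Simplified rule of thumb: indexes plus the frequently accessed
# slice of the data should fit in the storage engine's cache.
working_set_gb = (index_bytes + hot_fraction * data_bytes) / 1024**3

print(f"data: {data_bytes / 1024**4:.1f} TiB, "
      f"working set: {working_set_gb:.0f} GiB")
# prints: data: 0.9 TiB, working set: 188 GiB
```

A sketch like this gives a first estimate of the RAM tier to discuss with a customer; an actual sizing engagement would refine every input against measured workloads.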
Because organizations often adopt MongoDB after other technologies have failed them, MongoDB is often used for bleeding-edge applications that have never been built before. One of the roles of an SA is to understand the application requirements and help the application team arrive at an initial design that will ensure their success 1 . It is probably obvious to experienced SAs, but SAs also need to understand the capabilities, strengths, and weaknesses of all competing and adjacent solutions. MongoDB's biggest competitors are Oracle, Amazon, and Microsoft, all of whom are constantly evolving their product offerings and marketing strategies. An SA must keep their knowledge up to date as the market evolves.

Communication

Being a great technologist is not enough. An SA spends at least as much time communicating with customers as working with technology. Communication sometimes takes the form of a standard presentation or demo, but more often it entails detailed technical conversations about how MongoDB works or how MongoDB can be used to address a particular problem. Concise technical explanations that address customer questions in language tailored to their situation and frame of reference are the hallmark of an SA. MongoDB SAs have to be comfortable communicating with a wide range of people, not just development teams. They must engage operations teams, line-of-business stakeholders, architects, and technology executives in sales discovery conversations and present the technical aspects of MongoDB that most concern each audience at the appropriate level of detail. For example, an SA must be able to give technology executives an intuitive feel for why their development teams will be significantly more productive with MongoDB, or why they will be able to deploy a solution that meets scalability and performance requirements unattainable with previous technology approaches.
Similarly, an SA must learn an operations team's unique challenges in managing MongoDB and describe how tools like Ops Manager and Atlas address those requirements. Public speaking skills are also essential: Solutions Architects deliver webinars, speak at conferences, write blog posts, and lead discussions at MongoDB User Groups (MUGs).

Sales

An SA is a member of the sales organization, and "selling" is a big part of the role. Selling involves many aspects. First, SAs assist MongoDB Account Executives with discovery and qualification. They engage the customer in conversations to understand their current problems, their desired solution, the business benefits of that solution, the technical capabilities required to implement it, and how they'll measure success. After every customer conversation, SAs work with their Account Executives to refine their understanding of the customer's situation and identify information to gather at future meetings. Once the required technical capabilities are understood, it is the SA's role to lead the sales activities that prove to the customer that (1) MongoDB meets all their required capabilities and (2) MongoDB meets them better than competing solutions. Most of the time, this is accomplished via customer conversations, presentations, demonstrations, and design brainstorming meetings. Finally, customers sometimes want to test or validate that MongoDB will meet their required technical capabilities. This often takes the form of a proof of concept (POC) that might test MongoDB's performance or scalability, the ease of managing MongoDB clusters with its operations tools, or whether MongoDB's BI Connector and Atlas SQL provide seamless connectivity with industry-standard BI tools such as Tableau. SAs lead these POC efforts: they work with prospects to define and document the scope and success criteria, and they work with the prospect throughout the POC to ensure success.
Design and troubleshooting

I alluded to this in the "Technology" section: helping prospects with creative problem solving distinguishes SAs at MongoDB. Organizations will choose MongoDB if they believe they understand how it will help them succeed. Imparting this understanding, a big part of the Solutions Architect's role, is typically done by helping an organization through some of its thornier design challenges and implementation decisions. Organizations will choose MongoDB when they understand the framework of a good MongoDB design for their use case and believe all their design requirements will be met. Designing a solution is not a yes-or-no question that can be answered from the documentation; the answer is found through deep technical knowledge, careful analysis, and trade-offs among many competing requirements. The best answer often emerges from a collaborative process with the customer. SAs lead these customer discussions, research solutions to the most challenging technical problems, and help craft the resulting design. Solutions Architects are also a source of internal innovation at MongoDB. Because they spend a significant amount of time speaking with customers, they are the first to realize when marketing or technical material is not resonating or is simply difficult to understand. The pressure of short timelines and the desire to succeed often produce innovative messaging and slides that MongoDB's Product Marketing organization frequently adopts. Similar innovation occurs with MongoDB feature requests and enhancements. SAs continually work with customers to help them solve problems, and they quickly identify areas where enhancements to MongoDB would provide significant value. Identifying these areas, along with specific recommendations from SAs on which product enhancements are needed, has played a big role in shaping the feature set of future MongoDB releases.
Project management

Lastly, SAs often support a number of Account Executives and work on several dozen sales opportunities per quarter. This means SAs are working a large number of opportunities simultaneously and must be highly organized to ensure they are prepared for each activity and complete every follow-up item in a timely manner. It is not possible for an SA manager to track or completely understand every sales opportunity, so SAs must be self-motivated and manage their own activities.

Summary

Solutions Architecture at MongoDB is a challenging and rewarding role. The wide range of technical knowledge, plus the sales and communication skills required to be successful, is common to SA roles. When you combine this with the need for SAs to design innovative solutions to complex (and often previously unsolvable) problems, MongoDB SAs have the set of skills and the track record of success that make them the "best" in the business. If you want to join the best, check out the open roles within Solutions Consulting at MongoDB.

About the author

Jay Runkel is a Distinguished Solutions Architect at MongoDB, where he has spent the past 12 years helping organizations harness the power of modern data solutions. For the past two years, he has served as the North American technical lead for MongoDB's Modernization Factory program, an initiative that leverages generative AI to accelerate the transformation of legacy applications to MongoDB. Prior to joining MongoDB, Jay was a Principal Technologist at MarkLogic, where he worked with financial services, healthcare, and media companies to build operational systems for analytics and custom publishing. His career spans a wide range of technology domains, including encryption asset management, automated underwriting, product information management, and CRM solutions. Jay holds a B.S. in Applied Mathematics from Carnegie Mellon University and a master's in Computer Science from the University of Michigan.
1 My favorite part of the job is to get locked in a conference room with a development team and whiteboard for four hours, brainstorming the MongoDB solution and design for a particular use case. The most valuable end product of such a session is not the design, but the development team's belief that they will be successful with MongoDB and that the development process will be easier than they expected.

April 30, 2025