Romina Lopez Carranza


Multi-Agentic Systems in Industry with XMPro & MongoDB Atlas

In 2025, agentic AI applications are no longer pet projects—companies around the world are investing in software to incorporate AI agents into their business workflows. The most common use of an AI agent is to assist with research analysis or with writing code. LangChain’s recent survey of over 1,000 professionals across multiple industries showed that over 51% have already deployed agents in production, with 60% using agents for research and summarization tasks.

However, leveraging AI agents for tasks more complex than research and summarization—and implementing them in industrial environments like manufacturing—presents certain challenges. For example, as new technology is introduced into already established companies, brownfield deployments become increasingly common: newly installed and configured hardware or software must coexist with legacy IT systems. And while it is easy to run an AI agent in a sandboxed environment, it is much harder to integrate agents with machines and operational technology (OT) systems that speak industrial protocols like Modbus, PROFINET, and BACnet, due to existing legacy infrastructure and accumulated tech debt.

To ensure governance and security in industrial environments, data security policies, regulatory compliance, and governance models are essential. Agent profiles with defined goals, rules, responsibilities, and constraints must be established before agents are deployed. Additionally, addressing real-world constraints—like LLM latency—and strategically selecting use cases and database providers can enhance AI agent effectiveness and optimize response times.

What’s more, the successful implementation of AI agents in industrial environments requires a number of foundational elements, including:

- Flexible data storage and scalability: An agent requires different types of data to function, such as an agent profile, short-term memory, and long-term memory. Industrial AI agents require even more types of data, such as time series data from sensors and PLCs. They need efficient, scalable data storage that adapts to the dynamic needs of the environment (see the storage sketch later in this section).
- Continuous monitoring and analysis: An agent deployed in a manufacturing environment requires real-time observability of the ever-changing data generated by the factory. It also needs to keep humans in the loop for any critical decisions that might affect production.
- High availability: Industrial environments demand near-zero downtime, making system resilience and failover capabilities essential.

XMPro joins forces with MongoDB

To address these challenges, we are pleased to announce XMPro’s partnership with MongoDB. XMPro offers APEX AI, a low-code control room for creating and managing advanced AI agents for industrial applications. To ensure seamless control over these autonomous agents, XMPro APEX serves as the command center for configuring, monitoring, and orchestrating agent activities, empowering operators to remain in control.

Figure 1. XMPro APEX AI platform working with MongoDB Atlas.

APEX AI, combined with MongoDB Atlas and MongoDB Atlas Vector Search, addresses a variety of challenges faced by developers when building AI agents for industrial environments. XMPro complements this by seamlessly integrating with industrial equipment such as SCADA systems, PLCs, IoT sensors, and ERPs, enabling continuous monitoring of operations. This integration ensures real-time data acquisition, contextualization, and advanced analytics, transforming raw data into actionable insights.
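To illustrate the flexible storage requirement above, here is a minimal sketch of how an industrial agent's different data types might be persisted in MongoDB Atlas. The connection string, database, collection names, and document fields are illustrative assumptions, not XMPro's actual schema.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

# Hypothetical connection string and database name.
db = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["industrial_agents"]

# Agent profile: goals, rules, and constraints defined before deployment.
db.agent_profiles.insert_one({
    "agent_id": "predictive_maintenance_01",
    "goals": ["minimize unplanned downtime"],
    "rules": ["escalate to a human before stopping a production line"],
    "constraints": {"max_actions_per_hour": 10},
})

# Short-term memory: recent reasoning and conversation context.
db.short_term_memory.insert_one({
    "agent_id": "predictive_maintenance_01",
    "role": "assistant",
    "content": "Vibration on press 4 is trending upward; recommending an inspection.",
    "timestamp": datetime.now(timezone.utc),
})

# Telemetry from sensors and PLCs lands in the same database with no schema redesign.
db.telemetry.insert_one({
    "machine_id": "press_4",
    "sensor": "vibration",
    "value_mm_s": 4.7,
    "timestamp": datetime.now(timezone.utc),
})
```

Because each document type has its own shape, new fields or new memory types can be added later without migrations, which is the flexibility the bullet above refers to.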
XMPro’s capabilities include condition monitoring, predictive maintenance, anomaly detection, and process optimization, which help reduce downtime and improve operational efficiency while maintaining compliance and safety standards.

XMPro’s industrial AI agents rely on memory persistence for contextual decision-making. MongoDB Atlas acts as the database for storing and retrieving agent memories. Using a flexible document database for agentic memories enables agents to store different types of data, such as conversational logs, state transitions, and telemetry data, without requiring schema redesign.

MongoDB Atlas Vector Search gives APEX AI agents a retrieval-augmented generation (RAG) tool, which helps reduce LLM hallucinations. This integration allows agents to access and retrieve verified data, grounding their responses. Having the database and vector search together in MongoDB Atlas also helps reduce agent latency and speeds up development.

In APEX AI-enabled multi-agent systems working together in an industrial setting, these context-aware agents can work in tandem, retrieving relevant knowledge stored in MongoDB Atlas to enable meaningful collaboration and better decision-making.

XMPro APEX AI also leverages MongoDB Atlas’s robust security and high availability to ensure that agents can securely access data in real time. Features such as role-based access controls, network isolation, and encryption in transit and at rest are key reasons this agent-based AI solution is well suited to securing industrial production environments. MongoDB’s high availability and horizontal scalability ensure seamless data access as organizations scale up their APEX AI deployments.

Unlocking the future of AI in industrial automation

XMPro APEX AI and MongoDB Atlas are a winning combination that paves the way for a new era of industrial automation. By tackling the core challenges of deploying AI agents in industrial environments, we’re enabling organizations to deploy robust, intelligent, and autonomous industrial AI agents at scale.

To learn more about MongoDB’s role in the manufacturing industry, please visit our manufacturing and automotive webpage. Ready to boost your MongoDB skills? Head over to our MongoDB Atlas Learning Hub to start learning today.

April 29, 2025

Why MongoDB is the Perfect Fit for a Unified Namespace

Smart manufacturing is transforming the industrial world by combining IoT, AI, and cloud technologies to create connected, data-driven production environments. Manufacturers embracing this shift are seeing real, measurable benefits: Deloitte reports that smart factory initiatives can boost manufacturing productivity by up to 12% and improve overall equipment effectiveness by up to 20%.

But achieving these gains isn’t always straightforward. Many manufacturers still face the challenge of siloed data and legacy systems, making it difficult to get a real-time, holistic view of operations. Shop floor data, enterprise resource planning (ERP) systems, manufacturing execution system (MES) platforms, and other sources often operate in isolation, limiting the potential for optimization.

The Unified Namespace (UNS) model, which provides a single source of truth for all operational data, is a game-changing approach that helps unify these siloed systems into a cohesive ecosystem. MongoDB, with its powerful document-based model, is perfectly suited to be the backbone of a UNS, acting as a flexible, scalable, highly available, real-time repository that can seamlessly integrate and manage complex manufacturing data. In this blog post, we’ll explore how MongoDB’s architecture and capabilities align with the needs of a UNS, and how our "Leafy Factory" demo serves as a strong proof point of this alignment.

Understanding the Unified Namespace and its importance in manufacturing

A Unified Namespace is an architecture in which production data from across an organization is consolidated into one central data repository. In a manufacturing setup, a UNS enables the integration of diverse sources like ERP for business operations, MES for production monitoring, and real-time shop floor data. This centralized model provides a single, consistent view of data, allowing teams across the organization to access reliable information for decision-making. By unifying data from various systems, a UNS makes it significantly easier to connect disparate systems and ensures that data can be shared seamlessly across platforms, reducing complexity and integration overhead.

Unlike the traditional automation pyramid, in which information flows hierarchically from sensors up through control systems, MES, and finally to ERP, the UNS breaks down these layers. It creates a flat, real-time data model that allows every system to access and contribute to a shared source of truth—eliminating delays, redundancies, and disconnects between layers.

One of the most impactful advantages of a UNS is real-time data visibility. By centralizing live data streams from the production floor, it provides stakeholders—from operators to executives—with up-to-the-second insights. This immediacy empowers teams to make informed decisions quickly, respond to issues as they arise, and continuously optimize operations.

And because the UNS consolidates all data into one namespace, it also unlocks cross-functional insights. Teams can correlate metrics across departmental boundaries—for instance, comparing machine uptime with production targets and financial performance. This integrated perspective enables more strategic planning, better alignment across departments, and continuous improvement initiatives grounded in data.

The importance of flexible data to UNS success

A key prerequisite for a successful UNS implementation is high adaptability.
The model must be capable of easily incorporating new data sources, machines, or production lines without requiring a complete overhaul of the data architecture. This flexibility ensures that as operations evolve and scale, the UNS can grow with them—maintaining a unified and responsive data environment.

While the UNS itself does not perform functions like predictive maintenance or cost optimization, it serves as the foundational data layer that enables such advanced applications. By centralizing and contextualizing historical and real-time data on machinery, materials, and production workflows, a UNS provides the essential infrastructure for building IoT-based solutions. With this data in place, manufacturers can develop predictive maintenance strategies, detect anomalies, and optimize costs—leading to reduced downtime, better resource utilization, and smarter decision-making.

Together, these capabilities make the Unified Namespace a foundational element for smart manufacturing—bridging systems, enhancing visibility, and enabling data-driven transformation at scale.

Figure 1. The automation pyramid versus a Unified Namespace.

MongoDB as the ideal central repository for a UNS model

The requirements of a UNS model map directly to MongoDB's strengths, making it an ideal choice for manufacturing environments seeking to unify their data. Manufacturing environments routinely deal with highly variable and constantly evolving data structures, ranging from raw machine sensor data to structured ERP records. This diversity presents a challenge for traditional relational databases, which rely on rigid schemas that are difficult to adapt. MongoDB, with its document-oriented design, offers a more flexible solution. By storing data in JSON-like structures, it allows manufacturers to easily accommodate changes—such as adding new sensors or modifying machine attributes—without the need to redesign a fixed schema.

Another key requirement in smart manufacturing is the ability to process data in real time. With streaming data from multiple sources flowing into a UNS, manufacturers can maintain up-to-date information that supports timely interventions and data-driven decision-making. MongoDB supports real-time data ingestion through technologies like Kafka, change streams, and MQTT. This makes it simple to capture live data from shop floor machines directly into a time series collection and synchronize it with information from ERP and MES systems.

Live shop floor data, ERP, and MES information in one database—combined with MongoDB’s powerful querying, aggregation, and analytics capabilities—allows teams to analyze and correlate diverse data streams in one platform. For instance, production teams can cross-reference MES quality metrics with sensor data to uncover patterns that lead to improved quality control. Finance teams can blend ERP cost data with MES output to gain a more comprehensive view of operational efficiency and cost drivers.

What’s more, MongoDB’s distributed architecture supports horizontal scaling, which is crucial for large manufacturing operations where data volumes grow quickly. As more machines and production lines are brought online, MongoDB clusters can be expanded seamlessly, ensuring the UNS remains performant and responsive under increasing load. And by serving as a central repository for historical machine sensor data, a UNS allows manufacturers to analyze long-term patterns, detect anomalies, and anticipate maintenance needs.
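To make the MQTT ingestion path described above concrete, below is a minimal sketch of subscribing to an MQTT topic and writing readings into a MongoDB time series collection. The broker address, topic hierarchy, and field names are assumptions for illustration; any MQTT client library would work equally well.

```python
import json
from datetime import datetime, timezone

import paho.mqtt.client as mqtt
from pymongo import MongoClient

db = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["uns"]

# Time series collection for raw shop floor readings (created once).
if "shopfloor_readings" not in db.list_collection_names():
    db.create_collection(
        "shopfloor_readings",
        timeseries={"timeField": "ts", "metaField": "source", "granularity": "seconds"},
    )

def on_connect(client, userdata, flags, reason_code, properties):
    # Topics are assumed to follow a site/area/line/machine/sensor hierarchy.
    client.subscribe("factory/+/+/+/temperature")

def on_message(client, userdata, msg):
    payload = json.loads(msg.payload)
    db.shopfloor_readings.insert_one({
        "ts": datetime.now(timezone.utc),
        "source": {"topic": msg.topic},
        "value": payload["value"],
    })

mqtt_client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt 2.x
mqtt_client.on_connect = on_connect
mqtt_client.on_message = on_message
mqtt_client.connect("broker.example.local", 1883)  # hypothetical broker
mqtt_client.loop_forever()
```

Keeping the MQTT topic path in the metaField means queries and aggregations can later filter by machine or production line without reshaping the stored documents.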
This approach helps reduce unplanned downtime, optimize maintenance schedules, and ultimately lower operational costs. However, with a UNS acting as a centralized data hub, high availability becomes critical—since any failure could disrupt the entire data ecosystem. MongoDB addresses this with replica sets, which provide high availability, allow maintenance and upgrades without downtime, and eliminate the risk of a single point of failure.

Proof point: Building a UNS on MongoDB in the Leafy Factory

As shown below, MongoDB’s "Leafy Factory" demo offers a hands-on example of how MongoDB serves as an ideal central repository within a UNS for manufacturing. The demo simulates a realistic industrial environment, combining data from SQL-based ERP and MES systems with real-time MQTT streams from shop floor machines. This setup showcases MongoDB’s ability to consolidate and manage diverse data types in a single, accessible, and continuously updated source of truth.

Figure 2. Leafy Factory UNS architecture.

In the demo, SQL data from a simulated MES is ingested into MongoDB. This includes key production planning, monitoring, and quality metrics—all captured in MongoDB’s flexible, document-based JSON format. This structure allows the MES data to remain both organized and accessible for real-time analysis and reporting.

Similarly, SQL-based ERP data (like work orders, material tracking, and cost breakdowns) is integrated using Kafka and the MongoDB Sink connector. The SQL data is captured into Kafka topics using the Debezium change data capture connector, with the SQL database acting as a Kafka producer. The data is then consumed, transformed, and inserted into MongoDB via the MongoDB Sink connector, creating a seamless pipeline between SQL, Kafka, and MongoDB. This approach keeps ERP data continuously synchronized in MongoDB, demonstrating its reliability as a live source of business-critical information.

At the same time, simulated MQTT data streams feed real-time shop floor data into the database, including machine status, quality outputs, and sensor readings like temperature and vibration. MongoDB’s support for real-time ingestion ensures that this data is immediately available, enabling up-to-date machine monitoring and faster response times.

Change streams play a central role by enabling real-time data updates across systems. For instance, when a work order is updated in the ERP system, the change is automatically reflected downstream in the MES and shop floor views—illustrating MongoDB’s capability for bi-directional data flows and live synchronization within a unified data model.

Another critical capability shown in the demo is data contextualization and enrichment. As data enters the UNS, MongoDB enriches it with metadata such as machine ID, operator name, and location according to the ISA-95 structure. This enriched model allows for fine-grained analysis and filtering, which is crucial for generating actionable, cross-functional insights across manufacturing, operations, and business teams.

In sum, the Leafy Factory demo not only validates MongoDB’s technical strengths—like real-time processing, flexible data modeling, and scalable architecture—but also demonstrates how these capabilities come together to support a robust, dynamic, and future-ready Unified Namespace for smart manufacturing.
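As a rough sketch of how a change stream might propagate an ERP work order update into a contextualized view inside the UNS, consider the snippet below. The collection names, fields, and ISA-95 hierarchy values are illustrative assumptions, not the demo's exact schema.

```python
from pymongo import MongoClient

db = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["uns"]

# Watch ERP work order changes and mirror them into a contextualized UNS view.
pipeline = [{"$match": {"operationType": {"$in": ["insert", "update", "replace"]}}}]
with db.erp_work_orders.watch(pipeline, full_document="updateLookup") as stream:
    for change in stream:
        order = change["fullDocument"]
        # Enrich with ISA-95-style context before publishing to the shared namespace.
        db.uns_work_orders.update_one(
            {"work_order_id": order["work_order_id"]},
            {"$set": {
                "status": order.get("status"),
                "context": {
                    "site": "leafy_factory",        # illustrative hierarchy values
                    "area": order.get("area"),
                    "work_center": order.get("work_center"),
                },
            }},
            upsert=True,
        )
```

The same watch-and-enrich pattern can feed MES and shop floor views, which is how a single ERP update shows up downstream without polling.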
Conclusion

A Unified Namespace is essential for modern manufacturing, offering a single, consistent view of data that drives operational efficiency, cross-functional insights, and cost savings. MongoDB, with its flexible schema, real-time data processing, and scalability, is uniquely suited to serve as the central repository in a UNS. The Leafy Factory demo showcases MongoDB’s potential in consolidating ERP, MES, and shop floor data, illustrating how MongoDB can transform manufacturing data management, enabling real-time insights and data-driven decision-making.

In choosing MongoDB as the backbone of a UNS, manufacturers gain a powerful data infrastructure that not only meets current operational needs but also scales with future growth, creating an agile, responsive, and insight-driven manufacturing environment.

Set up the use case shown in this article using our repository. And to learn more about MongoDB’s role in the automotive industry, please visit our manufacturing and automotive webpage.

April 3, 2025

Multi-Agent Collaboration for Manufacturing Operations Optimization

While there are some naysayers across the media landscape who doubt the potential impact of AI innovations, for those of us immersed in implementing AI on a daily basis, there’s wide agreement that its potential is huge and world-altering. It’s now generally accepted that large language models (LLMs) will eventually be able to perform tasks as well as—if not better than—a human. And the size of the potential AI market is truly staggering. Bain’s AI analysis estimates that the total addressable market (TAM) for AI and gen AI-related hardware and software will grow between 40% and 55% annually, reaching between $780 billion and $990 billion by 2027.

This growth is especially relevant to industries like manufacturing, where generative AI can be applied across the value chain. From inventory categorization to product risk assessments, knowledge management, and predictive maintenance strategy generation, AI’s potential to optimize manufacturing operations cannot be overstated.

But in order to realize the transformative economic potential of AI, applications powered by LLMs need to evolve beyond chatbots that leverage retrieval-augmented generation (RAG). Truly transformative AI-powered applications need to be objective-driven, not just responding to user queries but also taking action on behalf of the user. This is crucial in complex manufacturing processes. In other words, they need to act like agents.

Agentic systems, or compound AI systems, are emerging as the next frontier of generative AI applications. These systems consist of one or more AI agents that collaborate with each other and use tools to provide value. An AI agent is a computational entity containing short- and long-term memory, which enables it to provide context to an LLM. It also has access to tools, such as web search and function calling, that enable it to act upon the response from an LLM or provide additional information to the LLM.

Figure 1. Basic components of an agentic system.

An agentic system can have more than one AI agent. In most cases, AI agents may be required to interact with other agents within the same system or with external systems. They’re also expected to engage with humans for feedback or review of outputs from execution steps. AI agents can comprehend the context of outputs from other agents and humans, and change their course of action and next steps accordingly. For example, agents can monitor and optimize various facets of manufacturing operations simultaneously, such as supply chain logistics and production line efficiency.

There are certain benefits to having a multi-agent collaboration system instead of a single agent. You can have each agent customized to do one thing and do it well. For example, one agent can create meeting minutes while another agent writes follow-up emails. The same pattern can be applied to predictive maintenance, with one agent analyzing machine data to find mechanical issues before they occur while another optimizes resource allocation, ensuring materials and labor are utilized efficiently. You can also provision dedicated resources and tools for different agents. For example, one agent can use a model to analyze and transcribe videos while another uses models for natural language processing (NLP) and answering questions about the video.

Figure 2. Multi-agent collaboration system.

MongoDB can act as the memory provider for an agentic system. Conversation history, alongside vector embeddings, can be stored in MongoDB by leveraging the flexible document model. Atlas Vector Search can be used to run semantic search on stored vector embeddings, and MongoDB’s sharding capabilities allow for horizontal scaling without compromising performance.
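As a minimal sketch of this memory pattern, an agent's conversation turns can be written as documents that carry their own embeddings, ready to be indexed by Atlas Vector Search. The collection name, embedding model, and field layout below are assumptions for illustration rather than a prescribed schema.

```python
from datetime import datetime, timezone

from pymongo import MongoClient
from openai import OpenAI  # any embedding provider works; OpenAI is used here for brevity

db = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["agent_memory"]
openai_client = OpenAI()

def remember(agent_id: str, role: str, content: str) -> None:
    """Persist one conversation turn, with its embedding, as agent memory."""
    embedding = openai_client.embeddings.create(
        model="text-embedding-3-small",
        input=content,
    ).data[0].embedding
    db.conversation_history.insert_one({
        "agent_id": agent_id,
        "role": role,
        "content": content,
        "embedding": embedding,  # covered by an Atlas Vector Search index
        "timestamp": datetime.now(timezone.utc),
    })

remember("process_optimization", "assistant",
         "Reduced cutting speed on milling machine 2 after a temperature alert.")
```

Storing the embedding next to the raw text keeps conversation history and retrieval in the same place, so a single vector search query can serve as the agent's long-term recall.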
Our clients across industries have been leveraging MongoDB Atlas for their generative AI use cases, including agentic AI use cases such as Questflow, which is transforming work by using multi-agent AI to handle repetitive tasks in strategic roles. Supported by MiraclePlus and MongoDB Atlas, it enables startups to automate workflows efficiently. As it expands to larger enterprises, it aims to boost AI collaboration and streamline task automation, paving the way for seamless human-AI integration.

The concept of a multi-agent collaboration system is new, and it can be challenging for manufacturing organizations to identify the right use case for this cutting-edge technology. Below, we propose a use case in which three agents collaborate with each other to optimize the performance of a machine.

Multi-agent collaboration use case in manufacturing

In manufacturing operations, leveraging multi-agent collaboration for predictive maintenance can significantly boost operational efficiency. For instance, consider a production environment where three distinct agents—predictive maintenance, process optimization, and quality assurance—collaborate in real time to refine machine operations and keep the factory at peak performance.

In Figure 3, the predictive maintenance agent is focused on machinery maintenance. Its main tasks are to monitor equipment health by analyzing sensor data generated by the machines. It predicts machine failures and recommends maintenance actions to extend machinery lifespan and prevent downtime as much as possible.

Figure 3. A multi-agent system for production optimization.

The process optimization agent is designed to enhance production efficiency. It analyzes production parameters to identify inefficiencies and bottlenecks, and it adjusts those parameters (speed, vibration, etc.) to maintain product quality and production efficiency. This agent also incorporates feedback from the other two agents when deciding which production parameter to tune. For instance, if the predictive maintenance agent flags an anomaly in a milling machine’s temperature readings, such as values trending upward, the process optimization agent can review the cutting speed parameter for adjustment.

The quality assurance agent is responsible for evaluating product quality. It analyzes the optimized production parameters and checks how those parameters might affect the quality of the product being fabricated. It also provides feedback to the other two agents.

The three agents constantly exchange feedback with each other, and this feedback is stored in the MongoDB Atlas database as agent short-term memory, while vector embeddings and sensor data are persisted as long-term memory. MongoDB is an ideal memory provider for agentic AI use case development thanks to its flexible document model, extensive security and data governance features, and horizontal scalability.

All three agents have access to a "search_documents" tool, which leverages Atlas Vector Search to query vector embeddings of machine repair manuals and old maintenance work orders. The predictive maintenance agent uses this tool to surface additional insights while performing machine root cause diagnostics.
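A minimal sketch of what such a search_documents tool could look like with Atlas Vector Search is shown below. The index name, embedding model, and collection layout are assumptions for illustration and may differ from the implementation in the repo.

```python
from pymongo import MongoClient
from openai import OpenAI  # query embeddings must match the model used at ingestion time

db = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["maintenance"]
openai_client = OpenAI()

def search_documents(query: str, limit: int = 5) -> list[dict]:
    """Retrieve repair manual and work order excerpts relevant to the query."""
    query_vector = openai_client.embeddings.create(
        model="text-embedding-3-small",
        input=query,
    ).data[0].embedding
    pipeline = [
        {
            "$vectorSearch": {
                "index": "manuals_vector_index",  # hypothetical Atlas Vector Search index
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": 100,
                "limit": limit,
            }
        },
        {"$project": {"_id": 0, "text": 1, "source": 1,
                      "score": {"$meta": "vectorSearchScore"}}},
    ]
    return list(db.repair_documents.aggregate(pipeline))

# Example: the predictive maintenance agent diagnosing a temperature anomaly.
for doc in search_documents("milling machine spindle overheating root cause"):
    print(doc["source"], "-", doc["text"][:80])
```

Exposing this function as a tool lets each agent ground its reasoning in the same maintenance knowledge base rather than relying on the LLM's parametric memory alone.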
Set up the use case shown in this article using our repo. To learn more about MongoDB’s role in the manufacturing industry, please visit our manufacturing and automotive webpage. To learn more about AI agents, visit our Demystifying AI Agents guide.

February 19, 2025