MongoDB Blog

Announcements, updates, news, and more

Workload Isolation for More Scalability and Availability: Search Nodes Now on Google Cloud

Today we're excited to take the next step in bringing scalable, dedicated architecture to your search experiences with the introduction of Atlas Search Nodes, now in public preview for Google Cloud. This post is also available in: Deutsch, Français, Español, Português, Italiano, 한국어, 简体中文.

Since our initial announcement of Search Nodes in June of 2023, we've been rapidly accelerating access to the most scalable dedicated architecture, starting with general availability on AWS and now expanding to public preview on Google Cloud. We'd like to give you a bit more context on what Search Nodes are and why they're important to any search experience running at scale.

Search Nodes provide dedicated infrastructure for Atlas Search and Vector Search workloads, enabling even greater control over search workloads. They also allow you to isolate and optimize compute resources to scale search and database needs independently, delivering better performance at scale and higher availability.

One of the last things developers want to deal with when building and scaling apps is infrastructure problems. Any downtime or poor user experience can result in lost users or revenue, especially when it comes to your database and search experience. This is one of the reasons developers turn to MongoDB: the ease of having one unified system for your database and search solution. With the introduction of Atlas Search Nodes, we've taken the next step in providing our builders with ultimate control, giving them the ability to remain flexible by scaling search workloads without the need to over-provision the database.

By isolating your search and database workloads while automatically keeping your search cluster data synchronized with operational data, Atlas Search and Atlas Vector Search eliminate the need to run a separate ETL tool, which takes time and effort to set up and is yet another point of failure for your scaling app. This provides superior performance and higher availability while reducing architectural complexity and the engineering time wasted recovering from sync failures. In fact, we've seen a 40% to 60% decrease in query time for many complex queries, while eliminating the chances of any resource contention or downtime.

With just a quick button click, Search Nodes on Google Cloud offer our existing Atlas Search and Vector Search users the following benefits:

- Higher availability
- Increased scalability
- Workload isolation
- Better performance at scale
- Improved query performance

We offer both compute-heavy search-specific nodes for relevance-based text search, as well as a memory-optimized option that is optimal for semantic and retrieval-augmented generation (RAG) production use cases with Atlas Vector Search. This makes resource contention or availability issues a thing of the past.

Search Nodes are easy to opt into and set up. To start, jump into the MongoDB UI and do the following:

1. Navigate to your "Database Deployments" section in the MongoDB UI
2. Click the green "+Create" button
3. On the "Create New Cluster" page, select Google Cloud and enable the radio button for "Multi-cloud, multi-region & workload isolation"
4. Toggle the radio button for "Search Nodes for workload isolation" to enable
5. Select the number of nodes in the text box
6. Check the agreement box
7. Click "Create cluster"

For existing Atlas Search users, click "Edit Configuration" in the MongoDB Atlas Search UI and enable the toggle for workload isolation. Then the steps are the same as noted above. Jump straight into our docs to learn more!

March 28, 2024
Updates

Building AI With MongoDB: How DevRev is Redefining CRM for Product-Led Growth

OneCRM from DevRev is purpose-built for Software-as-a-Service (SaaS) companies. It brings together previously separate customer relationship management (CRM) suites for product management, support, and software development. Built on a foundation of customizable large language models (LLMs), data engineering, analytics, and MongoDB Atlas, it connects end users, sellers, support, product owners, and developers. OneCRM converges multiple discrete business apps and teams onto a common platform. As the company states on its website: "Our mission is to connect makers (Dev) to customers (Rev). When every employee adopts a 'product-thinking' mindset, customer-centricity transcends from a department to become a culture."

DevRev was founded in October 2020 and raised over $85 million in seed funding from investors such as Khosla Ventures and Mayfield. At the time, this made it the largest seed round in the history of Silicon Valley. The company is led by its co-founder and CEO, Dheeraj Pandey, who was previously the co-founder and CEO of Nutanix, and by Manoj Agarwal, DevRev's co-founder and former SVP of Engineering at Nutanix. DevRev is headquartered in Palo Alto and has offices in seven global locations.

Check out our AI resource page to learn more about building AI-powered apps with MongoDB.

CRM + AI: Digging into the stack

DevRev's Support and Product CRM serve over 4,500 customers:

- Support CRM brings support staff, product managers, and developers onto an AI-native platform to automate Level 1 (L1), assist L2, and elevate L3 to become true collaborators.
- Product CRM brings product planning, software work management, and product 360 together so product teams can assimilate the voice of the customer in real time.

Figure 1: DevRev's real-time dashboards empower product teams to detect at-risk customers, monitor product health, track development velocity, and more.

AI is central to both the Support and Product CRMs. The company's engineers build and run their own neural networks, fine-tuned with application data managed by MongoDB Atlas. This data is also encoded by open-source embedding models and used alongside OpenAI models for customer support chatbots and question-answering tasks orchestrated by autonomous agents. MongoDB partner LangChain is used to call the models, while also providing a layer of abstraction that frees DevRev engineers to effortlessly switch between different generative AI models as needed. Data flows across DevRev's distributed microservices estate and into its AI models are powered by MongoDB change streams. Downstream services are notified in real time of any data changes using a fully reactive, event-driven architecture.

MongoDB Atlas: AI-powered CRM on an agile and trusted data platform

MongoDB is the primary database backing OneCRM, managing users, customer and product data, tickets, and more. DevRev selected MongoDB Atlas from the very outset of the company. The flexibility of its data model, freedom to run anywhere, reliability and compliance, and operational efficiency of the Atlas managed service all impact how quickly DevRev can build and ship high-quality features to its customers. The flexibility of the document data model enables DevRev's engineers to handle the massive variety of data structures their microservices need to work with. Documents are large, and each can have many custom fields. To efficiently store, index, and query this data, developers use MongoDB's Attribute pattern and have the flexibility to add, modify, and remove fields at any time.
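To make the Attribute pattern concrete, here is a minimal sketch in mongosh (the collection and field names are illustrative, not DevRev's actual schema): custom fields are stored as key-value pairs in an array, so a single compound index covers queries on any of them.

// A ticket document storing customer-defined custom fields with the Attribute pattern.
// Each attribute is a { k, v } pair, so new fields never require a schema change.
db.tickets.insertOne({
  title: "Checkout page returns 500",
  severity: "high",
  attributes: [
    { k: "region", v: "EMEA" },
    { k: "plan", v: "enterprise" },
    { k: "sla_hours", v: 4 }
  ]
});

// One compound index covers queries on any attribute key/value combination.
db.tickets.createIndex({ "attributes.k": 1, "attributes.v": 1 });

// Find tickets raised by enterprise-plan customers.
db.tickets.find({ attributes: { $elemMatch: { k: "plan", v: "enterprise" } } });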
The freedom to run MongoDB anywhere helps the engineering team develop, test, and release faster. Developers can experiment locally, then move to integration testing, and then production — all running in different environments — without changing a single line of code. This is core to DevRev's velocity in handling over 4,000 pull requests per month:

- Developers can experiment and test with MongoDB on local instances — for example, adding indexes or evaluating new query operators — enabling them to catch issues earlier in the development cycle.
- Once unit tests are complete, developers can move to temporary instances in Docker containers for end-to-end integration testing.
- When ready, teams can deploy to production in MongoDB Atlas.

The multi-cloud architecture of Atlas provides flexibility and choice that proprietary offerings from the hyperscalers can't match. While DevRev today runs on AWS, in the early days of the company they evaluated multiple cloud vendors. Knowing that MongoDB Atlas could run anywhere gave them the confidence to choose the platform, knowing they would not be locked into that choice in the future.

"With MongoDB Atlas, our development velocity is 3-4x higher than if we used alternative databases. We can get our innovations to market faster, providing our customers with even more modern and useful CRM solutions."
Anshu Avinash, Founding Engineer, DevRev

The HashiCorp Terraform MongoDB Atlas Provider automates infrastructure deployments by making it easy to provision, manage, and control Atlas configurations as code. "The automation provided by Atlas and Terraform means we've avoided having to hire a dedicated infrastructure engineer for our database layer," says Anshu. "This is a savings we can redirect into adding developers to work on customer-facing features."

Figure 2: The reactive, event-driven microservices architecture underpinning DevRev's AI-powered CRM platform

Anshu goes on to say, "We have a microservices architecture where each microservice manages its own database and collections. By using MongoDB Atlas, we have little to no management overhead. We never even look at minor version upgrades, which Atlas does for us in the background with zero downtime. Even the major version upgrades do not require any downtime, which is pretty unique for database systems."

Discussing scalability, Anshu says, "As the business has grown, we have been able to scale Atlas, again without downtime. We can move between instance and cluster sizes as our workloads expand, and with auto-storage scaling, we don't need to worry about disks getting full."

DevRev manages critical customer data, and so relies on MongoDB Atlas' native encryption and backup for data protection and regulatory compliance. The ability to provide multi-region databases in Atlas means global customers get further control over data residency, latency, and high availability requirements. Anshu adds, "We also have the flexibility to use MongoDB's native sharding to scale out the workloads of our largest customers with complete tenant isolation."

DevRev is redefining the CRM market through AI, with MongoDB Atlas playing a critical role as the company's data foundation. You can learn more about how innovators across the world are using MongoDB by reviewing our Building AI case studies. If your team is building AI apps, sign up for the AI Innovators Program. Successful companies get access to free Atlas credits and technical enablement, as well as connections into the broader AI ecosystem.
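As a closing illustration of the event-driven pattern described above, where downstream services react to data changes via change streams, here is a minimal Node.js sketch. The database, collection, and pipeline are placeholders, not DevRev's actual code:

const { MongoClient } = require("mongodb");

async function watchTickets(uri) {
  const client = new MongoClient(uri);
  await client.connect();
  const tickets = client.db("crm").collection("tickets");

  // Watch only inserts and updates; fullDocument returns the post-change document.
  const pipeline = [{ $match: { operationType: { $in: ["insert", "update"] } } }];
  const stream = tickets.watch(pipeline, { fullDocument: "updateLookup" });

  for await (const change of stream) {
    // Notify downstream services (search indexing, AI enrichment, analytics, etc.).
    console.log("Change event:", change.operationType, change.documentKey);
  }
}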

March 27, 2024
Artificial Intelligence

Architecting Success as a Woman in Tech

Celia Halenke, Solutions Architect at MongoDB, shares insight into the skills, experiences, and aspirations that shape her MongoDB journey in the dynamic world of technology. Plus, learn about her advice for teams wanting to build more inclusive environments for women in tech sales.

Mastering the balance: Technical and communication skills

In my role, it's all about having a strong blend of technical skills and effective communication. To succeed as a pre-sales Solutions Architect, you need to blend both seamlessly. I've had to learn the ins and outs of MongoDB's technology, but equally important is grasping the unique challenges and objectives my clients face. This allows me to craft solutions that are not just tailored but perfectly aligned with their needs. Communication is just as important. From running demos to conducting workshops with people from diverse backgrounds, clear and concise communication is a must. It's not just about showcasing the technology; it's about ensuring everyone is on the same page. Team collaboration is another vital aspect of my role. Working closely with sales reps, CSMs, product managers, and engineers requires building strong relationships. These connections are not just essential for success but play a significant role in personal growth.

Celia and team members

Fostering inclusivity in tech

Being a woman in tech, I can't stress enough the importance of seeing more women in leadership roles. It's not just about breaking stereotypes; it's about having role models who inspire and motivate. That's why promoting women into leadership is crucial. Mentorship and leadership programs specifically designed for women can make a significant impact, providing the support and guidance needed to thrive in a historically male-dominated tech industry. I'm proud to be part of MongoDB, where employee resource groups for women and other communities create a supportive environment. More companies should consider implementing similar initiatives to foster inclusivity and provide platforms for sharing experiences.

Celebrating success

One of the highlights of my journey at MongoDB has been working closely with the Product Led Sales team. They have recognized me for my efforts for two consecutive quarters, which is a testament to the trust and collaboration I've built within the team. It feels really good! Knowing that my work is valued and appreciated motivates me to keep pushing boundaries. I encourage women to make time to celebrate their accomplishments.

The joys of customer interaction

What I love most about my customer-facing role is the direct interaction with our customers. Understanding their projects and the problems they aim to solve, and then offering them the perfect MongoDB Atlas feature, brings me immense satisfaction. Recently, I had the opportunity to visit clients on-site during a business trip to Latin America. I enjoyed this experience and it changed my perspective on customer interactions: though not as quick as hopping on a video conference, in-person sessions are some of the most engaging.

Celia in Latin America

Aspirations and future growth

Looking ahead, my goal is to continue growing as a Solutions Architect at MongoDB! Embracing the evolving challenges of my role allows me to constantly learn and enhance my communication and technical skills. I aspire to work with larger customers, witnessing firsthand the positive impact MongoDB's applications can have on people's lives. As I gather more experience, I'm eager to take on a leadership role, guiding others in their MongoDB journeys. My journey at MongoDB is a testament to the ever-evolving landscape of technology, where success is not just about technical expertise but also about building meaningful connections, fostering inclusivity, and celebrating every milestone along the way.

Learn more about Sell Like a Girl and MDBWomen, Employee Resource Groups supporting a community of women around the world at MongoDB.

March 26, 2024
Culture

AI-powered SQL Query Converter Tool is Now Available in Relational Migrator

When I traveled to Japan for the first time, it was shortly after translation apps on smartphones had really taken off. Even though I knew enough phrases to get by as a tourist, I was amazed at how empowered I was by being able to have smoother conversations and read signs more easily. The power of AI helped me understand a language I had only a passing familiarity with and drastically improved my experience in another country. I was able to spend more time enjoying myself and less time looking up common words and sentences in a phrase book.

So what does this have to do with application modernization? Transitioning from relational databases as part of a modernization effort is more than migrating data from a legacy database to a modern one. There is all the planning, designing, testing, refactoring, validating, and ongoing operation that makes modernization efforts a complex project to navigate successfully. MongoDB's free Relational Migrator tool has helped with many of these tasks, including schema design, data migration, and code generation, but we know this is just the beginning.

One of the most common challenges of migrating legacy applications to MongoDB is working with SQL queries, triggers, and stored procedures that are often undocumented and must be manually converted to MongoDB Query API syntax. This requires deep knowledge of both SQL and the MongoDB Query API, which is rare if teams are used to using only one system or the other. In addition, teams often have hundreds, if not thousands, of queries, triggers, and stored procedures that must be converted, which is extremely time-consuming and tedious. Doing these conversions manually would be like traveling abroad and looking up each object one by one in a phrase book instead of using a translation app.

Thankfully, with generative AI, we are finally able to get the modern version of the translation app on your phone. The latest release of Relational Migrator is able to use generative AI to help your developers quickly convert existing SQL queries, triggers, and stored procedures to work with MongoDB using your choice of programming language (JavaScript, C#, or Java). By automating the generation of development-ready MongoDB queries, your team can be more efficient by redirecting their time to more important testing and optimization efforts — accelerating your migration project. Teams that are familiar with SQL can also use the Query Converter to help close their MongoDB knowledge gap. The SQL objects they're familiar with are translated, making it easier to learn the new syntax by seeing the two side by side.

Let's take a closer look at how Query Converter can convert a SQL Server stored procedure to work with MongoDB.

Figure 1: The MongoDB Query Converter Dashboard

We'll start by importing the stored procedure from the relational database into our Relational Migrator project. This particular stored procedure joins the results from two tables, performs some arithmetic on some of the columns, and filters the results based on an input parameter.
CREATE PROCEDURE CustOrdersDetail @OrderID int
AS
SELECT ProductName,
    UnitPrice=ROUND(Od.UnitPrice, 2),
    Quantity,
    Discount=CONVERT(int, Discount * 100),
    ExtendedPrice=ROUND(CONVERT(money, Quantity * (1 - Discount) * Od.UnitPrice), 2)
FROM Products P, [Order Details] Od
WHERE Od.ProductID = P.ProductID and Od.OrderID = @OrderID

Developers who are experienced with the MongoDB aggregation framework would know that the equivalent method to join data from two collections is to use the $lookup stage. However, when migrating a relational database to MongoDB, it often makes sense to consolidate data from multiple tables into a single collection. In this example, we are doing exactly that by combining data from the Orders, Order Details, and Products tables into a single orders collection. This means that, when considering the changes to the schema, we do not actually need a $lookup stage at all, as the data from each of the required tables has already been merged into a single collection. Relational Migrator's Query Converter works alongside the schema mapping functionality and automatically adjusts the generated query to work against your chosen schema.

With JavaScript chosen as our target language, the converted query avoids the need for a costly join and includes MongoDB equivalents of our original SQL arithmetic functions. The query is now ready to test and include in our modernized app.

const CustOrdersDetail = async (db, OrderID) => {
  return await db.collection('orders').aggregate([
    { $match: { orderId: OrderID } },
    { $unwind: '$lineItems' },
    {
      $project: {
        ProductName: '$product.productName',
        UnitPrice: { $round: ['$lineItems.unitPrice', 2] },
        Quantity: '$lineItems.quantity',
        Discount: { $multiply: ['$lineItems.discount', 100] },
        ExtendedPrice: {
          $round: [
            {
              $multiply: [
                '$lineItems.quantity',
                { $subtract: [1, '$lineItems.discount'] },
                '$lineItems.unitPrice'
              ]
            },
            2
          ]
        }
      }
    }
  ]).toArray();
};

Relational Migrator does more than just query conversion; it also assists with app code generation, data modeling, and data migration, which drastically cuts down on the time and effort required to modernize your team's applications. Just like a language translation app while traveling abroad, it can drastically improve your experience converting and understanding a new language or technology.

The new Query Converter tool is now available for free for anyone to try as part of a public preview in the Relational Migrator tool. Download Relational Migrator and try converting your SQL queries and stored procedures today.
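For reference, a converted function like the one above can be exercised directly through the Node.js driver; in this hypothetical snippet, the database name ("northwind") and the order ID are placeholders for whatever your migrated data actually contains:

const { MongoClient } = require("mongodb");

async function main() {
  const client = new MongoClient(process.env.MONGODB_URI);
  try {
    // Run the converted query against the migrated orders collection.
    const lineItems = await CustOrdersDetail(client.db("northwind"), 10248);
    console.log(lineItems);
  } finally {
    await client.close();
  }
}

main();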

March 25, 2024
Updates

Transforming Industries with MongoDB and AI: Telecommunications and Media

This is the second in a six-part series focusing on critical AI use cases across the manufacturing and motion, financial services, retail, telecommunications and media, insurance, and healthcare industries. Read part one here.

The telecommunications industry operates in a landscape characterized by tight profit margins, particularly in commoditized communication and connectivity services where differentiation is minimal. With offerings such as voice, data, and internet access being largely homogeneous, telecom companies need to differentiate and diversify revenue streams to create value and stand out in the market. As digital natives disrupt traditional business models with agile and innovative approaches, established companies are not only competing among themselves but also with newcomers to deliver enhanced customer experiences and adapt to evolving consumer demands.

To thrive in an environment where advanced connectivity is increasingly expected, telecom operators must prioritize cost efficiency in their Operations Support Systems (OSS) and Business Support Systems (BSS), elevate customer service standards, and enhance overall customer experiences to secure market share and gain a competitive edge. They're not alone — media publishers, too, must streamline operations through automation while strengthening reader relationships to foster a willingness to pay for personalized and relevant content.

Service assurance

Telecommunications providers need to deliver network services at optimal quality and performance levels to meet customer expectations and service level agreements. Key aspects of service assurance include performance monitoring, quality of service (QoS) management, and predictive analytics to anticipate potential service degradation or network failures before they occur. With the increasing complexity of telecommunications networks and the growing expectations of customers for high-quality, always-on services, a new bar has been set for service assurance, requiring companies to invest heavily in solutions that can automate and optimize these processes to maintain a competitive edge.

Service assurance is revolutionized by artificial intelligence (AI) through several key capabilities:

- Machine learning (ML) can be a powerful foundation for predictive maintenance, analyzing patterns and predicting network failures before they occur, allowing for preemptive maintenance and significantly reducing downtime.
- AI techniques can sift through complex network systems to accurately identify the root causes of issues, improving the effectiveness of troubleshooting efforts.
- With network optimization, AI can analyze log data to identify opportunities for improvement, raising efficiency, reducing operational costs, and optimizing network performance in real time.

MongoDB Atlas's JSON-based document model is the ideal data foundation to underpin intelligent applications. It enables developers to store log data from various systems without the need for time-intensive upfront data normalization efforts and with the flexibility to deal with a wide variety of different data structures, even as they change over time. By vectorizing the data with an appropriate ML model, it's possible to reflect the healthy system state and identify log information that shows abnormal system behavior. Atlas Vector Search allows for conducting the required K-Nearest Neighbors (KNN) search effectively, as a fully included service of the MongoDB Atlas developer data platform.
Finally, using LLMs, information about the error, including an analysis of the root cause, can be expressed in natural language, making it much easier for the maintenance staff in charge to understand and fix the problem.

Fraud detection and prevention

Telecom providers today are utilizing an advanced array of techniques for detecting and preventing fraud, constantly adjusting to the dynamic nature of threat actors. Routine activities for detecting fraud consist of tracking unusual call trends and data usage, along with safeguarding against SIM swap incidents, a method frequently used for identity theft. To prevent fraud, strategies are applied at various levels, starting with stringent verification for new customers, during SIM swaps, or for transactions with elevated risk, taking into account the unique risk profile of each customer.

Machine learning offers telecommunications companies a powerful tool to enhance their fraud detection and prevention capabilities by training ML models on historical data like call detail records (CDRs). Moreover, these algorithms can assess the individual risk profile of each customer, tailoring detection and prevention strategies to their specific patterns of use. The models can adapt over time, learning from new data and emerging fraud tactics, thus enabling real-time detection and the automation of fraud prevention measures, reducing manual checks, and speeding up response times.

To succeed in fraud detection, many data dimensions need to be considered, making reaction time a critical factor in preventing serious harm. So, the solution must also support fast, sub-second decisions. By vectorizing the data with an appropriate ML model, normal (healthy) business activity can be defined and, in turn, deviations from the norm identified, such as suspicious user activities. In addition to Atlas Vector Search, the MongoDB Query API supports stream processing, simplifying data ingestion from various sources and detecting fraud in real time.

Content discovery

Today's media organizations are expected to offer a high degree of content personalization, from streaming services to online publications and more. Viewers want intelligently selected and suggested content tailored to their interests. Using AI can significantly enhance the process of suggesting the next best article to read or show to stream. The most powerful implementations of content personalization track the behavior of the user, such as what content was searched for, how long content was displayed before the next click happened, and the categories the search falls under. Based on these parameters, similar content can be presented or, as an alternative strategy, content from unseen areas of the portal, so the user may discover new types of media and decide if they like them.

To bring the right content to the right people at the right time, an automated system needs to maintain a multitude of information facets, which lay the foundation for proper suggestions. With MongoDB and its document model, all required data points can be easily and flexibly stored in a user's profile, in content, and in media. Ultimately, by vectorizing the content, an even more powerful system of content suggestions can be built with Atlas Vector Search, which allows for a similarity search that goes well beyond comparing just keywords or a list of attributes.
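To make that kind of similarity search concrete, a $vectorSearch aggregation over vectorized content might look like the sketch below. The collection, index name, and field path are placeholders, and queryEmbedding is assumed to be a vector produced by the same embedding model used to encode the content:

// Find the five pieces of content most similar to the user's current interest.
const similarArticles = await db.collection("articles").aggregate([
  {
    $vectorSearch: {
      index: "content_vector_index",  // Atlas Vector Search index on the embedding field
      path: "embedding",
      queryVector: queryEmbedding,
      numCandidates: 200,             // candidates considered before returning the top results
      limit: 5
    }
  },
  {
    $project: {
      title: 1,
      category: 1,
      score: { $meta: "vectorSearchScore" }
    }
  }
]).toArray();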
Other notable use cases

- Differential Pricing: Gather insights into what customers are willing to spend on content or a service by conducting A/B tests and analyzing the data with an ML algorithm. This method facilitates the adoption of dynamic pricing models instead of sticking to a standard price list, thereby enhancing revenue and increasing the paying customer base.
- Content Summarization and Reformatting: Design a smart assistant tailored for writers, capable of providing automatic suggestions for content summaries, identifying suitable SEO keywords, and adapting articles for various specific audiences.
- Search Generative Experiences (SGE): Provide more dynamic, personalized, and contextually relevant search results, thus making information retrieval not only more efficient but also more engaging and useful. This can include personalization and summarization elements as well.

In conclusion, the telecommunications industry faces challenges of differentiation and revenue diversification amidst commoditized services and disruptive market forces. To thrive, telecom operators must prioritize cost efficiency, elevate customer service, and enhance experiences. Leveraging AI, MongoDB Atlas offers solutions for service assurance, fraud detection, and content discovery, empowering companies to navigate the complexities of the digital landscape, innovate, and deliver value-added services. From predictive maintenance to personalized content recommendations, MongoDB Atlas stands as a foundational tool for telecom and media companies, driving efficiency, agility, and competitiveness in a rapidly evolving market.

Learn more about AI use cases for top industries in our new white paper, "How Leading Industries are Transforming with AI and MongoDB Atlas."

March 22, 2024
Artificial Intelligence

Introducing Semantic Caching and a Dedicated MongoDB LangChain Package for Gen AI Apps

We are in an unprecedented time in history where developers can build transformative AI applications quickly, without being AI experts themselves. This ability is enabling new classes of applications that can better serve customers with conversational AI for assistance and automation, advanced reasoning and analysis using AI-powered retrieval, and recommendation systems. Behind this revolution are large language models (LLMs) that can be prompted to solve a wide range of use cases. However, LLMs have various limitations, like knowledge cutoffs and a tendency to hallucinate. To overcome these limitations, they must be integrated with proprietary enterprise data sources to build reliable, relevant, and high-quality generative AI applications.

That's where MongoDB plays a critical role in the modern generative AI stack. Developers use MongoDB Atlas Vector Search as a vital part of the generative AI technique known as retrieval-augmented generation (RAG). RAG is the process of feeding LLMs the supplementary data necessary to ground their responses, ensuring they're dependable and precise. LangChain has been a critical part of this journey since the public launch of Atlas Vector Search, enabling developers to build better retriever systems powered by vector search and store conversation history in the operational database.

Today, we are excited to announce support for two enhancements:

- Semantic cache powered by Atlas Vector Search, which improves the performance of your apps
- A dedicated LangChain-MongoDB package for Python and JS/TS developers, enabling them to build advanced applications even more efficiently

The MongoDB Atlas integration with LangChain can now power all the database requirements for building modern generative AI applications: vector search, semantic caching (currently only available in Python), and conversation history. Earlier, we announced the launch of MongoDB LangChain Templates, which enable developers to quickly deploy RAG applications, and provided a reference implementation of a basic RAG template using MongoDB Atlas Vector Search and OpenAI, as well as a more advanced Parent-document Retrieval RAG template using MongoDB Atlas Vector Search. We are excited about our partnership with LangChain and will continue innovating.

Improve LLM application performance with semantic cache

Semantic cache improves the performance of LLM applications by caching responses based on the semantic meaning or context within the queries themselves. This is different from a traditional cache, which works on exact keyword matching. In the era of LLMs, the value of semantic caching is increasing tremendously, enabling sophisticated user experiences that closely mimic human interactions. For example, if two different users enter the prompts "give me suggestions for a comedy movie" and "recommend a comedy movie", the semantic cache can understand that the intent behind the queries is the same and return a similar response, even though different keywords are used, whereas a traditional cache would fail.

Figure 1: Semantic cache using MongoDB Atlas Vector Search

Check out this video walkthrough for the semantic cache:

Accelerate development with a dedicated package

With a dedicated LangChain-MongoDB package, MongoDB is even more deeply integrated with LangChain. The Python and JavaScript packages contain the following LangChain integrations: MongoDBAtlasVectorSearch (Vector stores) and MongoDBChatMessageHistory (Chat Messages Memory).
In addition, the Python package includes MongoDBAtlasSemanticCache (LLM Caching). The new package langchain-mongodb contains all the MongoDB-specific implementations and needs to be installed separately from langchain, which includes all the core abstractions. Previously, everything was in the same package, making it challenging to version it correctly and to communicate which version should be used and whether any breaking changes were made.

Find out more about the langchain-mongodb package:

- Python: Source code, LangChain docs, MongoDB docs
- JavaScript: Source code, LangChain.js docs, MongoDB docs

Get started today

Check out this accompanying tutorial and notebook on building advanced RAG with MongoDB and LangChain, which contains a walkthrough and use cases for using semantic cache, vector search, and chat message history. Check out the "PDFtoChat" app to see langchain-mongodb JS in action. It allows you to have a conversation with your proprietary PDFs using AI and is built with MongoDB Atlas, LangChain.js, and TogetherAI. It's an end-to-end SaaS-in-a-box app and includes user authentication, saving PDFs, and saving chats per PDF. You can also read this excellent overview of semantic caching using LangChain and MongoDB.
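For JavaScript/TypeScript developers, a minimal sketch of using the new package's vector store integration looks roughly like the following. The connection string, database, collection, index name, and field names are placeholders, so check the LangChain.js and MongoDB docs linked above for the current options:

import { MongoClient } from "mongodb";
import { MongoDBAtlasVectorSearch } from "@langchain/mongodb";
import { OpenAIEmbeddings } from "@langchain/openai";

const client = new MongoClient(process.env.MONGODB_ATLAS_URI);
const collection = client.db("rag_demo").collection("docs");

// Wrap an Atlas collection as a LangChain vector store.
const vectorStore = new MongoDBAtlasVectorSearch(new OpenAIEmbeddings(), {
  collection,
  indexName: "vector_index",  // Atlas Vector Search index on the embedding field
  textKey: "text",
  embeddingKey: "embedding",
});

// Retrieve the chunks most semantically similar to a question.
const results = await vectorStore.similaritySearch("How do I rotate my API keys?", 4);
console.log(results.map((doc) => doc.pageContent));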

March 20, 2024
Updates

New Data Modeling Learning Path and Certification

This post is also available in: Deutsch, Français, Español, Português, Italiano, 한국어, 简体中文.

Data modeling is a crucial part of software development. It's what gives structure to the data so that it can be analyzed, used to make decisions, and built into useful applications. It can be complex and challenging, especially when you're juggling performance and maintainability. But here's the good news: a well-designed data model can boost your app's performance. That's why we've put together a new learning path and certification to help you level up your data modeling skills. Whether you're a seasoned developer looking to deepen your understanding or a newcomer eager to learn, this content is designed to give you the tools you need to create efficient and effective data models. Let's dive in and start building better apps together.

Deep dive into data modeling with MongoDB

The new free, online MongoDB Data Modeling Path will help you build your knowledge from the ground up and prepare you for the certification exam. The curated sequence of videos, hands-on labs, and quizzes provides you with a guided journey to learning data modeling at your own pace. The learning path will show you a step-by-step method for creating effective data models. You'll learn how to identify entities and workloads, map relationships, and decide when to embed or reference them. In addition, you'll learn different schema design patterns, how to recognize and address MongoDB anti-patterns, and the basics of schema lifecycle management. And, here's a little something extra: once you complete the learning path, you'll automatically score a 50% discount on the new certification!

Learn fast, certify faster

With an extensive array of new educational resources on data modeling for MongoDB, we are thrilled to introduce our new credential, the MongoDB Associate Data Modeler certification, which will allow you to validate and showcase your expertise in the field. Our certifications are officially recognized by professional institutions, validating and acknowledging your MongoDB expertise. They're a valuable asset for advancing in your role and enhancing your marketability for future positions. Certified individuals earn bragging rights, inclusion in the Credly Talent Directory, and a distinguished Credly badge, making it easy to share your achievements.

The MongoDB Associate Data Modeler certification is designed for experienced users. Candidates should have some familiarity with JSON and the MongoDB Query API, including aggregations and data modeling on MongoDB. You should also understand the tradeoffs between the simplicity and the performance of data modeling techniques. This certification attests to your competency in designing, building, and evolving effective data models in MongoDB, and incorporating data governance into your work.

Explore MongoDB's new learning path and certification designed to elevate and validate your data modeling skills.
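As a small taste of the embed-versus-reference decision covered in the learning path, here is an illustrative pair of document shapes in mongosh (the domain and field names are made up for the example):

// Embed when related data is read together and bounded in size:
// comments live inside the post they belong to.
db.posts.insertOne({
  title: "Designing schemas around your workload",
  author: "casey",
  comments: [
    { user: "sam", text: "Great overview!", postedAt: new Date() }
  ]
});

// Reference when related data is large, unbounded, or shared:
// each order points to its customer document by _id.
db.orders.insertOne({
  customerId: ObjectId("65f1c0e2a4b5c6d7e8f90123"),
  items: [{ sku: "A-100", qty: 2 }],
  total: 59.98
});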

March 20, 2024
News

Transforming Industries with MongoDB and AI: Manufacturing and Motion

This is the first in a six-part series focusing on critical AI use cases across several industries. The series covers the manufacturing and motion, financial services, retail, telecommunications and media, insurance, and healthcare industries.

The integration of artificial intelligence (AI) within the manufacturing and automotive industries has transformed the conventional value chain, presenting a spectrum of opportunities. Leveraging Industrial IoT, companies now collect extensive data from assets, paving the way for analytical insights and unlocking novel AI use cases, including enhanced inventory management and predictive maintenance.

Inventory management

Efficient supply chains can control operational costs and ensure on-time delivery to their customers. Inventory optimization and management is a key component in achieving these goals. Managing and optimizing inventory levels, planning for fluctuations in demand, and, of course, cutting costs are all imperative goals. However, efficient inventory management presents manufacturers with complex data challenges too, primarily in forecasting demand accurately and optimizing stock levels. This is where AI can help.

Figure 1: Gen AI-enabled demand forecasting with MongoDB Atlas

AI algorithms can be used to analyze complex datasets to predict future demand for products or parts. Improvement in demand forecasting accuracy is crucial for maintaining optimal inventory levels. AI-based time series forecasting can assist in adapting to rapid changes in customer demand. Once the demand is known, AI can play a pivotal role in stock optimization. By analyzing historical sales data and market trends, manufacturers can determine the most efficient stock levels and even reduce human error. On top of all this existing potential, generative AI can help with generating synthetic inventory data and seasonally adjusted demand patterns. It can also help with creating scenarios to simulate supply chain disruptions.

MongoDB Atlas makes this process simple. At the warehouse, the inventory can be scanned using a mobile device. This data is persisted in Atlas Device SDK and synced with Atlas using Device Sync, which is used by MongoDB customers like Grainger. Atlas Device Sync provides an offline-first, seamless mobile experience for inventory tracking, making sure that inventory data is always accurate in Atlas. Once data is in Atlas, it can serve as the central repository for all inventory-related data. This repository becomes the source of data for inventory management AI applications, eliminating data silos and improving visibility into overall inventory levels and movements. Using Atlas Vector Search and generative AI, manufacturers can easily categorize products based on their seasonal attributes, cluster products with similar seasonal demand patterns, and provide context to the foundation model to improve the accuracy of synthetic inventory data generation.

Predictive maintenance

The most basic approach to maintenance today is reactive — assets are deliberately allowed to operate until failures actually occur. The assets are maintained as needed, making it challenging to anticipate repairs. Preventive maintenance, by contrast, allows systems or components to be replaced based on a conservative schedule to prevent commonly occurring failures — although preventive maintenance is expensive to implement due to the frequent replacement of parts before end-of-life.

Figure 2: Audio-based anomaly detection with MongoDB Atlas. Scan the QR code to try it out yourself.
AI offers a chance to efficiently implement predictive maintenance using data collected from IoT sensors on machinery, with models trained to detect anomalies. ML/AI algorithms like regression models or decision trees are trained on the preprocessed data, deployed on-site for inference, and used to continuously analyze sensor data. When anomalies are detected, alerts are generated to notify maintenance personnel, enabling proactive planning and execution of maintenance actions to minimize downtime and optimize equipment reliability and performance. A retrieval-augmented generation (RAG) architecture can be deployed to generate or curate the data preprocessor, removing the need for specialized data science knowledge; the domain expert can provide the right prompts for the large language model. Once a maintenance alert is generated by an AI model, generative AI can come in again to suggest a repair strategy, taking spare parts inventory data, the maintenance budget, and personnel availability into consideration. Finally, the repair manuals can be vectorized and used to power a chatbot application that guides the technician in performing the actual repair.

MongoDB documents are inherently flexible while allowing data governance when required. Since machine health prediction models require not just sensor data but also maintenance history and inventory data, the document model is a perfect fit to model such disparate data sources. During the maintenance and support process of a physical product, information such as product information and replacement parts documentation must be available and easily accessible to support staff. Full-text search capabilities provided by Atlas Search can be integrated with the support portal to help staff retrieve information from Atlas clusters with ease.

Atlas Vector Search is a foundational element for effective and efficient predictive maintenance models. Manufacturers can use MongoDB Atlas to explore ways of simplifying machine diagnostics. Audio files can be recorded from machines, then vectorized and searched to retrieve similar cases. Once the cause is identified, they can use RAG to implement a chatbot interface that the technician can interact with to get context-aware, step-by-step guidance on how to perform the repair.

Autonomous driving

With the rise of connected vehicles, automotive manufacturers have been compelled to transform their business models into software-first organizations. The data generated by connected vehicles is used to create better driver assistance systems, paving the way for autonomous driving applications. However, it is challenging to create fully autonomous vehicles that can drive more safely than humans. Some experts estimate that the technology to achieve level 5 autonomy is about 80% developed — but the remaining 20% will be extremely hard to achieve and will take a lot of time to perfect.

Figure 3: MongoDB Atlas's role in autonomous driving

AI-based image and object recognition in automotive applications faces uncertainties, but manufacturers must utilize data from radar, LiDAR, cameras, and vehicle telemetry to improve AI model training. Modern vehicles act as data powerhouses, constantly gathering and processing information from onboard sensors and cameras, generating significant volumes of data. Robust storage and analysis capabilities are essential to manage this data, while real-time analysis is crucial for making instantaneous decisions to ensure safe navigation. MongoDB can play a significant role in addressing these challenges.
The document model is an excellent way to accommodate diverse data types such as sensor readings, telematics, maps, and model results. New fields can be added to documents at run time, enabling developers to easily add context to the raw telemetry data. MongoDB's ability to handle large volumes of unstructured data makes it suitable for the constant influx of vehicle-generated information. Atlas Search provides a performant search engine that allows data scientists to iterate on their perception AI models. Finally, Atlas Device Sync can be used to send configuration updates to the vehicle's advanced driving assistance system.

Other notable use cases

AI plays a critical role in fulfilling the promise of Industry 4.0. Numerous other AI use cases can be enabled by MongoDB Atlas, some of which include:

- Logistics Optimization: AI can help optimize routes, resulting in reduced delays and enhanced efficiency in day-to-day delivery operations.
- Quality Control and Defect Detection: Computer or machine vision can be used to identify irregularities in products as they are manufactured. This ensures that product standards are met with precision.
- Production Optimization: By analyzing time series data from sensors installed on production lines, waste can be identified and reduced, thereby improving throughput and efficiency.
- Smart After-Sales Support: Manufacturers can utilize AI-driven chatbots and predictive analytics to offer proactive maintenance, troubleshooting, and personalized assistance to customers.
- Personalized Product Recommendations: AI can be used to analyze user behavior and preferences to deliver personalized product recommendations via a mobile or web app, enhancing customer satisfaction and driving sales.

The integration of AI in the manufacturing and automotive industries has revolutionized traditional processes, offering a plethora of opportunities for efficiency and innovation. With industrial IoT and advanced analytics, companies can now harness vast amounts of data to enhance inventory management and predictive maintenance. AI-driven demand forecasting ensures optimal stock levels, while predictive maintenance techniques minimize downtime and optimize equipment performance. Moreover, as automotive manufacturers work toward autonomous driving, AI-powered image recognition and real-time data analysis become paramount. MongoDB Atlas emerges as a pivotal solution, providing flexible document modeling and robust storage capabilities to handle the complexities of Industry 4.0. Beyond the manufacturing and automotive sectors, the potential of AI enabled by MongoDB Atlas extends to logistics optimization, quality control, production efficiency, smart after-sales support, and personalized customer experiences, shaping the future of Industry 4.0 and beyond.

Learn more about AI use cases for top industries in our new white paper, "How Leading Industries are Transforming with AI and MongoDB Atlas."

March 19, 2024
Artificial Intelligence

Announcing Search Index Management in MongoDB Compass

You can now create and manage Atlas Search and Atlas Vector Search indexes on the interface many of you know and love: MongoDB Compass. Seamlessly build full-text and semantic search applications on top of your Atlas database, delivering swift and relevant results for a range of use cases including e-commerce sites, customer support chatbots, recommendation systems, and more. Gone are the days of juggling multiple tools to bring your search queries to fruition. And, with a variety of templates to choose from, Compass simplifies learning search index syntax so you can focus on what's most important to you: building exceptional end-user experiences on top of your search queries.

Try it out

To get started, connect to an Atlas cluster from Compass. If you don't have one, sign up. From there, simply navigate to Compass' Indexes tab and select Create Search Index. It's easy to build your first search index using one of our templates. Select either Search or Vector Search, and use the appropriate template. In this example, we're going to create a Vector Search index. Once you're satisfied with your index definition, click Aggregate to start testing out your pipeline in Compass. Compass' new search index experience leads you to results in just three guided steps, all without leaving the comfort of Compass.

To learn more about search indexing in Compass, visit our documentation. If you have feedback about Compass' search index experience, let us know on our feedback forum. Happy indexing!
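For reference, here is roughly what a minimal Atlas Vector Search index definition looks like, whether you build it from a Compass template or elsewhere. The field path, number of dimensions, and similarity function are placeholders that must match your documents and embedding model:

{
  "fields": [
    {
      "type": "vector",
      "path": "embedding",
      "numDimensions": 1536,
      "similarity": "cosine"
    }
  ]
}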

March 18, 2024
Updates

From Relational Databases to AI: An Insurance Data Modernization Journey

Imagine you're a data architect, a developer, or a data engineer at an insurance company. Management has asked you and your team to build a new AI claim adjustment system, a customer-facing LLM-powered chatbot, and an application to streamline the underwriting process. However, doing so is far from straightforward due to the challenges you face on a daily basis. The bulk of your time is spent navigating your company's outdated legacy systems, which were built in the 1970s and 1980s. Some of these legacy platforms were written in COBOL and CICS, and today very few people on your team know how to develop and maintain those technologies. Moreover, the data models you work with are another source of frustration. Every interaction with them is a reminder of the intricate structures that have evolved over time, making data manipulation and analysis a nightmare. In sum, legacy systems are preventing your team—and your company—from innovating and keeping up with both your industry and customer demands.

Whether you're trying to modernize your legacy systems to improve operational efficiency, to boost developer productivity, or to build AI-powered apps that integrate with large language models (LLMs), MongoDB has a solution for that. In this post, we'll walk you through a journey that starts with a relational data model refactored into MongoDB collections, moves on to vectorization and querying of unstructured data, and ends with retrieval-augmented generation (RAG): asking large language models (LLMs) questions about your data in natural language.

Identifying, modernizing, and storing the data

Our journey starts with an assessment of the data sources we want to work with. As shown below, we can bucket the data into three different categories:

- Structured legacy data: Tables of claims, coverages, billings, and more. Is your data locked in rigid relational schemas? This tutorial is a step-by-step guide on how to migrate a real-life insurance relational model with the help of MongoDB Relational Migrator, refactoring 21 tables into only five MongoDB collections.
- Structured data (JSON): You might have files of policies, insurance products, or forms in JSON format. Check out our docs to learn how to insert those into a MongoDB collection.
- Unstructured data (PDFs, audio, images, etc.): If you need to create and store a numerical representation (vector embedding) of, for instance, claim-related photos of accidents or PDFs of policy guidelines, have a look at this blog, which walks you through the process of generating embeddings of pictures of car crashes and persisting them alongside existing fields in a MongoDB collection.

Figure 1: Storing different types of data into MongoDB

Regardless of the original format or source, our data has finally landed in MongoDB Atlas in what we call a Converged AI Data Store: a platform that centrally integrates and organizes enterprise data, including vectors, to enable the development of ML- and AI-powered applications.

Accessing, experimenting, and interacting with the data

It's time to put the data to work. The Converged AI Data Store unlocks a plethora of use cases and efficiency gains, both for the business and for developers. The next step of the journey is about the different ways we can interact with our data:

- Database and Full Text Search: Learn how to run database queries, starting from the basics and moving up to advanced features such as facets, fuzzy search, autocomplete, highlighting, and more with Atlas Search (a small fuzzy-search sketch appears at the end of this post).
- Vector Search: We can finally leverage unstructured data. The Image Search blog we mentioned earlier also explains how to create a Vector Search index and run vector queries against embeddings of photos.
- RAG: Combining Vector Search and the power of LLMs, it is possible to interact with our data in natural language (see Figure 2 below), asking complex questions and getting detailed answers. Follow this tutorial to become a RAG expert.

Figure 2: Retrieval-augmented generation (RAG) diagram where we dynamically combine our custom data with the LLM to generate reliable and relevant outputs

Having explored all the different ways we can ask questions of the data, we made it to the end of our journey. You are now ready to modernize your company's systems and finally be able to keep up with the business's demands. What will you build next?

If you would like to discover more about Converged AI and Application Data Stores with MongoDB, take a look at the following resources:

- AI, Vectors, and the Future of Claims Processing: Why Insurance Needs to Understand The Power of Vector Databases
- Build a ML-Powered Underwriting Engine in 20 Minutes with MongoDB and Databricks
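As promised above, here is a small Atlas Search sketch showing fuzzy matching and highlighting over a hypothetical claims collection; the index name, fields, and settings are illustrative:

// Fuzzy text search tolerates typos such as "wind sheild" in claim descriptions.
const claims = await db.collection("claims").aggregate([
  {
    $search: {
      index: "default",
      text: {
        query: "wind sheild damage",
        path: ["description", "notes"],
        fuzzy: { maxEdits: 1 }
      },
      highlight: { path: "description" }
    }
  },
  {
    $project: {
      claimNumber: 1,
      description: 1,
      score: { $meta: "searchScore" },
      highlights: { $meta: "searchHighlights" }
    }
  },
  { $limit: 10 }
]).toArray();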

March 14, 2024
Applied

Using Generative AI and MongoDB to Tackle Cybersecurity’s Biggest Challenges

This post is also available in: Deutsch, Français, Español, Português, Italiano, 한국어, 简体中文.

In the ever-evolving landscape of cybersecurity, organizations face a multitude of challenges that demand innovative solutions harnessing cutting-edge technologies. One of the most pressing issues is the increasing sophistication of cyber threats, including malware, ransomware, and phishing attacks, which are becoming more difficult to detect and mitigate. Additionally, the rapid expansion of digital infrastructures has widened the attack surface, making it harder for security teams to monitor and protect every entry and egress point. Another significant challenge is the shortage of skilled cybersecurity professionals — estimated by independent surveys to number around 4 million staff worldwide 1 — which leaves many organizations vulnerable to attack. These challenges underscore the need for advanced technologies that can augment human efforts to secure digital assets and data.

How can generative AI help?

Generative AI (gen AI) has emerged as a powerful tool in addressing these cybersecurity challenges. By leveraging large language models (LLMs) to generate new data or patterns based on existing datasets, generative AI can provide innovative solutions in several key areas.

Enhanced threat detection and response

Generative AI can be used to create simulations of cyber threats, including sophisticated malware and phishing attacks. These simulations can help in training machine learning models to detect new and evolving threats more accurately. Furthermore, gen AI can aid in the development of automated response systems that react to threats in real time. While this will never eliminate the need for human oversight, it will reduce the need for manual intervention and toil, allowing for quicker mitigation of attacks. For example, with the appropriate oversight it can automatically apply patches to vulnerable systems or adjust firewall rules to block attack vectors. This automated rapid response capability is particularly valuable in mitigating zero-day vulnerabilities, where the window between the discovery of a vulnerability and its exploitation by attackers can be very short.

Actionable learnings from security event postmortems

In the aftermath of a cybersecurity incident, conducting a thorough postmortem analysis is crucial for understanding what happened, why it happened, and how similar events can be prevented in the future. Generative AI can play a pivotal role in this process by synthesizing and summarizing complex data from a multitude of sources, including logs, network traffic, and security alerts. By analyzing this data, gen AI can identify patterns and anomalies that may have contributed to the security breach, offering insights that might be overlooked by human analysts due to the sheer volume and complexity of the information. Furthermore, it can generate comprehensive reports that highlight key findings, causative factors, and potential vulnerabilities, streamlining the postmortem process. This capability not only accelerates the recovery and learning process but also enables organizations to implement more effective remediation strategies, ultimately strengthening their cybersecurity posture.

Generating synthetic data for deep model training

The shortage of real-world data for training cybersecurity systems is a significant hurdle. Gen AI can create realistic, synthetic data sets that mirror genuine network traffic and user behavior without exposing sensitive information.
Automating phishing detection

Phishing remains one of the most common attack vectors. Gen AI can analyze patterns in phishing emails and websites, generating models that predict and detect phishing attempts with high accuracy. By integrating these models into email systems and web browsers, organizations can automatically filter out phishing content, protecting users from potential threats.

Putting it all together: The opportunities and the risks

Generative AI holds the promise of transforming cybersecurity practices by automating complex processes, enhancing threat detection and response, and providing a deeper understanding of cyber threats. As the industry continues to integrate gen AI into cybersecurity strategies, it's crucial to remain vigilant about the ethical use of this technology and the potential for misuse. Nevertheless, the benefits it offers in strengthening digital defenses are undeniable, making it an invaluable asset in the ongoing battle against cyber threats.

How does MongoDB help?

With MongoDB, your development teams can build and deploy robust, correct, and differentiated real-time cyber defenses faster, and at any scale. To understand how MongoDB does this, consider that the AI technology stack comprises three layers:

The underlying compute (GPUs) and LLMs
The tooling to fine-tune models, along with the tooling for in-context learning and inference against the trained models
The AI applications and related end-user experiences

MongoDB operates at the second layer of the stack. It enables customers to bring their own proprietary data to any LLM running on any computing infrastructure to build gen AI-powered cybersecurity applications. MongoDB does this by addressing the hardest problems in adopting gen AI for cybersecurity. MongoDB Atlas securely unifies operational data, unstructured data, and vector data in a single, fully managed multi-cloud platform, avoiding the need to copy and sync data between different systems. MongoDB's document-based architecture also allows development teams to easily model relationships between application data and vector embeddings, enabling deeper and faster analytics and insights against security-related data.

Figure 1: MongoDB Atlas brings together all of the data services needed to build modern cybersecurity applications in a unified API and developer data platform.

MongoDB's open architecture is integrated with a rich ecosystem of AI developer frameworks, LLMs, and embedding providers. This, combined with our industry-leading multi-cloud capabilities, gives your development teams the flexibility to move quickly and avoid lock-in to any particular cloud provider or AI technology in this rapidly evolving space. Check out our AI resource page to learn more about building AI-powered apps with MongoDB.
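To illustrate what keeping operational fields and vector embeddings in the same document can look like, here is a minimal sketch. It assumes a hypothetical security.incidents collection, an Atlas Vector Search index named incident_embedding_index on the embedding field, and an embed() helper standing in for whatever embedding model you use; none of these names come from a specific product setup.

```python
# Sketch: store an incident report alongside its vector embedding, then use
# Atlas Vector Search ($vectorSearch) to find similar past incidents.
from pymongo import MongoClient


def embed(text: str) -> list[float]:
    # Placeholder: swap in a real embedding model (sentence-transformer, hosted API, etc.).
    raise NotImplementedError("plug in your embedding model here")


client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")  # placeholder URI
incidents = client["security"]["incidents"]

report = {
    "title": "Credential stuffing attempt against VPN gateway",
    "severity": "high",
    "source_ips": ["203.0.113.7", "203.0.113.9"],
    "summary": "Burst of failed logins followed by a successful login from a new ASN.",
}
report["embedding"] = embed(report["summary"])  # operational data and vector live in one document
incidents.insert_one(report)

# Retrieve the five most similar historical incidents for an analyst's question.
question = "suspicious login spikes on remote access infrastructure"
similar = incidents.aggregate([
    {
        "$vectorSearch": {
            "index": "incident_embedding_index",  # Atlas Vector Search index on "embedding"
            "path": "embedding",
            "queryVector": embed(question),
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"title": 1, "severity": 1, "score": {"$meta": "vectorSearchScore"}}},
])
for doc in similar:
    print(doc["title"], doc["severity"], round(doc["score"], 3))
```

The same retrieval step slots directly into a retrieval-augmented generation flow: the matched incident summaries become the context an LLM uses to draft a postmortem or answer an analyst's question.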
Applying gen AI and MongoDB to real-world cybersecurity applications

Threat intelligence

ExTrac utilizes AI-powered analytics and MongoDB Atlas to predict public safety risks by analyzing data from thousands of sources. The platform initially helped Western governments foresee conflicts but is expanding to enterprises for reputational management and more. MongoDB's document data model allows ExTrac to manage complex data efficiently, enhancing real-time threat identification. Atlas Vector Search aids in augmenting language models and managing vector embeddings for texts, images, and videos, speeding up feature development. This approach enables ExTrac to efficiently model trends, track evolving narratives, and predict risk for its customers, leveraging the flexibility and power of MongoDB to handle data of any shape and structure. Learn more in our ExTrac case study.

Cybersec assessments

VISO TRUST leverages AI to streamline the assessment of third-party cyber risks, making complex vendor security information quickly accessible for informed decision-making. Utilizing Amazon Bedrock and MongoDB Atlas, VISO TRUST's platform automates the due diligence of vendor security, significantly reducing the workload for security teams. Its AI-powered approach involves artifact intelligence that classifies security documents, detects organizations, and predicts security control locations within artifacts. MongoDB Atlas hosts text embeddings for a dense retrieval system that enhances the accuracy of LLMs through retrieval-augmented generation (RAG), providing instant, actionable security insights. This innovative use of technology enables VISO TRUST to offer rapid, scalable cyber risk assessments, with significant reductions in work and time for enterprises like InstaCart and Upwork. MongoDB's flexible document database and Atlas Vector Search play critical roles in managing and querying the vast amounts of data, supporting VISO TRUST's mission to deliver comprehensive cyber risk intelligence. Learn more in our Viso Trust case study.

Steps to get started

Generative AI, powered by LLMs augmented with your own operational data encoded as vector embeddings, is opening up many new possibilities in cybersecurity. If you want to learn more about the technology and its possibilities, take a look at our Atlas Vector Search learning byte. In just 10 minutes you'll get an overview of different use cases and how to get started.

1 Hill, M. (2023, April 10). Cybersecurity workforce shortage reaches 4 million despite significant recruitment drive. CSO.

March 13, 2024
Artificial Intelligence

How MongoDB Enables Digital Twins in the Industrial Metaverse

The integration of MongoDB into the metaverse marks a pivotal moment for the manufacturing industry, unlocking innovative use cases across design and prototyping, training and simulation, and maintenance and repair. MongoDB's powerful capabilities — combined with augmented reality (AR) and virtual reality (VR) technologies — are reshaping how manufacturers approach these critical aspects of their operations, while also enabling the realization of innovative product features.

But first: What is the metaverse, and why is it so important to manufacturers? We often use the term "digital twin" to refer to a virtual replica of the physical world, commonly used for simulations and documentation. The metaverse goes one step further: it is not only a virtual representation of a physical device or a complete factory, but one that reacts and changes in real time to reflect the physical object's condition. The advent of the industrial metaverse over the past decade has given manufacturers an opportunity to embrace a new era of innovation, one that can enhance collaboration, visualization, and training. The industrial metaverse is also a virtual environment that allows geographically dispersed teams to work together in real time. Overall, the metaverse transforms the way individuals and organizations interact to produce, purchase, sell, consume, educate, and work together. This paradigm shift is expected to accelerate innovation and affect everything from design to production across the manufacturing industry. Here are some of the ways the metaverse — powered by MongoDB — is having an impact on manufacturing.

Design and prototyping

Design and prototyping processes are at the core of manufacturing innovation. Within the metaverse, engineers and designers can collaborate seamlessly using VR, exploring virtual spaces to refine and iterate on product designs. MongoDB's flexible, document-oriented structure ensures that complex design data, including 3D models and simulations, is efficiently stored and retrieved. This enables real-time collaboration, accelerating the design phase while maintaining the precision required for manufacturing excellence.

Training and simulation

Connecting a digital twin to physical assets enables training beyond traditional methods and provides immersive simulations in the metaverse that enhance skill development for manufacturing professionals. VR training, powered by MongoDB's capacity to manage diverse data types — such as time series, key-value pairs, and events — enables realistic simulations of manufacturing environments. This approach allows workers to gain hands-on experience in a safe virtual space, preparing them for real-world challenges without affecting production cycles. Gamification is also one of the most effective ways to learn new things. MongoDB's scalability ensures that training data, including performance metrics and user feedback, is handled efficiently as training modules grow and the volume of data continues to increase.
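As a rough illustration of handling those diverse data types side by side, the sketch below creates a MongoDB time series collection for telemetry streamed from a digital twin and stores a discrete VR training event next to it. The database, collection, and field names are illustrative assumptions, not a prescribed schema.

```python
# Sketch: a time series collection for digital-twin telemetry plus an ordinary
# collection for discrete training/simulation events, side by side in one database.
from datetime import datetime, timezone

from pymongo import MongoClient
from pymongo.errors import CollectionInvalid

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")  # placeholder URI
db = client["industrial_metaverse"]

# Time series collections (MongoDB 5.0+) store high-frequency sensor readings efficiently.
try:
    db.create_collection(
        "machine_telemetry",
        timeseries={"timeField": "ts", "metaField": "machine", "granularity": "seconds"},
    )
except CollectionInvalid:
    pass  # collection already exists

db.machine_telemetry.insert_one({
    "ts": datetime.now(timezone.utc),
    "machine": {"line": "assembly-3", "asset_id": "robot-17"},
    "spindle_rpm": 1480,
    "temperature_c": 63.2,
})

# Discrete events, such as a VR training run, live in an ordinary collection alongside it.
db.training_sessions.insert_one({
    "trainee": "j.doe",
    "scenario": "emergency stop drill",
    "score": 0.92,
    "completed_at": datetime.now(timezone.utc),
})
```

Both collections live in the same database and are queried through the same driver and aggregation framework, which keeps the digital twin's sensor history and its training records in one place.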
Maintenance and repair

Maintenance and repair operations are streamlined through AR applications within the metaverse. The incorporation of AR and VR technologies into manufacturing processes amplifies the user experience, making interactions more intuitive and immersive. Technicians equipped with AR devices can access real-time information overlaid onto physical equipment, providing step-by-step guidance for maintenance and repairs. MongoDB's support for large volumes of diverse data types, including multimedia and spatial information, ensures a seamless integration of AR and VR content. This not only enhances the visual representation of data from the digital twin and the physical asset but also provides a comprehensive platform for managing the vast datasets generated during AR and VR interactions within the metaverse. Additionally, MongoDB's geospatial capabilities come into play, allowing manufacturers to manage and analyze location-based data for efficient maintenance scheduling and resource allocation. The result is reduced downtime through more efficient maintenance and improved overall operational efficiency.
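Here is a small sketch of the geospatial piece, assuming a hypothetical assets collection whose documents carry a GeoJSON location field backed by a 2dsphere index; the asset names and coordinates are made up for illustration.

```python
# Sketch: find assets with overdue maintenance within 5 km of a field technician,
# so maintenance visits can be batched by location.
from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")  # placeholder URI
assets = client["industrial_metaverse"]["assets"]
assets.create_index([("location", GEOSPHERE)])  # 2dsphere index for GeoJSON queries

assets.insert_one({
    "asset_id": "press-42",
    "site": "Plant North",
    "maintenance_due": True,
    "location": {"type": "Point", "coordinates": [13.4050, 52.5200]},  # [longitude, latitude]
})

technician_position = [13.3889, 52.5170]
nearby_due = assets.find({
    "maintenance_due": True,
    "location": {
        "$near": {
            "$geometry": {"type": "Point", "coordinates": technician_position},
            "$maxDistance": 5_000,  # meters
        }
    },
})
for asset in nearby_due:
    print(asset["asset_id"], asset["site"])
```

Queries like this let a scheduler batch overdue maintenance by proximity to the technician who is already on site.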
From the digital twin to the metaverse with MongoDB

The advantages of a metaverse for manufacturers are enormous, and according to Deloitte, many executives are confident the industrial metaverse "will transform research and development, design, and innovation, and enable new product strategies." However, the realization is not easy for most companies. Challenges include managing system overload, handling vast amounts of data from physical assets, and creating accurate visualizations. The metaverse must also be easily adaptable to changes in the physical world, and new data from various sources must be continuously and seamlessly incorporated. Given these challenges, having a data platform that can contextualize all the data generated by the various systems and then feed it to the metaverse is crucial. That is where MongoDB Atlas, the leading developer data platform, comes in: it provides synchronization capabilities between the physical and virtual worlds, enables flexible data modeling, and gives access to the data via a unified query interface, as seen in Figure 1.

Figure 1: MongoDB connecting to a physical & virtual factory

Generative AI with Atlas Vector Search

With MongoDB Atlas, customers can combine three systems — database, search engine, and sync mechanisms — into one, delivering application search experiences for metaverse users 30% to 50% faster. Atlas powers use cases such as similarity search, recommendation engines, Q&A systems, dynamic personalization, and long-term memory for large language models (LLMs). Vector data is integrated with application data and seamlessly indexed for semantic queries, enabling customers to build faster and with less effort. MongoDB Atlas enables developers to store and access operational data and vector embeddings within a single unified platform. With Atlas Vector Search, users can pull the information needed for maintenance, training, and every other use case from whatever sources are accessible. This information can come from text files such as Word documents, from PDFs, and even from images or audio streams, from which an LLM then generates an accurate, semantically grounded answer. It is no longer necessary to keep dozens of engineers busy writing manuals that are already outdated by the time a production line is first commissioned.

Figure 2: Atlas Vector Search

Transforming the manufacturing industry with MongoDB

In the digital twin and metaverse-driven future of manufacturing, MongoDB emerges as a linchpin, enabling cost-effective virtual prototyping, enhancing simulation capabilities, and revolutionizing training processes. The marriage of MongoDB with AR and VR technologies creates a symbiotic relationship, fostering innovation and efficiency across design, training, and simulation. As the manufacturing industry continues its journey into the metaverse, the partnership between MongoDB and virtual technologies stands as a testament to the transformative power of digital integration in shaping the future of production. Learn more about how MongoDB is helping organizations innovate with the industrial metaverse by reading how to Build a Virtual Factory with MongoDB Atlas in 5 Simple Steps, how IIoT data can be integrated into MongoDB in 4 steps, or how MongoDB drives innovation end-to-end across the whole manufacturing chain.

March 12, 2024
Applied
