MongoDB Blog
10,000 SI Certifications and Counting: MongoDB and System Integrator Partners Reach Major Milestone Supporting GenAI Initiatives
Today, tens of thousands of customers trust MongoDB with their businesses’ most critical workloads, while the technologies and services of MongoDB’s system integrator (SI) partners play a critical role in their ability to innovate. Customers rely on global and regional SIs to help them get the most out of MongoDB and quickly build sophisticated applications.

In late 2022, as VP of global system integrator and consulting partners at MongoDB, I led the team that launched our System Integrators Certification Program to ensure our partners were equipped to help customers get the most out of MongoDB. The program is designed around two core certifications: an introduction to MongoDB, and a technical deep dive that qualifies participants to architect MongoDB for modern, AI-enriched applications. Since its inception, the program has quickly gathered steam, and today we are proud to say that we’ve issued over 10,000 certifications to associates and architects across MongoDB’s system integrator partners.

“Achieving 10K certifications is a big step, and I am truly thrilled that over 2,500 Capgemini consultants have participated in this program and achieved MongoDB SI certification over the last ten months,” said Eric Reich, global AI and data engineering head at Capgemini. “SI Certifications are helping our consultants keep abreast of the latest architectures on cloud and data modernization. More specifically, these certifications help MongoDB and Capgemini engage in joint engineering collaboration, building accelerators to support our customers in optimizing business outcomes from data- and AI-powered applications leveraging MongoDB.”

The certification program is an exclusive enablement offering for our SI partners and can be completed on-demand or via instructor-led training.
The rapid success and popularity of this certification program reinforces the value of working with MongoDB for our SI partners, the commitment of these partners to truly understand our product, and the desire of joint customers to do more with MongoDB.

“MongoDB is a key partner for Accenture, as its technology enables our clients in their modernization journey. MongoDB's importance to us continues to grow as more organizations are working with Accenture to become more efficient with their data platforms while leveraging their AI investments,” said Ram Ramalingam, senior managing director and global lead of platform engineering and intelligent edge at Accenture. “Skills building and certifications are a significant value to our partnership, as this enables my team to provide expertise to our clients for their modernization needs.”

In the near future, we plan to expand our partner certification offerings, including the addition of a new MongoDB Application Delivery Certification. This certification will include exclusive, partner-only online learning and hands-on labs that teach application delivery fundamentals and implementation best practices. It will equip our SI partners with the ability to architect MongoDB and deliver at scale. The team and I can’t wait to see what our customers and system integrators build next.

To learn more about MongoDB’s partners — who help boost developer productivity with a range of proven technology integrations — visit the MongoDB Partner Ecosystem. Current SI partners can register for the MongoDB Certification Program.
Collaboration for Breakfast: MongoDB and Partners Share AI Insights at AWS re:Invent
I’m old enough to remember when every tech conversation didn’t include the term “AI.” Hardly a day goes by without some mention of AI or generative AI (gen AI). But don’t just take my word for it: Google News search results for the phrase “generative AI” have grown more than 2000% since ChatGPT was launched in November 2022.

The AI excitement is more than just hype. For example, we’re seeing widespread adoption of AI across MongoDB’s tens of thousands of customers. Meanwhile, a recent GitHub survey showed that 92% of developers have already incorporated gen AI into their work, and Gartner predicts that by 2027, 90% of new applications will incorporate machine learning models or services.

MongoDB and our partners tapped into this excitement during AWS re:Invent. On November 29 — the same morning the company announced the integration of MongoDB Atlas Vector Search and Amazon Bedrock (and less than a week before Atlas Vector Search was made generally available) — MongoDB held an AI-themed breakfast that reinforced the importance of partnerships during this transformative time. During the breakfast, MongoDB product leaders sat down with leaders from four of the company’s partners — Gradient, LangChain, Nomic, and Unstructured — to share insights about building the next generation of AI applications. Despite its 7 a.m. start time, the breakfast was packed with attendees from a range of industries and geographies — no small feat given re:Invent’s busy schedule — and excitement for AI was palpable.

Given the broad interest in all things generative AI, organizations of all sizes want to learn how they can build the applications of tomorrow. This is where MongoDB and partners come in: MongoDB provides an integrated developer data platform that accelerates innovation by simplifying the application development process. To streamline AI innovation, MongoDB partners with organizations that offer complementary technology solutions, interoperability, flexibility, and reliability.
Partnering to deliver a complete AI toolkit
For example, Unstructured works with MongoDB to help organizations connect enterprise data stored in difficult formats like PDF and PNG to AI models. And the combination of MongoDB and LangChain's application framework makes it possible to build solutions that leverage proprietary company data. Meanwhile, with MongoDB Atlas Vector Search and Gradient, organizations can build, customize, and run private AI applications that leverage industry-expert large language models (LLMs) to enhance performance. And last but hardly least, Nomic's tools allow users to visualize the unstructured data they store in MongoDB, making AI more explainable and accessible. All told, each partner’s offerings work with MongoDB products to create a comprehensive set of tools with which developers can build AI applications.

At the breakfast, company leaders shared their thoughts on the current AI landscape, how their organizations collaborate with MongoDB, and what they see as the future of AI tools. “At AWS re:Invent, we showed how MongoDB is the best platform for building enterprise-ready generative AI apps,” said Andrew Davidson, senior vice president of product management at MongoDB. “Our powerful developer data platform — which works seamlessly with cutting-edge AI ecosystem partners to enable openly composable architecture and design — empowers developers to create compelling AI apps and experiences with greater interoperability, simplification, flexibility, and choice, pushing the boundaries of what's possible.”

For example, LangChain Founding Software Engineer Jacob Lee noted that “it’s so, so early for generative AI. Most attendees at re:Invent had only just begun to consider principles and use cases for the technology.
There is so much opportunity and potential impact yet to emerge that it will truly take the entire ecosystem's talents and creativity to explore it all.” “In short, the most important thing is to support each other and just keep building cool things,” said Lee.

Brian Raymond, founder and CEO of Unstructured, agreed that it's very early for generative AI. "We should start seeing incremental, yet exciting, gains in the performance of multimodal foundation models as well as increased focus on smaller models that are cheaper to run at scale," Raymond added. "It's likely going to take more time to mature the emerging foundation model stack (marked by retrieval-augmented generation) into a performant and cost-effective option for most organizations."

Creating a seamless AI development experience
Overall, the re:Invent breakfast conversation highlighted how MongoDB and its partners are working together to create a holistic, seamless AI development experience. By working closely with partner organizations to augment its industry-leading solutions, MongoDB ensures enterprises have access to everything they need in one place to develop cutting-edge, modern AI applications that are scalable, secure, and enterprise-grade.

“Gradient's mission is to democratize AI by making it more accessible to enterprises and developers,” said Chris Chang, CEO and co-founder of Gradient. “However, in AI, data itself can be challenging, which is why our partnership with MongoDB will allow users to make the most out of their data and leverage a best-of-breed technology to help power new AI features.”

To learn more about MongoDB’s artificial intelligence solutions — including resources to build next-generation applications — visit MongoDB for Artificial Intelligence. If your organization wants to build the next big thing in AI with MongoDB, consider applying for the MongoDB AI Innovators Program.
Navigating the Landscape of Artificial Intelligence: How Can the Financial Sector Make Use of Generative AI?
In the ever-evolving landscape of financial technology, the conversation around artificial intelligence (AI), and particularly generative AI, is gaining momentum. AI has been part of the financial landscape for decades, but the advances in generative AI bring greater benefits as well as new risks that financial institutions need to consider in such a regulated industry. While the potential benefits of generative AI are significant, and many institutions are still weighing adoption, a measured approach is needed when moving from proof of concept into production.

In an edition of the Fintech Finance News Virtual Arena, several notable industry thought leaders from HSBC, Capgemini, and MongoDB came together to explore how the financial sector can make use of generative AI and what financial institutions must consider in their AI strategy. Watch the panel discussion How can the financial sector make use of generative AI today with HSBC, MongoDB and Capgemini. Hear from:

EJ Achtner, Office of Applied Artificial Intelligence at HSBC
Dan Pears, Vice President, UK Practice Lead at Capgemini
Wei You Pan, Director, Financial Services Industry Solutions at MongoDB
Doug Mackenzie, Chief Content Officer at FF News

Addressing the challenges of generative AI
While financial technologists have always had to deal with persistent issues like risk management and governance, the adoption of generative AI in fintech brings challenges that AI specialists have long grappled with, like inherent biases and ethical concerns, into a new and highly regulated context. One challenge that stands out for generative AI is hallucination: the generation of content that is not accurate, factual, or reflective of the real world. AI models may produce information that sounds plausible but is entirely fictional. Generative AI models, especially in natural language processing, might generate text that is coherent and contextually appropriate but lacks factual accuracy.
This poses challenges in different domains, including misinformation and content reliability. Examples of such challenges or risks include:

Misleading financial planning advice: In financial advisory services, hallucinated information may result in misleading advice, leading to unexpected risks or missed opportunities.
Incorrect risk assessments for lending: Inaccurate risk profiles may lead to poor risk assessments for loan applicants, causing a financial institution to approve a loan at a higher risk of default than the firm would normally accept.
Sensitive information in generated text: When generating text, models may inadvertently include sensitive information from the training data. Adversaries can craft input prompts to coax the model into generating outputs that expose confidential details present in the training corpus.

It is thus paramount that financial institutions understand the technological impact, scale, and complexity associated with AI, and especially with a generative AI strategy. A strategic and comprehensive approach that encompasses technology, data, ethics, and organizational readiness is critical. Here are some key considerations for financial institutions adopting such a strategy:

Hallucination mitigation: Mitigating hallucination in generative AI is a challenging task, but several strategies and techniques can be employed to reduce the risk of generating inaccurate or misleading information. One promising strategy is retrieval-augmented generation (RAG), which incorporates information retrieval mechanisms into the generation process so that generated content is grounded in real-world knowledge. Vector search is a popular mechanism for implementing the RAG architecture: it retrieves the documents most relevant to the input query.
It then provides these retrieved documents as context to the large language model (LLM) to help generate a more informed and accurate response.
Data quality and availability: Take a step back before adopting AI to ensure that the data used for AI training and decision-making is high-quality, relevant, and accurate, and that it can be accessed in real time.
Education: Investing in training programs is key to addressing the skills gap in AI, ensuring the workforce is equipped to manage, interpret, and collaborate with AI technologies. For the adoption of AI to be successful, a culture of learning and development is vital, giving employees the tools they need for their personal and professional development. Furthermore, promoting awareness of potential vulnerabilities and continuously refining models to improve their resilience against hallucination, biases, adversarial manipulation, and other weaknesses is essential to success in generative AI applications.
Develop new governance, frameworks, and controls: Before going live, create safe and secure environments for testing and learning that allow you to fail fast in a safe manner. Moving headfirst into production with direct customer contact can result in the wrong governance methods being implemented.
Monitoring and continuous improvement: Implement robust monitoring systems to measure and understand the financial impacts, change impacts, scale, and complexity associated with the adoption of AI.
Scalability and integration: Design AI systems with scalability in mind to accommodate growing datasets and evolving requirements.
Security and privacy: Implement robust cybersecurity measures to safeguard AI models and the data they rely on. Techniques such as adversarial training, input sanitization, and privacy-preserving mechanisms can help mitigate the risk of generative AI inadvertently revealing private data.
Incident response plans should be part of the cybersecurity measures, as should regular education of the relevant stakeholders on security and privacy.

How MongoDB can help you overcome your data challenges
When adopting advanced technologies like AI and ML, organizations often grapple with integrating these innovations into legacy systems, particularly for use cases such as fraud prevention, where the platform must integrate with external sources to perform accurate analysis on complete data. The inflexibility of existing systems is a significant pain point, hindering the seamless incorporation of cutting-edge technologies. MongoDB, serving as an operational data store (ODS) with a flexible document model, enables financial institutions to efficiently handle large volumes of data in real time. By integrating MongoDB with AI/ML platforms, businesses can develop models trained on the most accurate and up-to-date data, addressing the critical need for adaptability and agility in the face of evolving technologies.

Legacy systems, marked by their inflexibility and resistance to modification, present another challenge in the pursuit of leveraging AI to enhance customer experiences and improve operational efficiency. Integration struggles also persist, especially in the financial sector, where the uncertainty of evolving AI models over time requires a scalable infrastructure. MongoDB's developer data platform future-proofs businesses with a flexible schema capable of accommodating any data structure, format, or source. This flexibility facilitates seamless integration with different AI/ML platforms, allowing financial institutions to adapt to changes in the AI landscape without extensive modifications to their infrastructure. Concerns about the security of customer data, especially when shared with third parties through APIs, further complicate the adoption of innovative AI technologies.
Legacy systems can also stand in the way of innovation because they are often more vulnerable to security threats due to outdated security measures. MongoDB’s modern developer data platform addresses these challenges with built-in security controls across all data. Whether managed in a customer environment or through MongoDB Atlas, a fully managed cloud service, MongoDB ensures robust security with features such as authentication (single sign-on and multi-factor authentication), role-based access controls, and comprehensive data encryption. These security measures safeguard sensitive financial data, mitigating the risk of unauthorized access by external parties and giving organizations the confidence to embrace AI and ML technologies.

If you would like to discover more about building AI-enriched applications with MongoDB, take a look at the following resources:

Mitigate hallucination of generative AI by using RAG with Atlas Vector Search, LangChain, and OpenAI
Deliver AI-enriched apps with the right security controls in place, and at the scale and performance users expect
Sign up for our Atlas for Industries programme to get access to our solution accelerators and drive innovation
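To make the RAG retrieval step described above concrete, here is a minimal sketch of an Atlas Vector Search aggregation pipeline, built as plain Python data structures. The field and index names (`embedding`, `text`, `vector_index`) are illustrative assumptions rather than names used in this article:

```python
def build_rag_retrieval_pipeline(query_vector, index_name="vector_index", limit=5):
    """Aggregation pipeline that retrieves the stored documents most similar to
    the query embedding; their text is then passed to an LLM as grounding context."""
    return [
        {
            "$vectorSearch": {
                "index": index_name,          # name of the Atlas Vector Search index
                "path": "embedding",          # field holding the stored vectors
                "queryVector": query_vector,  # embedding of the user's question
                "numCandidates": limit * 20,  # candidates scanned before final ranking
                "limit": limit,               # documents returned as context
            }
        },
        # Keep only the text and similarity score needed for prompt construction
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

pipeline = build_rag_retrieval_pipeline([0.12, -0.08, 0.33], limit=3)
```

Against a live Atlas cluster, the pipeline would be executed with `collection.aggregate(pipeline)` and the returned `text` fields concatenated into the LLM prompt as grounding context.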
Evolve Your Data Models as You Modernize with Hackolade and Relational Migrator
Application modernization is a constant. For many developers and database administrators striving to incorporate emerging use cases like generative AI, search, and edge devices into their customer experience, it has become glaringly apparent that the legacy relational databases that have served their apps well to this point are no longer as easy and fast to work with. While many are turning to MongoDB Atlas for its flexible document model and wide range of integrated data services, migrations are often seen as daunting projects.

MongoDB Relational Migrator simplifies several of the key tasks required to successfully migrate from today's popular relational databases to MongoDB. With Relational Migrator, teams can design their target MongoDB schema using their existing relational one as a blueprint, migrate their data to MongoDB while transforming it to the newly designed schema, and get a head start on app code modernization through code template generation and query conversion.

But as organizations scale their MongoDB footprint through migrations and new app launches, a new challenge emerges: managing and evolving data models across more teams and stakeholders. Sooner or later, modernization becomes as much about change management as it does technology — keeping teams aligned is critical for keeping everyone moving forward. This is where Hackolade comes in. Hackolade Studio is a visual data modeling and schema design application that enables developers to design and document their MongoDB data models and, more importantly, use those entity-relationship diagrams (ERDs) to collaborate with their counterparts in other areas of the business, like database administration, architecture, and product management.

MongoDB data model in Hackolade Studio

No database is an island, and the teams working with MongoDB cannot afford to work in isolation.
With Hackolade Studio, database teams can use these ERDs to communicate their point of view to others, making hand-offs and handshakes with other teams, like operations, more seamless, driving developer productivity, and accelerating new feature builds.

Jump from Relational Migrator to Hackolade Studio with ease
Hackolade Studio now makes it even easier for teams to transition to its application after completing their migrations with MongoDB Relational Migrator. Teams can use Hackolade Studio’s reverse-engineering feature to import their Relational Migrator project (.relmig) files, bringing their MongoDB schema directly into Hackolade Studio. With this integration, teams can start with Relational Migrator to build their initial schema and execute their data migration, then transition to Hackolade Studio to document, manage, and evolve their schema going forward, giving them the greater degree of control, visibility, and collaboration needed to support modernization initiatives that span many migrations across several applications, teams, and legacy relational environments.

MongoDB Relational Migrator, showing a relational schema on the left and its transformed MongoDB schema on the right

Getting started is incredibly easy. First, you’ll need your Relational Migrator project file, which can be exported from Relational Migrator to your local device. Then, in Hackolade Studio, use the reverse-engineering workflow to import your .relmig file into a new or existing data model. For a detailed walkthrough, dive into Hackolade’s documentation for this integration.

Importing Relational Migrator files in Hackolade Studio

As MongoDB adoption grows within your organization, more apps and more teams will need to interact with your MongoDB data models.
With Relational Migrator and Hackolade together, you will have the tools at your disposal to not only kickstart migration projects but also manage MongoDB data models at scale, giving your teams the insights and visibility needed to drive performance and guide app modernization initiatives. Learn more about how Hackolade can maximize developer productivity and support your MongoDB modernization initiatives. Download MongoDB Relational Migrator for free to get started with migrating your first databases.
A Discussion with VISO TRUST: Expanding Atlas Vector Search to Provide Better-Informed Risk Decisions
We recently caught up with the team at VISO TRUST to check in and learn more about their use of MongoDB and their evolving search needs (if you missed our first story, read more about VISO TRUST’s AI use cases with MongoDB on our first blog). VISO TRUST is an AI-powered third-party cyber risk and trust platform that enables any company to access actionable vendor security information in minutes. VISO TRUST delivers the fast and accurate intelligence needed to make informed cybersecurity risk decisions at scale for companies at any maturity level.

Since our last discussion back in September 2023, VISO TRUST has adopted our new dedicated Search Nodes architecture and has scaled up both dense and sparse embeddings and retrieval to improve the user experience for their customers. We sat down for a deeper dive with Pierce Lamb, Senior Software Engineer on the Data and Machine Learning team at VISO TRUST, to hear more about the latest exciting updates. Check out our AI resource page to learn more about building AI-powered apps with MongoDB.

How have things been evolving at VISO TRUST? What are some of the new things you're excited about since we spoke last?
There have definitely been some exciting developments since we last spoke. We’ve implemented a new technique for extracting information out of PDF and image files that is much more accurate and breaks extractions into clear semantic units: sentences, paragraphs, and table rows. This might sound simple, but correctly extracting semantic units out of these PDF files is not an easy task by any means. We tested the entire Python ecosystem of PDF extraction libraries, cloud-based OCR services, and more, and settled on what we believe is currently state-of-the-art. For a retrieval-augmented generation (RAG) system, which includes vector search, the accuracy of data extraction is the foundation on which everything else rests.
Improving this process is a big win and will continue to be a mainstay of our focus. Last time we spoke, I mentioned that we were using MongoDB Atlas Vector Search to power a dense retrieval system and that we had plans to build a re-ranking architecture. Since then, I’m happy to confirm we have achieved this goal. In our intelligent question-answering service, every time a question is asked, our re-ranking architecture applies four levels of ranking and scoring to a set of possible contexts in a matter of seconds, which large language models (LLMs) then use to answer the question.

One additional exciting announcement is that we’re now using MongoDB Atlas Search Nodes, which allow workload isolation when scaling search independently from our database. Previously, we were upgrading our entire database instance solely because our search needs were changing so rapidly (though our database needs were not). Now we are able to closely tune our search workloads to specific nodes and allow our database needs to change at a much different pace. As an example, retraining is much easier to track and tune with search nodes that can fit the entire Atlas Search index in memory (which has significant latency implications).

As many have echoed recently, our usage of LLMs has not reduced or eliminated our use of discriminative model inference but rather increased it. As the database that powers our ML tools, MongoDB has become the place we store and retrieve training data, which is a big performance improvement over AWS S3. We continue to use more and more model inference to perform tasks like classification that the in-context learning of LLMs cannot beat. We let LLMs stick to the use cases they are really good at, like dealing with imperfect human language and providing labeled training data for discriminative models.

VISO TRUST's AI Q&A feature being asked a security question

You mentioned the recent adoption of Search Nodes. What impacts have you seen so far, especially given your existing usage of Atlas Vector Search?
We were excited when we heard the announcement of Search Nodes in general availability, as the offering solves an acute pain point we’d been experiencing. MongoDB started as the place where our machine learning and data team backed up and stored training data generated by our Document Intelligence Pipeline. When the requirements to build a generative AI product became clear, we were thrilled to see that MongoDB had a vector search offering, because all of our document metadata already existed in Atlas. We were able to experiment with, deploy, and grow our generative AI product right on top of MongoDB.

Our deployment, however, was now serving multiple use cases: backing up and storing data created by our pipeline, and servicing our vector search needs. The latter forced us to scale the entire deployment multiple times when our original MongoDB use case didn’t require it. Atlas Search Nodes enable us to decouple these two use cases and scale them independently. It was incredibly easy to deploy our search data to Atlas Search Nodes, requiring only a few button clicks. Furthermore, the memory requirements of vector search can now match our Atlas Search Node deployment exactly; we do not need to account for extra memory for our storage and backup use case. This is a crucial consideration for keeping vector search fast and streamlined.

Can you go into a bit more detail on how your use cases have evolved with Vector Search, especially as it relates to dense and sparse embeddings and retrieval?
We provide a Q&A system that allows clients to ask questions of the security documents they or their vendors upload. For example, if a client wanted to know what one of their vendor’s password policies is, they could ask the system that question and get an answer with cited evidence without needing to look through the documents themselves.
The same system can be used to automatically answer third-party security questionnaires our clients receive, by parsing the questions out of them and answering those questions using data from our clients’ documents. This saves a lot of time, because answering security questionnaires can often take weeks and involve multiple departments.

The system relies on three main collections, separated by the semantic units mentioned above: paragraphs, sentences, and table rows. These are extracted from various security compliance documents uploaded to the VISO TRUST platform (things like SOC 2 reports, ISO certifications, and security policies, among others). Each sentence has a field with an ObjectId that links to the corresponding paragraph or table row for easy look-up. To give a sense of size, the sentences collection is on the order of tens of millions of documents and growing every day.

When a question request enters the re-ranking system, sparse retrieval (keyword-based similarity search) is performed, followed by dense retrieval, using a list of IDs passed with the request to filter down the set of possible documents the context can come from. This filtering generally narrows the scope from tens of millions of sentences to tens or hundreds of thousands. Sparse and dense retrieval each independently score and rank those sentences and return their top one hundred in a matter of milliseconds to seconds. The two result sets are then merged into a final set of one hundred, favoring dense results unless a sparse result meets certain thresholds. At this point, we have a set of one hundred sentences, scored and ranked by similarity to the question using two different methods powered by Atlas Search, in milliseconds to seconds. In parallel, we pass those hundred sentences to a multi-representational model and a cross-encoder model to provide their own scoring and ranking of each sentence.
Once complete, we have four independent levels of scoring and ranking for each sentence (sparse, dense, multi-representational, and cross-encoder). This data is passed to a Weighted Reciprocal Rank Fusion algorithm, which uses the four independent rankings to create a final ranking and sort order, returning the number of results requested by the caller.

How are you measuring the impact or relative success of your retrieval efforts?
The monolithic collections I spoke about above grow substantially every day; we’ve almost tripled our sentence volume since first bringing data into MongoDB, while still maintaining the same low latency our users depend on. We needed a vector database partner that allowed us to easily scale as our datasets grow and continue to deliver millisecond-to-second performance on similarity searches. Our system can often have many in-flight question requests occurring in parallel, and Atlas has allowed us to scale with the click of a button when we start to hit performance limits. One piece of advice I would give to readers creating a RAG system using MongoDB’s Vector Search is to use read preferences to ensure that retrieval queries and other reads occur primarily on secondary nodes. We use ReadPreference.secondaryPreferred almost everywhere, and this has helped substantially with the load on the system.

Lastly, can you describe how MongoDB helps you execute on your goal of enabling better-informed risk decisions?
As most people involved in compliance, auditing, and risk assessment efforts will tell you, these essential tasks tend to significantly slow down business transactions. This is in part because the need for perfect accuracy is extremely high, and in part because these tasks tend to be human-reliant and slow to adopt new technology. At VISO TRUST, we are committed to delivering that same level of accuracy, but much faster.
Since 2017, we have been executing on that vision, and our generative AI products represent a leap forward in enabling our clients to assess and mitigate risk at a faster pace with increased levels of accuracy. MongoDB has been a key partner in the success of our generative AI products by becoming the reliable place we can store and query the data for our AI-based results. Getting started Thanks so much to Pierce Lamb for sharing details on VISO TRUST’s AI-powered applications and experiences with MongoDB. To learn more about MongoDB Atlas Search, check out our learning byte, or if you’re ready to get started, head over to the product page to explore tutorials, documentation, and whitepapers. You’ll just be a few clicks away from spinning up your own vector search engine where you can experiment with the power of vector embeddings, RAG, and more!
Integrate OPC UA With MongoDB - A Feasibility Study With Codelitt
Open Platform Communications Unified Architecture (OPC UA) is a widely recognized and important communication standard for Industry 4.0 and industrial IoT. It enables interoperability across different machines and equipment, ensuring reliable and secure information sharing within the Operational Technology (OT) layer. By providing a standard framework for communication, OPC UA enhances the integrity, security, and accessibility of data, enabling many use cases for Industry 4.0. OPC UA focuses on standard data transmission and information modeling. It uses multiple data encoding methods, such as binary or JavaScript Object Notation (JSON), and leverages different levels of security encryption to address security concerns. For information modeling, it adopts an object-oriented approach to abstract and model specific industrial assets such as robots, machines, and processes. Rich data models and object types can be created to describe machine attributes and composition. Using OPC UA, traditional context-less time-series machine data is transformed into a semantic-based information model. MongoDB's document model offers a straightforward and compelling approach for storing OPC UA semantic information models due to its flexibility and compatibility with complex data structures. The document model is a superset of all other types of data models, which makes it very popular in the developer community. OPC UA information models contain detailed relationships and hierarchies, making the dynamic schema of MongoDB a natural fit. Fields in the document are extensible at run time, making dynamic updates and efficient querying a breeze. For example, consider an OPC UA information model representing an industrial robot. This model will encompass information about the robot's status, current task, operational parameters, and maintenance history.
Example OPC UA information model for an industrial robot:

Robot
  RobotName (Variable)
  Status (Variable)
  CurrentTask (Variable)
  OperationalParameters (Object)
    MaxSpeed (Variable)
    PayloadCapacity (Variable)
    Reach (Variable)
  MaintenanceHistory (Array of Objects)
    Timestamp (Variable)
    Description (Variable)

With MongoDB, this model can be easily represented in a document with nested fields:

{
  "_id": ObjectId("654321ab12345abcd6789"),
  "RobotName": "Robot1",
  "Status": "Running",
  "CurrentTask": "Assembling Component ABC",
  "OperationalParameters": {
    "MaxSpeed": 80,          // in cm/s
    "PayloadCapacity": 150,  // in kg
    "Reach": 2.65            // in m
  },
  "MaintenanceHistory": [
    { "Timestamp": "2023-08-25T10:00:00", "Description": "Routine checkup" },
    { "Timestamp": "2023-06-25T14:30:00", "Description": "Replaced worn-out gripper" }
  ]
}

This MongoDB document easily captures the complexities of the OPC UA information model. Hierarchical attributes in the model are maintained as objects, and arrays can represent historical data and log files. As the robot runs during the production shift, the document can be easily updated with real-time status information. Instead of worrying about creating a complicated entity-relationship diagram as with SQL databases, MongoDB offers a superior alternative for representing digital shadows of industrial equipment. Now that we have seen how easy it is to model OPC UA data in MongoDB, let's talk about how to connect an OPC UA server to MongoDB. One of our partners, Codelitt, is developing a connector that can ingest time-series OPC UA data into MongoDB in real time. Codelitt is a custom software strategy, engineering, and design company. The architecture of the end-to-end solution is shown in Figure 1. Figure 1: High-level architecture and data flow In Figure 1: Industrial equipment and controllers will transmit data to local servers using the OPC UA protocol. OPC UA servers will listen to these devices and broadcast them to all subscribed clients.
Clients will listen to specific events/variables and queue each event to be stored. The message broker will provide the queuing system to absorb a large amount of data and provide reliability between the event source and the data store. MongoDB Atlas will provide the final destination of the data, along with the ability to do analytics using the aggregation framework and visualization using Atlas Charts. Technical details It is assumed that the user already has machines with an OPC UA server enabled. For the OPC UA client, depending on the client's preferences, the Codelitt solution can switch between a custom-built OPC UA client based on the Node-OPCUA open source project, AWS IoT SiteWise Edge, or a Confluent-based OPC UA source connector. In the case of a custom-built client, it connects to the machine's OPC UA server using OPC TCP and extracts the necessary data, which is then transmitted to a broker. The message broker could be any cloud-provided solution (Azure Event Hubs, Amazon Kinesis, etc.) or any Kafka implementation, from Confluent for example. In the case of Kafka, the MongoDB Kafka connector can be leveraged to push data to the database. Finally, leveraging the aggregation framework, the operating parameters of each device are queried for visualization via MongoDB Atlas Charts. In summary, the MongoDB document model elegantly mirrors OPC UA information models, and there are multiple options available to users who would like to push data from their OPC UA servers to MongoDB. To learn more about MongoDB’s role in the manufacturing sector, please visit our manufacturing webpage. To learn more about how Codelitt is digitally transforming industries, please visit their website.
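As an illustration of the analytics step, an aggregation pipeline over robot documents shaped like the example above might look like the following. Field and collection names mirror the earlier example document; in a real deployment this pipeline would be passed to something like `db.robots.aggregate(pipeline)` via a driver such as PyMongo, which is assumed here rather than shown.

```python
# A sketch of an aggregation pipeline for charting robot operating
# parameters. The stages below are standard MongoDB aggregation
# stages; field names follow the example robot document.

pipeline = [
    # Keep only robots that are currently running.
    {"$match": {"Status": "Running"}},
    # Project the operating parameters a dashboard might plot.
    {"$project": {
        "RobotName": 1,
        "MaxSpeed": "$OperationalParameters.MaxSpeed",
        "PayloadCapacity": "$OperationalParameters.PayloadCapacity",
        "MaintenanceEvents": {"$size": "$MaintenanceHistory"},
    }},
    # Sort fastest robots first.
    {"$sort": {"MaxSpeed": -1}},
]
```

Because the pipeline is just a list of plain dictionaries, it can be built, tested, and versioned in application code before it ever touches the database.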
MongoDB Named a Leader in the 2023 Gartner® Magic Quadrant™ for Cloud Database Management Systems
MongoDB is proud to be named a Leader in the 2023 Gartner® Magic Quadrant for Cloud Database Management Systems (CDBMS). We believe this makes MongoDB the only dedicated application database provider recognized as a Leader for two years running. At MongoDB, our focus on serving the needs of the community and adopters of the developer data platform drives our innovation and product ethos. This year it has led to an unprecedented level of new features and capabilities. Now more than ever, senior executives and decision makers want to exploit data and AI to transform their organizations. Yet most enterprises struggle in this endeavor because their database management systems are so complex that they inhibit developers from achieving strategic and critical objectives rather than aiding them. The foundational concept behind MongoDB’s developer data platform has been to overcome the most critical challenges around enterprise data, such as the unification of data across multiple applications, data synchronization, complexity with multitudes of workload-specific tools and technologies, vendor lock-in for proprietary formats, poor interoperability, and duplicate efforts across cloud platforms. The most recent innovations on our platform simplify your operational, transactional, and AI-powered workloads and maintain the kind of flexibility and openness that allows your organization to stay agile as you scale with cost-efficiency, transparency, and security in mind. “It’s not like there aren’t other phones or other cars, but how it’s all packaged together in a way that’s so compelling and so user-friendly and enables people to do what they want to do, is essentially what differentiates MongoDB. There’s other [tools] out there, but it’s very clunky for a developer to connect their vector data to their metadata, to their core data, to be able to orchestrate all that, then figure out how to do the embeddings and all that. It just becomes a very convoluted process.”
1 Dev Ittycheria, MongoDB CEO - November 2023 “Gartner defines the market for cloud database management systems (DBMSs) as the market for software products that store and manipulate data and that are primarily delivered as software as a service (SaaS) in the cloud. Cloud DBMSs may optionally be capable of running on-premises, or in hybrid, multicloud, or intercloud configurations.” To help users understand this emerging technology landscape, Gartner published its first Magic Quadrant for Cloud Database Management Systems back in 2020. Three years on, after evolving its criteria to reflect changing enterprise needs, Gartner has named MongoDB a Leader for the second consecutive year in the latest 2023 Magic Quadrant. Source: Gartner We believe MongoDB was named a Leader in this report partly because of new capabilities such as Vector Search and Queryable Encryption, which have been brought to market to meet the ever-increasing demand for new applications and workloads. In addition to listening to developers’ and their organizations’ needs for new capabilities, MongoDB’s product teams have also been obsessed with ensuring new functionality does not equate to increased complexity, nor forsake the tenet of intuitive ease of use. Capabilities such as the Stable API ensure that applications are future-proofed to work with subsequent MongoDB releases, something unique in the market today. And Relational Migrator enables new adopters to migrate their existing applications and workloads in a prescriptive, guided manner that encourages best practices and sound architectural design. A tailored holistic approach - people, process, and technology Digital transformation and change management can be an arduous and daunting prospect, particularly in the absence of expertise and guidance from those who have walked the same path previously.
While technology is an enabler, such initiatives will fail without due consideration for the appropriate skills and promotion of tried-and-tested best practices. At MongoDB, this is an intrinsic element of how we embrace and welcome new adopters of our developer data platform — a de facto standard for document database-oriented application development. In 2023, we launched two new programs to ensure developers and their organizations new to MongoDB are set up for success. The first was Atlas for Industries, designed to help organizations accelerate cloud adoption and modernization by utilizing industry-specific expertise, programs, partnerships, and integrated solutions. No fewer than seven vertical-specific solutions were launched in 2023, covering Atlas for Financial Services, Government, Manufacturing and Automotive, Retail, Healthcare, Insurance, and Telecommunications. In each instance, organizations and their teams gain access to expert-led architectural design reviews, partner integrations and toolchains, innovation workshops, professional services teams, and dedicated training to ensure the relevant knowledge is harnessed and proliferates throughout the entire organization. The second initiative was the MongoDB AI Innovators Program, which provides organizations building AI-powered applications and technology with access to expert technical support and guidance, MongoDB Atlas credits, collaboration opportunities via our partner ecosystem, and joint go-to-market activities with MongoDB to accelerate innovation and time to market. The program consists of two tracks: one for early-stage ventures and another for more established companies. Both provide opportunities to join a community of founders and developers while gaining access to dedicated resources, training, and counsel from subject matter experts.
Realizing the value of MongoDB The success of those taking advantage of MongoDB’s developer data platform is abundant, as evidenced by its adopters among the most successful enterprises across all industries: 64 of the Fortune 100 192 of the Fortune 500 457 of the Fortune 2000 The community of MongoDB users has also spoken on Gartner Peer Insights™: 94% of MongoDB users who provided reviews on the Gartner Peer Insights platform said they would recommend us 2 (based on 49 ratings with an average of 4.6 out of 5 stars in the last 12 months, as of 15 January, 2023). Source: Gartner Peer Insights An ecosystem of interoperability Having a well-architected data platform is foundational; however, working in a vacuum is another potential pitfall. Every organization, new or established, has infrastructure that must integrate and work together. At MongoDB, we realize it is critical that your application database works as a well-oiled component alongside the rest of your tooling ecosystem. It is imperative that technology partners work seamlessly with one another in a coordinated and consistent manner. For this reason, MongoDB has cultivated a partner ecosystem: a collaborative approach spanning innovative, bleeding-edge pioneers of genAI, established and ubiquitous cloud providers, and more than 1,000 additional technology and service integration partners. The value of system integrators (SIs) is something MongoDB does not underestimate. In the last 12 months, MongoDB has certified over 10,000 SI Associates and Architects. These certifications enable our SI partners to accelerate our clients’ AI journeys by promoting sound design approaches, ensuring best practices are followed, and expediting delivery. Data accessibility is a critical element, and our certification programs rapidly equip SIs to help organizations meet these challenging demands. Quantity is one thing, but what of the quality of these partnerships?
It’s worth noting that MongoDB is the only independent software vendor (ISV) featured in each of the top four cloud providers’ consoles and startup programs. Our efforts have been recognised not only by the leading analyst firms but also by our top-tier technology partners, as evidenced by the awards received in 2023 alone: AWS EMEA Marketplace Partner of the Year AWS ISV Partner of the Year for ASEAN AWS ISV Partner of the Year for Chile AWS ISV Partner of the Year for Taiwan Google Cloud Technology Partner of the Year for Data Management Google Cloud Data — Smart Analytics Technology Partner of the Year MongoDB also signed a strategic partnership with Microsoft in 2023. This followed a four-year extension to a strategic global partnership with Alibaba. MongoDB has also worked closely with AWS to enhance the value offered via the AI-powered AWS CodeWhisperer and Bedrock collaborative efforts. “The importance of a unified ecosystem to enable generative AI use cases on the same stack that supports traditional transactional and analytical use cases is critical to ensure simplification, lower latency, and cost containment.” Sanjeev Mohan, SanjMo Principal and former Gartner Research VP for Big Data and Advanced Analytics In his Spotlight document 3, Mohan lauds MongoDB for its extensive integration with the ever-expanding ecosystem of large language model (LLM) providers as well as ISVs developing and deploying across the AI value chain. Evaluating a Leader in the Magic Quadrant for Cloud Database Management Systems “Which are the competing players in the major technology markets? How are they positioned to help you over the long haul? A Gartner Magic Quadrant is a culmination of research in a specific market, giving you a wide-angle view of the relative positions of the market’s competitors.
By applying a graphical treatment and a uniform set of evaluation criteria, a Magic Quadrant helps you quickly ascertain how well technology providers are executing their stated visions and how well they are performing against Gartner’s market view." 4 MongoDB’s take: Given that it encompasses technical, commercial, operational, and strategic considerations, Gartner’s evaluation of 19 cloud DBMS vendors against 15 criteria (seven on execution and eight on vision) is comprehensive and extensive. Our market presence, above-market growth rates, and establishment as a nonrelational database standard have been consistently highlighted as strengths by several analyst firms and research organizations prior to the publication of this report 5. It is our opinion that our placement as a Leader in this Magic Quadrant is a reflection of these trends as well as our holistic approach to supporting the MongoDB community. Visit the Gartner website to obtain the full report (requires a Gartner subscription) 6. Getting started on your cloud journey MongoDB Atlas is engineered to help you make the shift to the cloud. We would appreciate the opportunity to understand more about your key transformation initiatives and workshop ideas with your teams to accelerate delivery. In the interim, your engineers and developers can familiarize themselves with MongoDB right away by signing up for a free account on MongoDB Atlas. Have them create a free database cluster, load your own data or our sample data sets, and explore what’s possible within the platform. Following the launch of our enhanced MongoDB University program in November 2022, in June 2023 MongoDB partnered with digital learning providers such as Coursera and LinkedIn Learning to make our software development courses available to more students worldwide. Additionally, the MongoDB Developer Center hosts an array of resources including tutorials, sample code, videos, and documentation.
1 Source: SiliconANGLE article, November 2023: “ MongoDB looks to enable developers to do more sophisticated analytics”, Ryan Stevens 2 Gartner® and Peer Insights™ are trademarks of Gartner, Inc. and/or its affiliates. All rights reserved. Gartner Peer Insights content consists of the opinions of individual end users based on their own experiences, and should not be construed as statements of fact, nor do they represent the views of Gartner or its affiliates. Gartner does not endorse any vendor, product or service depicted in this content nor makes any warranties, expressed or implied, with respect to this content, about its accuracy or completeness, including any warranties of merchantability or fitness for a particular purpose. 3 Source: SanjMo article, December 2023: “ Spotlight: Evaluating MongoDB Atlas Vector Search ”, Sanjeev Mohan 4 Source: Gartner, January 2023: “ Positioning technology players within a specific market ” 5 Source: Forrester Wave: Translytical Data Platforms, Q4 2022 , ISG (Ventana Research) Operational Data Platforms 2023 Value Index and DB-Engines 6 Gartner and Magic Quadrant are registered trademarks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. The Gartner logo is a trademark and service mark of Gartner, Inc., and/or its affiliates, and is used herein with permission. All rights reserved.
Introducing the Full Stack FastAPI App Generator for Python Developers
We are thrilled to announce the release of the Full Stack FastAPI, React, MongoDB (FARM) base application generator, coinciding with FastAPI's emerging status as a leading modern Python framework. Known for its high performance and ease of use, FastAPI is quickly becoming a top choice for Python developers. This launch is a significant advancement for Python developers eager to build and maintain progressive web applications using the powerful combination of FastAPI and MongoDB. Bridging the Development Gap While it's always been quick and easy to start building modern web applications with MongoDB and FastAPI, in the past developers still had to make many decisions about other parts of the stack, such as authentication, testing, and integration, and then manually wire these components into their application. Our new app generator aims to further simplify this process. It enables you to quickly spin up a production-grade, full-stack application, seamlessly integrating FastAPI and MongoDB, thereby significantly enhancing the developer experience. Simplifying the Development Journey Now, with the launch of our Full Stack FastAPI App Generator, MongoDB dramatically simplifies the initial stages of project setup for production-grade applications by providing a well-structured app skeleton, reducing the learning curve and the time spent setting up the project. For new learners and seasoned developers alike, this means less time figuring out the basics and more time building differentiated experiences for their users. Key Features Included in the App Generator: Complete Web Application Stack: Generates a foundation for your project development, integrating both front-end and back-end components. Docker Compose Integration: Optimized for local development. Built-in Authentication System: Includes user management schemas, models, CRUD, and APIs, with OAuth2 JWT token support and magic link authentication.
FastAPI Backend Features: MongoDB Motor for database operations. MongoDB ODMantic for ODM creation. Common CRUD support via generic inheritance. Standards-based architecture, fully compatible with OpenAPI and JSON Schema. Next.js/React Frontend: Middleware authorization for page access control. Form validation using React Hook Form. State management with Redux. CSS and templates with TailwindCSS, HeroIcons, and HeadlessUI. Operational and Monitoring Tools: Includes Celery for task management, Flower for job monitoring, Traefik for seamless load balancing and HTTPS certificate automation, and GitHub Actions for comprehensive front-end and back-end testing. Start now Accelerate your web application development with MongoDB and FastAPI today. Visit our GitHub repository for the app generator and start transforming your web development experience.
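The “common CRUD support via generic inheritance” feature listed above can be sketched as a generic base class that concrete models reuse. In the actual generator this is backed by Motor/ODMantic against MongoDB; here an in-memory dict stands in for the collection so only the shape of the pattern is shown. All names below are illustrative, not the generator’s real API.

```python
# Generic CRUD base class: concrete models inherit create/get/delete
# instead of re-implementing them. A dict stands in for a collection.

from dataclasses import dataclass
from typing import Dict, Generic, Optional, TypeVar
from uuid import uuid4

ModelT = TypeVar("ModelT")

class CRUDBase(Generic[ModelT]):
    def __init__(self) -> None:
        self._store: Dict[str, ModelT] = {}  # stand-in for a collection

    def create(self, obj: ModelT) -> str:
        doc_id = str(uuid4())
        self._store[doc_id] = obj
        return doc_id

    def get(self, doc_id: str) -> Optional[ModelT]:
        return self._store.get(doc_id)

    def delete(self, doc_id: str) -> bool:
        return self._store.pop(doc_id, None) is not None

@dataclass
class User:
    email: str
    full_name: str = ""

class CRUDUser(CRUDBase[User]):
    """Inherits create/get/delete; add user-specific queries here."""

users = CRUDUser()
uid = users.create(User(email="ada@example.com"))
```

Swapping the dict for an async Motor collection is mostly mechanical: the method bodies change, but every model keeps the same inherited interface.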
Panel: How MongoDB is Helping Drive Innovation for Indian Organisations
At MongoDB.local Delhi a panel of CXOs and IT leaders discussed the strategies and challenges of using software to drive innovation in their organisations. Here are the key lessons they shared. Fostering innovation: tips and challenges Our panel, which included representatives from Appy Pie, Autodit, Formidium, and Tech Mahindra, agreed that the rapid development of data analytics technology, and the scarcity of trained talent on the ground, were key challenges when it comes to driving innovation. To stay on top, Tech Mahindra has a dedicated talent acquisition engine to keep tabs on those technologies, and customer requirements. “We imbibe these learnings, so we’re equipped to deliver solutions on the ground,” explained Shuchi Agrawal, Global Head for Data Analytics, Pre-Sales, and Solutioning at Tech Mahindra (IT services and consulting). “When I think of data and MongoDB, I think MongoDB BIRT (business intelligence reporting tools),” says Shuchi. To accelerate their customers’ journey of transformation, for Tech Mahindra, automation must be based on innovations on the baseline of analytics workloads, i.e. data. “That’s why MongoDB is one of the key elements in most of our solution designs when we’re looking for some of the advanced analytics workloads,” says Shuchi. Choosing technology and evaluating products For Vaibhav Agrawal, Executive Vice President of Formidium, selecting technology to drive innovation comes with key caveats: it must be easy to implement, the talent must exist in the market to do the implementation, zero-trust is essential — as data security is paramount for customers — and it must perform in terms of scalability, efficiency, optimization, and monitoring. “If those things are there, you never have to go to other technology,” says Vaibhav. 
“And MongoDB Atlas comes in on all those check marks perfectly — that's why we chose it.” Enhancing innovation strategies with database features Vaibhav observed two aspects to any innovation: a) having the idea and creating something, and b) re-innovating it. So, for innovation to perpetuate, ideas must be adapted according to your experience, market changes, and changes in technology — and that means reviewing your products’ performance. “MongoDB Atlas has that amazing 360-degree view of your database activities, and the monitoring of your resources,” says Vaibhav. “Plus, it's very easy to get analytics out of it and change the course of your innovation.” “You always need to keep watch on the performance of a database," he adds. "Then you will be able to keep pace with innovation for years.” To see more announcements and get the latest product updates, visit our What's New page.
Building AI With MongoDB: Boosting Productivity and Efficiency with Assistants and Agents
Among generative AI’s (genAI) many predicted benefits, its potential to unlock new levels of employee productivity and operational efficiency is frequently cited. Over the course of our “Building AI with MongoDB” blog post series, we’ve featured multiple examples of genAI being used to automate repetitive tasks with virtual assistants and intelligent agents. From conversational AI with natural language processing (NLP) to research and analysis, examples from previous posts include: Ada: automating customer service for the likes of Meta, Shopify, and Verizon Eni: supporting its geologists’ research on the company’s path to net zero ExTrac: sifting through online chatter to track emerging threats to public safety Inovaare: transforming complex healthcare compliance processes Zelta: analyzing real-time customer feedback to prioritize product development In today’s roundup of AI builders, I’ll cover three more organizations that are applying genAI-powered assistants and agents. You’ll see how they are freeing staff to focus on more strategic and productive tasks while simplifying previously complex and expensive business processes. Check out our AI resource page to learn more about building AI-powered apps with MongoDB. WINN.AI: The virtual assistant tackling sales admin overhead Salespeople typically spend over 25% of their time on administrative busywork — costing organizations time, money, and opportunity. WINN.AI is working to change that so that sales teams can better invest their working hours in serving customers. At the heart of WINN.AI is an AI-powered real-time sales assistant that joins virtual meetings to detect, interpret, and respond to customer questions. By comprehending the context of a conversation, it can immediately surface relevant information to the salesperson, for example, retrieving appropriate customer references or competitive information.
It can provide prompts from a sales playbook, and also make sure meetings stay on track and on time. At the end of the meeting, WINN.AI extracts and summarizes relevant information from the conversation and updates the CRM system with follow-on actions. Discussing its AI technology stack, Orr Mendelson, Ph.D., the head of R&D at WINN.AI, says: “We started out building and training our own custom NLP algorithms and later switched to GPT-3.5 and GPT-4 for entity extraction and summarization. We orchestrate all of the models with massive automation, reporting, and monitoring mechanisms. This is developed by our engineering teams and assures high-quality AI products across our services and users. We have a dedicated team of AI engineers and prompt engineers that develop and monitor each prompt and response so we are continuously tuning and optimizing app capabilities.” “In the ever-changing AI tech market, MongoDB is our stable anchor … my developers are free to create with AI while being able to sleep at night.” Orr Mendelson, head of R&D at WINN.AI Describing its use of MongoDB Atlas, Mendelson says: “MongoDB stores everything in the WINN.AI platform. The primary driver for selecting MongoDB was its flexibility in being able to store, index, and query data of any shape or structure. The database fluidly adapts to our application’s data objects, which gives us a more agile approach than traditional relational databases.” Mendelson adds, “MongoDB is familiar to our developers so we don’t need any DBA or external experts to maintain and run it safely. We can invest those savings back into building great AI-powered products. MongoDB Atlas provides the managed services we need to run, scale, secure, and back up our data." WINN.AI is part of the MongoDB AI Innovators program, benefiting from access to free Atlas credits and technical expertise. Take a look at the full interview with Mendelson to learn more about WINN.AI and its AI developments.
One AI: Providing AI-as-a-Service to deliver solutions in days rather than months The mission at One AI is to bring AI to everyday life by converting natural language into structured, actionable data. It provides seamless integration into products and services, and uses generative AI to redefine human-machine interactions. One AI curates and hones leading AI capabilities from across the ecosystem, and packages them as easy-to-use APIs. It’s a simple but highly effective concept that empowers businesses to deploy tailored AI solutions in days rather than weeks or months. “One AI was founded with the goal of democratizing and delivering AI as a service for companies,” explains Amit Ben, CEO and founder at One AI. “Our customers are product and services companies that plug One AI into the heart and core value of their products,” says Ben. “They are spread across use cases in multiple domains, from analyzing financial documents to AI-automated video editing.” Figure 1: The One AI APIs let developers analyze, process, and transform language input in their code. No training data or NLP/ML knowledge are required. One AI works with over 20 different AI/ML models. Having a flexible data infrastructure was key to help harness the latest innovations in data science, as Ben explains: “The MongoDB document model really allows us to spread our wings and freely explore new capabilities for the AI, such as new predictions, new insights, and new output data points.” Ben adds, “With any other platform, we would have to constantly go back to the underlying infrastructure and maintain it. Now, we can add, expand, and explore new capabilities on a continuous basis.” The company also benefits from the regular new releases from MongoDB, such as Atlas Vector Search , which Ben sees as a highly valuable addition to the platform’s toolkit. 
Ben explains: “The ability to have that vectorized language representation in the same database as other representations, which you can then access via a single query interface, solves a core problem for us as an API company.” To learn more, watch the interview with Amit Ben. 4149.AI: Maximizing team productivity with a hypertasking AI-powered teammate 4149.AI helps teams get more work done by providing them with their very own AI-powered teammate. During the company’s private beta program, the autonomous AI agent has been used by close to 1,000 teams to help them track goals and priorities. It does this by building an understanding of team dynamics and unblocking key tasks. It participates in Slack threads, joins meetings, transcribes calls, generates summaries from reports and whitepapers, responds to emails, updates issue trackers, and more. 4149.AI uses a custom-built AI-agent framework leveraging a combination of embedding models and LLMs from OpenAI and AI21 Labs, with text generation and entity extraction managed by LangChain. The models process project documentation and team interactions, persisting summaries and associated vector embeddings into Atlas Vector Search. There is even a no-code way for people to customize and expand the functionality of their AI teammate. Over time, the accumulated context generated for each team means more and more tasks can be offloaded to their AI-powered co-worker. The engineers at 4149.AI evaluated multiple vector stores before deciding on Atlas Vector Search. The ability to store summaries and chat history alongside vector embeddings in the same database accelerates developer velocity and the release of new features. It also simplifies the technology stack by eliminating unnecessary data movement. Hybrid search is another major benefit provided by the Atlas platform. The ability to pre-filter data with keyword-based Atlas Search before semantically searching vectors helps retrieve relevant information to users faster.
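One common shape for this kind of filtered semantic retrieval is an Atlas `$vectorSearch` stage whose `filter` option narrows the candidate documents before the vector match runs. The sketch below is illustrative only: the index name, field names, filter criteria, and query vector are assumptions, not 4149.AI’s actual schema, and in production the pipeline would be passed to `collection.aggregate()`.

```python
# A sketch of pre-filtered vector retrieval with Atlas Vector Search.
# $vectorSearch is the Atlas aggregation stage for semantic search;
# its "filter" option restricts candidates before the vector match.

query_vector = [0.12, -0.07, 0.33]  # embedding of the user's question

pipeline = [
    {"$vectorSearch": {
        "index": "vector_index",          # assumed index name
        "path": "embedding",              # field holding the vector
        "queryVector": query_vector,
        "numCandidates": 200,             # candidates considered per query
        "limit": 10,                      # results returned
        "filter": {"team_id": {"$eq": "team-42"}},  # pre-filter
    }},
    # Return the stored summary alongside the similarity score.
    {"$project": {
        "summary": 1,
        "score": {"$meta": "vectorSearchScore"},
    }},
]
```

Keeping the vectors, summaries, and filter metadata in one collection is what makes this a single pipeline rather than a fan-out across separate stores.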
Looking forward

4149.AI has an aggressive roadmap for its products as it starts to more fully exploit the chain-of-thought and multimodal capabilities provided by the most advanced language models. This will enable the AI co-worker to handle more creative tasks requiring deep reasoning, such as conducting market research, monitoring the competitive landscape, and helping identify new candidates for job vacancies. The goal is for these AI teammates to eventually take the initiative in deciding what to do next rather than relying on someone to manually assign them a task. Being part of MongoDB’s AI Innovators program puts 4149.AI on a path to success with access to technical support and free Atlas credits, helping the team quickly experiment with the native AI capabilities available in the MongoDB developer data platform.

Getting started

These are just a few examples of the capabilities of genAI-powered assistants and agents. Check out our library of AI case studies to see the range of applications developers are building with MongoDB. Our 10-minute learning byte is a great way to learn what you can do with Atlas Vector Search, how it’s different from other forms of search, and what you’ll need to get started using it.
Leveraging MongoDB Atlas in your Internal Developer Platform (IDP)
DevOps, a portmanteau of “development” and “operations,” rose to prominence in the early 2010s and established a culture of automated processes and tools designed to deliver applications and services to users faster than the traditional software development process allowed. A significant part of that was the movement to "shift left" by empowering developers to self-serve their infrastructure needs, in theory offering them more control over the application development lifecycle in a way that reduced the dependency on central operations teams.

While these shifts toward greater developer autonomy were occurring, public clouds, specific technologies (like GitHub, Docker, Kubernetes, and Terraform), and microservices architectures entered the market and became standard practice in the industry. As beneficial as these infrastructure advancements were, they added complexity to the setups developers used as part of their application development processes. As a result, developers needed a more in-depth, end-to-end understanding of their toolchain and, more dauntingly, had to take ownership of a growing breadth of infrastructure considerations.

This meant that the "shift left" drastically increased the cognitive load on developers, leading to inefficiencies, because self-managing infrastructure is time-consuming and difficult without a high level of expertise. In turn, this increased time to market and hindered innovation. Concurrently, the increasing levels of permissions that developers needed within the organization led to a swath of compliance issues, such as inconsistent security controls, improper auditing, unhygienic data practices, and incorrect reporting, all of which increased overhead and ate away at department budgets.
Unsurprisingly, the desire to enable developers to self-serve to build and ship applications hadn't diminished, but it became clear that empowering them without adding friction or requiring a high level of expertise needed to become a priority. With this goal in mind, organizations began investing in quickly and efficiently abstracting away the operational complexities for developers. From this investment comes the rise of platform engineering and internal developer platforms (whether companies label them as such or not).

Platform engineering and the rise of internal developer platforms

Within a developer organization, platform engineering (or even a central platform team) is tasked with creating golden paths for developers to build and ship applications at scale while keeping infrastructure spend and the cognitive load on developers low. At the core of the platform engineering ethos is the goal of optimizing the developer experience to accelerate the delivery of applications to customers. Like teaching someone to fish, platform teams pave the way for greater developer efficiency by providing pipelines that developers can take and run with, reducing time to build and enabling greater developer autonomy without burdening developers with complexity. To do this, platform teams strive to design toolchains and workflows based on the end goals of the developers in their organization. Therefore, it’s critical for the people tasked with platform engineering to understand the needs of their developers and then build a platform that is useful to that audience. The end result is what is often (but not exclusively) known as an internal developer platform.

What is an IDP?

An IDP is a collection of tools and services, sourced and stitched together by central teams to create golden paths for developers, who then use the IDP to simplify and streamline application building.
IDPs reduce complexity and lower the cognitive load on developers, often by dramatically simplifying the experience of configuring infrastructure and services that are not a direct part of the developer's application. They encourage developers to move away from spending excess time managing the tools they use and allow them to focus on delivering applications at speed and scale. IDPs give developers the freedom to quickly and easily build, deploy, and manage applications while reducing risk and overhead costs for the organization by centralizing oversight and the iteration of development practices. An IDP is tailored with developers in mind and will often consist of the following tools:

- Infrastructure platform that enables running a wide variety of workloads with the highest degree of security, resilience, and scalability, and a high degree of automation (e.g., Kubernetes)
- Source code repository system that allows teams to establish a single source of truth for configurations, ensuring version control, data governance, and compliance (e.g., GitHub, GitLab, Bitbucket)
- Control interface that enables everyone working on the application to interact with and manage its resources (e.g., Port or Backstage)
- Continuous integration and continuous deployment (CI/CD) pipeline that applies code and infrastructure configuration to an infrastructure platform (e.g., ArgoCD, Flux, CircleCI, Terraform, CloudFormation)
- Data layer that can handle changes to schemas and data structures (e.g., MongoDB Atlas)
- Security layer to manage permissions and maintain compliance, such as role-based access controls or secrets management tools (e.g., Vault)
While some tools overlap and not all of them will be part of a given IDP, the goal of platform engineering efforts is to build an IDP that is tightly integrated with infrastructure resources and services to maximize automation, standardization, self-service, and scale for developers, while maximizing security and minimizing overhead for the enterprise. Different organizations and teams will use different terms to refer to their IDP story, but at its core, an IDP is a tailored set of tech, tools, and processes, built and managed by a central team, and used to provide developers with golden paths that enable greater developer self-service, lower cognitive load, and reduced risk.

How does MongoDB Atlas fit into this story?

Developers often cite working with data as one of the most difficult aspects of building applications. Rigid and unintuitive data technologies impede building applications and can lead to project failure if they don’t deliver the data model flexibility and query functionality that your applications demand. A data layer that isn’t integrated into your workflows slows deployments, and manual operations are a never-ending drag on productivity. Failures and downtime lead to on-call emergencies – not to mention the enormous potential risk of a data breach. Therefore, making it easy to work with data is critical to improving the developer experience. IDPs are in part about giving developers the autonomy to build applications. For this reason, MongoDB Atlas is a natural fit for an IDP: it serves as a developer data platform that can easily fit into any team’s existing toolstack and abstracts away the complexities associated with self-managing a data layer.
MongoDB’s developer data platform is a step beyond a traditional database in that it helps organizations drive innovation at scale by providing a unified way to work with data that addresses transactional workloads, app-driven analytics, full-text search, vector search, stream data processing, and more, prioritizing an intuitive developer experience and automating security, resilience, and performance at scale. This simplification and broad coverage of different use cases make a monumental difference to the developer experience. By incorporating MongoDB Atlas within an IDP, developer teams have a fully managed developer data platform at their disposal that enables them to build and underpin best-in-class applications. Teams won’t have to take on the overhead and manual work involved in self-hosting a database and then building all the supporting functionality that comes out of the box with MongoDB Atlas. Lastly, MongoDB Atlas can be hosted on more cloud regions than any other cloud database in the market today, with support for AWS, Azure, and Google Cloud.

How can I incorporate MongoDB Atlas into my IDP?

MongoDB Atlas offers many ways to integrate with an IDP through tools that leverage the MongoDB Atlas Admin API. The Atlas Admin API can be used independently or via one of these tools/integrations, and it provides a programmatic interface to directly manage and automate various aspects of MongoDB Atlas without needing to switch between UIs or incorporate manual scripts. These tools include:

- Atlas Kubernetes Operator
- HashiCorp Terraform Atlas Provider
- AWS CloudFormation Atlas Resources
- Atlas CDKs
- Atlas CLI
- Atlas Go SDK
- Atlas Admin API

With the Atlas Kubernetes Operator, platform teams are able to seamlessly integrate MongoDB Atlas into the Kubernetes deployment pipeline within their IDP, allowing their developers to manage Atlas the same way they manage the applications they run in Kubernetes.
First, configurations are stored and managed in a git repository and applied to Kubernetes via CD tools like ArgoCD or Flux. Then, the Atlas Operator's custom resources are applied to Atlas using the Atlas Admin API; they support all the building blocks you need, including projects, clusters, database users, IP access lists, private endpoints, backup, and more.

For teams that want to take the IaC route in connecting Atlas to their IDP, Atlas offers integrations with HashiCorp Terraform and AWS CloudFormation, which can be used to programmatically spin up Atlas services, via IaC integrations built on the Atlas Admin API, in the cloud environment of their choice. Through provisioning with Terraform, teams can deploy, update, and manage Atlas configurations as code with either the Terraform Provider or the CDKTF. MongoDB also makes it easy for Atlas customers who prefer AWS CloudFormation to manage, provision, and deploy MongoDB Atlas services in three ways: through resources from the CloudFormation Public Registry, AWS Quick Starts, and the AWS CDK.

Other programmatic ways that Atlas can be incorporated into an IDP include:

- Atlas CLI, which interacts with Atlas from a terminal with short, intuitive commands and accomplishes complex operational tasks, such as creating a cluster or setting up an access list, interactively
- Atlas Go SDK, which provides Go-specific tools, libraries, and documentation to help build applications quickly and easily
- Atlas Admin API, a RESTful API, accessed over HTTPS, for interacting directly with MongoDB Atlas control plane resources

Get started with MongoDB Atlas today

The fastest way to get started is to create a MongoDB Atlas account from the AWS Marketplace, Azure Marketplace, or Google Cloud Marketplace. Go build with MongoDB Atlas today!
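As a minimal illustration of the Atlas Admin API route described above, the sketch below builds (but does not send) an authenticated request to list the clusters in a project. The project ID and API key values are hypothetical placeholders; consult the Atlas Admin API documentation for the exact endpoints and versioning headers your account requires.

```python
# Hedged sketch: composing an Atlas Admin API call with the standard library.
# The Admin API uses HTTP digest auth with a programmatic API key pair
# standing in for username/password. Nothing here contacts a real cluster.
import urllib.request

ATLAS_BASE = "https://cloud.mongodb.com/api/atlas/v2"


def clusters_url(group_id: str) -> str:
    """Build the Admin API endpoint that lists clusters in a project."""
    return f"{ATLAS_BASE}/groups/{group_id}/clusters"


# Digest-auth setup; PUBLIC_KEY / PRIVATE_KEY are placeholder credentials
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, ATLAS_BASE, "PUBLIC_KEY", "PRIVATE_KEY")
opener = urllib.request.build_opener(
    urllib.request.HTTPDigestAuthHandler(password_mgr)
)

# With real credentials and a real project (group) ID, this would perform
# the authenticated GET:
# response = opener.open(clusters_url("my-project-id"))
```

In an IDP, a call like this would typically be hidden behind a CI/CD step or a control-interface action rather than invoked by developers directly.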
MongoDB Design Reviews Help Customers Achieve Transformative Results
The pressure to deliver flawless software can weigh heavily on developers' minds and cause teams to second-guess their processes. While no amount of preparation can guarantee success, we've found that a design review conducted by members of the MongoDB Developer Relations team can go a long way toward ensuring best practices have been followed and that optimizations are in place to help the team deliver confidently.

Design reviews are hour-long sessions where we partner with our customers to help them fine-tune their data models for specific projects or use cases. They give our customers a jump start in the early stages of application design, when the development team is new to MongoDB and trying to understand how best to model their data to achieve their goals. A design review is a valuable enablement session that leverages the development team’s own workload as a case study to illustrate performant and efficient MongoDB design. We also help customers explore the art of the possible and put them on the right path toward achieving their desired outcomes. When participants leave these sessions, they carry the knowledge and confidence to evolve their designs independently.

The underlying principle that characterizes these reviews is the domain-driven design ethos, an indispensable concept in software engineering. Design isn't merely a box to tick; it's a daily routine for developers. Design reviews are more than academic exercises; they have tangible goals. A primary aim is to enable and educate developers on a global scale, transitioning them away from legacy systems like Oracle. It's about supporting developers, helping them overcome obstacles, and imparting critical education and training. Mastery of the tools is essential, and our sessions delve deep into addressing access patterns and optimizing schemas for performance. At its core, a design review is a catalyst for transformation.
It's a collaborative endeavor, merging expertise and fostering an environment where innovation thrives. It's not just about reviewing: when our guidance and expertise are combined with developer innovation and talent, the journey from envisioning to implementing a robust data model becomes a shared success. During the session, our experts look at the workload's data-related functional requirements — like data entities and, in particular, reads and writes — along with non-functional requirements like growth rates, performance, and scalability. With these insights in hand, we can recommend target document schemas that help developers achieve the goals they established before committing their first lines of code. A properly designed document schema is fundamental for performant and cost-efficient operations. Getting the schema wrong is often the number one reason why projects fail, and design reviews help customers avoid the time and effort lost to poor schemas.

Design reviews in practice

Not long ago, we were approached by a customer in financial services who wanted us to conduct a design review for an application they were building in MongoDB Atlas. The application was designed to give regional account managers a comprehensive view of aggregated performance data. Specifically, it aimed to provide insights into individual stock performance within a customer's portfolio across a specified time frame within a designated region. When we talked to them, the customer highlighted an issue with their aggregation pipeline, which was taking longer than expected, ranging from 20 to 40 seconds to complete. Their SLA demanded a response time of under two seconds. Most design reviews involve a couple of steps to assess and diagnose the problem. The first involves assessing the workload.
During this step, a few of the things we look at include:

- The number of collections
- The documents in those collections
- How many records the documents contain
- How frequently data is being written or updated in the collections
- What hours of the day see the most activity
- How much storage is being consumed
- Whether and how old data is being purged from collections
- The cluster size the customer is running in MongoDB

Once we performed this assessment for our finserv customer, we had a better understanding of the nature and scale of the workload. The next step was examining the structure of the aggregation pipeline. What we found was that the way data was being collected had a few unnecessary steps, such as breaking down the data and then reassembling it through various $unwind and $group stages. The MongoDB DevRel experts suggested using arrays to reduce the number of steps involved to just two: first, finding the right data, and then looking up the necessary information. Eliminating the $group stage reduced the response time to 19 seconds — a significant improvement but still short of the target.

In the next step of the design review, the MongoDB DevRel team looked to determine which schema design patterns could be applied to optimize the pipeline's performance. In this particular case, a high volume of stock activity documents was being written to the database every minute, but users were querying only a limited number of times per day. With this in mind, our DevRel team decided to apply the computed design pattern. The computed pattern is ideal when you have data that needs to be computed repeatedly in an application. By pre-calculating and saving commonly requested data, it avoids having to do the same calculation each time the data is requested. With our finserv customer, we were able to pre-calculate the trading volume and the starting, closing, high, and low prices for each stock. These values were then stored in a new collection that the $lookup pipeline could access.
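The computed pattern described above can be sketched as an aggregation pipeline that rolls minute-level activity up into daily per-stock figures and persists them with $merge. This is an illustrative sketch only; the collection and field names (stock_activity, symbol, ts, price, volume, stock_daily_summary) are hypothetical stand-ins for the customer's actual schema.

```python
# Illustrative sketch of the computed pattern: pre-calculate each stock's
# daily trading volume and open/high/low/close prices, then upsert them
# into a summary collection that later $lookup stages can read cheaply.
computed_pipeline = [
    # $first/$last depend on document order, so sort by symbol and time first
    {"$sort": {"symbol": 1, "ts": 1}},
    # Roll the minute-level activity documents up by stock symbol and day
    {
        "$group": {
            "_id": {"symbol": "$symbol", "day": "$day"},
            "volume": {"$sum": "$volume"},
            "high": {"$max": "$price"},
            "low": {"$min": "$price"},
            "open": {"$first": "$price"},
            "close": {"$last": "$price"},
        }
    },
    # Persist the pre-computed values into the summary collection
    {"$merge": {"into": "stock_daily_summary", "whenMatched": "replace"}},
]

# Run on a schedule so that read queries never pay the aggregation cost:
# db.stock_activity.aggregate(computed_pipeline)
```

Because writes vastly outnumbered reads in this workload, paying the aggregation cost once per batch instead of once per query is exactly the trade-off the computed pattern is designed for.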
This resulted in a response time of 1800 ms — below our two-second target SLA, but our DevRel team wasn't finished. They performed additional optimizations, including using the extended reference pattern to embed region data in the pre-computed stock activity so that all the related data can be retrieved with a single query, avoiding a $lookup-based join. After the team finished its optimizations, the final test execution of the pipeline resulted in a response time of 377 ms — a 60x improvement in the performance of the aggregation pipeline and more than four times faster than the application's target response time.

Read the complete story, including a step-by-step breakdown with code examples of how we helped one of our financial services customers achieve a 60x performance improvement. If you'd like to learn more about MongoDB data modeling and aggregation pipelines, we recommend the following resources:

- Daniel Coupal and Ken Alger’s excellent series of blog posts on MongoDB schema patterns
- Daniel Coupal and Lauren Schaefer’s equally excellent series of blog posts on MongoDB anti-patterns
- Paul Done’s ebook, Practical MongoDB Aggregations
- The MongoDB University course "M320 - MongoDB Data Modeling"

If you're interested in a design review, please contact your account representative.