
Teach & Learn with MongoDB: Professor Abdussalam Alawini, University of Illinois at Urbana-Champaign

In this series of interviews, we talk to students and educators around the world who are using MongoDB to make their classes more engaging and relevant. By exploring their stories, we uncover how MongoDB's innovative platform and resources are transforming educational landscapes and empowering the next generation of tech-savvy professionals. From creative teaching approaches to advanced classroom solutions, the MongoDB for Educators program can help you transform your classroom with cutting-edge technology and free resources. It can help you provide students with an interactive and dynamic learning environment that bridges the gap between theoretical knowledge and practical application. The program includes a variety of free resources for educators crafted by MongoDB experts to prepare learners with in-demand database skills and knowledge. Program participants have access to MongoDB Atlas credits, curriculum materials, certifications, and membership in a global community of educators from over 700 universities.

From theory to practice: Hands-on MongoDB teaching

Professor Abdussalam Alawini is known for his creative use of MongoDB in his courses. He heavily uses MongoDB's free cluster to demonstrate MongoDB concepts during classes, and his students also use the free cluster for their projects, giving them hands-on experience with real-world applications. Currently a Teaching Associate Professor at the University of Illinois Urbana-Champaign, Professor Alawini has research interests spanning databases, applied machine learning, and education. He is particularly focused on applying machine learning methods to enhance classroom experiences and education. His work also includes developing next-generation data management systems, such as data provenance, citation, and scientific management systems. He recently received the U of I's 2024 Campus Excellence in Undergraduate Education award, which highlights his commitment to teaching and the impact he's had on his students. Professor Alawini is currently collaborating with colleagues on research to map how databases, data systems, data management, and related courses are taught in introductory computer science undergraduate curricula worldwide. Professor Alawini's story offers valuable insights for educators eager to enhance their teaching and prepare students for a tech-driven future. Check out how MongoDB Atlas has revolutionized his teaching by simplifying database deployment, management, and scaling, allowing students to focus more on learning MongoDB concepts.

Tell us about your educational journey and what sparked your interest in databases.

My educational journey began with a bachelor's degree in Computer Science from the University of Tripoli in 2002. I then spent over six years in industry as a database administrator, lead software developer, and IT Manager. In 2011, I returned to academia and earned two master's degrees, in Computer Science and in Engineering and Technology Management, from Portland State University, followed by a Ph.D. in Computer Science in 2016. Subsequently, I joined the University of Pennsylvania for two years of postdoctoral training. My interest in databases was sparked during my time as a database administrator at PepsiCo, where I enjoyed maintaining the company's databases and building specialized reports to improve business operations. I was particularly fascinated by database systems' ability to optimize queries and handle millions of concurrent user requests seamlessly.
This experience led me to focus my doctoral studies on building data management systems for scientific applications.

What courses are you currently teaching at the University of Illinois Urbana-Champaign?

Currently, I teach Database Systems and Data Management in the Cloud courses at the University of Illinois Urbana-Champaign. I also teach a course to University High School students to introduce them to data management and database basics. My intention with teaching databases to high schoolers is to use data management as a gateway to lower entry barriers into computing fields for non-computer science students and to recruit underrepresented minorities to computing.

What inspired you to start teaching MongoDB?

I was inspired to start teaching MongoDB after seeing several surveys indicating that it is the most used database in web development and one of the leading document-oriented databases. MongoDB offers several unique features that set it apart from other databases, including the aggregation pipeline, which simplifies data processing and transformation. Additionally, MongoDB's flexible schema design allows for easier handling of unstructured data, and its horizontal scalability ensures robust performance as data volumes grow. These features make MongoDB an essential tool for modern web development, and I wanted to equip my students with the skills to leverage this powerful technology.

How do you design your course content to effectively integrate MongoDB and engage students in practical learning?

In all my data management courses, I focus on teaching students the concept of data models, including relational, document, key-value, and graph. In my Database Systems course, I teach MongoDB alongside SQL and Neo4j to highlight the unique features and capabilities of each data model. This comparative approach helps students appreciate the importance and applications of different databases, ultimately making them better data engineers. In my Data Management in the Cloud course, I emphasize the systems side of MongoDB, particularly its scalability. Understanding how MongoDB is built to handle large volumes of data efficiently provides students with practical insights into managing data in a cloud environment. To effectively integrate MongoDB and engage students in practical learning, I use a hybrid flipped-classroom approach. Students watch recorded lectures before class, allowing us to dedicate class time to working through examples together. Additionally, students form teams to work on various data management scenarios using a collaborative online assessment tool called PrairieLearn. This model fosters peer learning and collaboration, enhancing the overall educational experience.

How has MongoDB supported you in enhancing your teaching methods and upskilling your students?

I would like to sincerely thank MongoDB for Academia for the amazing support and material they provided to enhance my course design. The free courses offered at MongoDB University have significantly improved my course delivery, allowing me to provide more in-depth and practical knowledge to my students. I heavily use MongoDB's free cluster to demonstrate MongoDB concepts during classes, and my students also use the free cluster for their projects, which gives them hands-on experience with real-world applications. MongoDB Atlas has been a game-changer in my teaching methods.
As a fully managed cloud database, it simplifies the process of deploying, managing, and scaling databases, allowing students to focus on learning and applying MongoDB concepts without getting bogged down by administrative tasks. The flexibility and reliability of MongoDB Atlas make it an invaluable tool for both educators and students in the field of data management.

Could you elaborate on the key findings from your ITiCSE paper on students' experiences with MongoDB and how these insights can help other educators?

In my ITiCSE paper, we conducted an in-depth analysis of students' submissions to MongoDB homework assignments to understand their learning experiences and challenges. The study revealed that as students use more advanced MongoDB operators, they tend to make more reference errors, indicating a need for a better conceptual understanding of these operators. Additionally, when students encounter new functionalities, such as the $group operator, they initially struggle but generally do not repeat the same mistakes in subsequent problems. These insights suggest that educators should allocate more time and effort to teaching advanced MongoDB concepts and provide additional support during the initial learning phases. By understanding these common difficulties, instructors can better tailor their teaching strategies to improve student outcomes and enhance their learning experience.

What advice would you give to fellow educators who are considering implementing MongoDB in their own courses to ensure a successful and impactful experience for their students?

Implementing MongoDB in your courses can be highly rewarding. Here's some advice to ensure success:

Foundation in Data Models: Teach MongoDB alongside other database types to highlight unique features and applications, making students better data engineers.
Utilize MongoDB Resources: Leverage support from MongoDB for Academia, free courses from MongoDB University, and free clusters for hands-on projects.
Practical Learning: Use MongoDB Atlas to simplify database management and focus on practical applications.
Focus on Challenges: Allocate more time for advanced MongoDB concepts. Address common errors and use tools like PrairieLearn that capture students' interactions and learning progress to identify learning patterns and adjust instruction.
Encourage Real-World Projects: Incorporate practical projects to enhance skills and relevance.
Continuous Improvement: Gather feedback to iteratively improve course content and share successful strategies with peers. MongoDB is always evolving, so be sure to keep up with its updates and new features.

These steps will help create an engaging learning environment, preparing students for real-world data management. Apply to the MongoDB for Educators program and explore free resources for educators crafted by MongoDB experts to prepare learners with in-demand database skills and knowledge.
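The $group stage mentioned above is a good example of where students first hit the aggregation pipeline's learning curve. Below is a minimal sketch of the kind of exercise involved, written with pymongo; the connection string, collection, and field names are hypothetical placeholders, not material from the course.

```python
# A minimal sketch of an aggregation pipeline exercise using the $group stage.
# Connection string, database, collection, and field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
grades = client["university"]["grades"]

pipeline = [
    {"$match": {"course": "CS411"}},          # narrow to one course
    {"$group": {                              # the $group stage students often struggle with
        "_id": "$studentId",                  # group all submissions by student
        "avgScore": {"$avg": "$score"},
        "assignments": {"$sum": 1},
    }},
    {"$sort": {"avgScore": -1}},
    {"$limit": 10},
]

for doc in grades.aggregate(pipeline):
    print(doc)
```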

July 10, 2024

Building Gen AI with MongoDB & AI Partners | June 2024

Even for those of us who work in AI, keeping up with the latest news in the AI space can be head-spinning. In just the last few weeks, OpenAI introduced their newest model (GPT-4o), Anthropic continued to develop Claude with the launch of Claude 3.5 Sonnet, and Mistral launched Mixtral 8x22B, their most efficient open model to date. And those are only a handful of recent releases! In such an ever-changing space, partnerships are critical to combining the strengths of organizations to create solutions that would be challenging to develop independently. Also, it can be overwhelming for any one business to keep track of so much change. So there's a lot of value in partnering with industry leaders and new players alike to bring the latest innovations to customers. I've been at MongoDB for less than a year, but in that time our team has already built dozens of strategic partnerships that are helping companies and developers build AI applications faster and safer. I love to see these collaborations take off!

A compelling example is MongoDB's recent work with Vercel. Our team developed an exciting sample application that allows users to deploy a retrieval-augmented generation (RAG) application on Vercel in just a few minutes. By leveraging a MongoDB URI and an OpenAI key, users can one-click deploy this application on Vercel. Another recent collaboration was with Netlify. Our team also developed a starter template that implements a RAG chatbot on top of their platform using LangChain and MongoDB Atlas Vector Search capabilities for storing and searching the knowledge base that powers the chatbot's responses. These examples demonstrate the power of combining MongoDB's robust database capabilities with other deployment platforms. They also show how quickly and efficiently users can set up fully functional RAG applications, and highlight the significant advantages that partnerships bring to the AI ecosystem. And the best part? We're just getting started! Stay tuned for more information about the MongoDB AI Applications Program later this month.

Welcoming new AI partners

Speaking of partnerships, in June we welcomed seven AI partners that offer product integrations with MongoDB. Read on to learn more about each great new partner.

AppMap is an open-source personal observability platform to help developers keep their software secure, clear, and aligned. Elizabeth Lawler, CEO of AppMap, commented on our joint value for developers. "AppMap is thrilled to join forces with MongoDB to help developers improve and optimize their code. MongoDB is the go-to data store for web and mobile applications, and AppMap makes it easier than ever for developers to migrate their code from other data stores to MongoDB and to keep their code optimized as their applications grow and evolve." Read more about our partnership and how to use AppMap to improve the quality of code running with MongoDB.

Mendable is a platform that automates customer service, providing quick and accurate answers to questions without human intervention. Eric Ciarla, co-founder of Mendable, highlighted the importance of our partnership. "Our partnership with MongoDB is unlocking massive potential in AI applications, from go-to-market copilots to countless other innovative use cases," he said. "We're excited to see teams at MongoDB and beyond harnessing our combined technologies to create transformative AI solutions across all kinds of industries and functions." Learn how Mendable and MongoDB Atlas Vector Search power customer service applications.
OneAI is an API-first platform built for developers to create and manage trusted GPT chatbots. Amit Ben, CEO of OneAI, shared his excitement about the partnership. "We're thrilled to partner with MongoDB to help customers bring trusted GenAI to production. OneAI's platform, with RAG pipelines, LLM-based chatbots, goal-based AI, anti-hallucination guardrails, and language analytics, empowers customers to leverage their language data and engage users even more effectively on top of MongoDB Atlas." Check out some of OneAI's GPT agents and advanced RAG pipelines built on MongoDB.

Prequel allows companies to sync data to and from their customers' data warehouses, databases, or object storage so they get better data access with less engineering effort. "Sharing MongoDB data just got easier with our partnership," celebrated Charles Chretien, co-founder of Prequel. "Software companies running on MongoDB can use Prequel to instantly share billions of records with customers on every major data warehouse, database, and object storage service." Learn how you can share MongoDB data using Prequel.

Qarbine complements summary data visualization tools, allowing for better-informed decision-making across teams. Bill Reynolds, CTO of Qarbine, described how our integration helps distill better insights from data: "We're excited to extend the many MongoDB Atlas benefits upward in the modern application stack to deliver actionable insights from publication quality drill-down analysis. The native integrations enhance in-app real-time decisions, business productivity and operational data ROI, fueling modern application innovation." Want to power up your insights with MongoDB Atlas and Qarbine? Read more.

Temporal is a durable execution platform for building and scaling invincible applications faster. "Organizations of all sizes have built AI applications that are 'durable by design' using MongoDB and Temporal. The burden of managing data and agent task orchestration is effortlessly abstracted away by Temporal's development primitives and MongoDB's Atlas Developer Data Platform," says Jay Sivachelvan, VP of Partnerships at Temporal. He also highlighted the benefits of this partnership. "These two solutions, together, provide compounding benefits by increasing product velocity while also seamlessly automating the complexities of scalability and enterprise-grade resilience." Learn how to build microservices more efficiently with MongoDB and Temporal.

Unstructured is a platform that connects any type of enterprise data for use with vector databases and any LLM framework. Read more about enhancing your gen AI application accuracy using MongoDB and Unstructured.

But wait, there's more! To learn more about building AI-powered apps with MongoDB, check out our AI Resources Hub, and stop by our Partner Ecosystem Catalog to read about our integrations with MongoDB's ever-evolving AI partner ecosystem.
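To give a sense of what the Vercel and Netlify RAG templates mentioned above boil down to, here is a minimal, hedged sketch of a RAG lookup against MongoDB Atlas Vector Search using LangChain. It assumes the langchain-mongodb and langchain-openai packages, a MONGODB_URI and OpenAI API key in the environment, and placeholder database, collection, and index names; it is an illustrative sketch, not the code of either template.

```python
# A minimal RAG sketch: retrieve relevant chunks from Atlas Vector Search, then
# ask an LLM to answer using only that context. Names are placeholders.
import os
from langchain_mongodb import MongoDBAtlasVectorSearch
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

vector_store = MongoDBAtlasVectorSearch.from_connection_string(
    os.environ["MONGODB_URI"],     # MongoDB Atlas connection string
    "docs_db.knowledge_base",      # "<database>.<collection>" namespace (placeholder)
    OpenAIEmbeddings(),            # embedding model used for the knowledge base
    index_name="vector_index",     # Atlas Vector Search index on the embedding field
)

question = "How do I rotate my API keys?"
docs = vector_store.similarity_search(question, k=4)
context = "\n\n".join(d.page_content for d in docs)

llm = ChatOpenAI(model="gpt-4o")
answer = llm.invoke(
    f"Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```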

July 9, 2024

Elevate Your Python AI Projects with MongoDB and Haystack

MongoDB is excited to announce an integration with Haystack, enhancing MongoDB Atlas Vector Search for Python developers. This integration amplifies our commitment to providing developers with cutting-edge tools for building AI applications centered around semantic search and Large Language Models (LLMs).

"We're excited to partner with MongoDB to help developers build top-tier LLM applications. The new Haystack and MongoDB Atlas integration lets developers seamlessly use MongoDB data in Haystack, a reliable framework for creating quality LLM pipelines for use cases like RAG, QA, and agentic pipelines. Whether you're an experienced developer or just starting, your gen AI projects can quickly progress from prototype to adoption, accelerating value for your business and end-users."
Malte Pietsch, co-founder and CTO, deepset

Simplifying AI app development with Haystack

Haystack is an open-source Python framework that simplifies AI application development. It enables developers to start their projects quickly, experiment with different AI models, and scale their applications efficiently. Haystack is particularly effective for building applications requiring semantic understanding and natural language processing (NLP), such as chatbots and question-answering systems. Haystack's core features include:

Components: Haystack breaks down complex NLP tasks into manageable components, such as document retrieval or text summarization. With the new MongoDB-Haystack integration, MongoDB becomes the place where all your data lives, ready for Haystack to use.
Pipelines: Haystack lets you link components together into pipelines for more complex tasks. With this integration, your MongoDB data flows through these pipelines.
Agents: Haystack Agents use LLMs to resolve complex queries. They can decide which tools (or components) to use for a given question, leveraging MongoDB data to deliver smarter answers.

Atlas Vector Search: Enhance AI development with Haystack

At the heart of the new integration is MongoDB Atlas Vector Search, transforming how applications search and retrieve data. By leveraging vector embeddings, Atlas Vector Search goes beyond mere keyword matching: it interprets the intent behind queries, enabling applications to provide highly relevant, context-aware responses. This is a breakthrough for Python developers who aim to build applications that think and understand like humans. Building on this foundation, the Atlas Vector Search and Haystack integration gives Python developers a powerful toolkit for navigating the complexities of AI application development. MongoDB becomes a dynamic document store within Haystack's framework, optimizing data storage, processing, and retrieval. Additionally, the integration eases the incorporation of advanced AI models from leading providers such as OpenAI and Cohere into your applications. Developers can thus create applications that do more than just answer queries: they grasp and act on the underlying intent, ensuring responses are both accurate and contextually relevant.

What this means for Python developers

For Python developers, this integration means:

Faster development: Developers can focus on building and innovating rather than spending time configuring and managing infrastructure. MongoDB's integration with Haystack means you can get up and running quickly, leveraging the best of both technologies to accelerate your development cycles.
Smarter applications: By utilizing Haystack's powerful natural language processing tooling in combination with MongoDB Atlas Vector Search's efficient data handling, developers can create applications that understand and process natural language more effectively. This results in applications that can provide more accurate and contextually relevant responses that resonate with user intent.
Access to pre-trained AI models: With seamless integration of leading generative AI models from providers like OpenAI, Anthropic, Cohere, Hugging Face, and AWS Bedrock, Python developers can easily incorporate advanced AI functionalities into their projects. This means developers can quickly adopt state-of-the-art models without the need for extensive training or fine-tuning, saving time and resources.
Flexible and scalable pipelines: Haystack's modular approach to building AI applications, through its use of components and pipelines, allows developers to create flexible and scalable solutions. With MongoDB data seamlessly flowing through these pipelines, you can easily adapt and expand your applications to meet growing demands and new challenges.
Robust search capabilities: Atlas Vector Search transforms the way applications retrieve and interpret data, going beyond simple keyword searches. It enables applications to perform high-precision searches that return more relevant and semantically rich results. This advanced search capability is crucial for developing applications that require high levels of semantic understanding and accuracy.

By integrating MongoDB with Haystack, Python developers are equipped with a powerful toolkit that not only simplifies the AI development process but also significantly enhances the intelligence and functionality of their applications. Whether you are building chatbots, search engines, or other AI-driven applications, this integration provides the tools you need to create innovative and impactful solutions.

Get started now

Start leveraging the MongoDB and Haystack integration for your AI development. Explore our tutorial, documentation, or check out our GitHub repository to begin building smarter, more intuitive Python projects today!
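As a concrete illustration of the pieces described above, here is a minimal retrieval sketch using the integration. It assumes the mongodb-atlas-haystack and sentence-transformers packages, a MONGO_CONNECTION_STRING environment variable pointing at your Atlas cluster, and hypothetical database, collection, and index names; treat it as a sketch of the pattern rather than a canonical example from the docs.

```python
# A hedged sketch of a Haystack 2.x retrieval pipeline backed by MongoDB Atlas.
# The document store reads the connection string from MONGO_CONNECTION_STRING.
from haystack import Pipeline
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack_integrations.document_stores.mongodb_atlas import MongoDBAtlasDocumentStore
from haystack_integrations.components.retrievers.mongodb_atlas import MongoDBAtlasEmbeddingRetriever

document_store = MongoDBAtlasDocumentStore(
    database_name="haystack_db",        # placeholder database
    collection_name="documents",        # placeholder collection holding embedded documents
    vector_search_index="vector_index", # placeholder Atlas Vector Search index name
)

pipeline = Pipeline()
pipeline.add_component("embedder", SentenceTransformersTextEmbedder())
pipeline.add_component("retriever", MongoDBAtlasEmbeddingRetriever(document_store=document_store, top_k=5))
pipeline.connect("embedder.embedding", "retriever.query_embedding")

result = pipeline.run({"embedder": {"text": "How do I reset a customer's password?"}})
for doc in result["retriever"]["documents"]:
    print(doc.score, doc.content[:80])
```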

July 8, 2024

Nokia Corteca Scales Wi-Fi Connectivity to Millions of Devices With MongoDB Atlas

Nokia's home Wi-Fi connectivity cloud platform was launched in 2019 as the Nokia WiFi Cloud Controller (NWCC). In 2023, it was renamed and relaunched as the Corteca Home Controller, becoming part of the Corteca software suite that delivers smarter broadband for a better experience. The Corteca Home Controller can be hosted on Amazon Web Services, Google Cloud, or Microsoft Azure, and is the industry's first platform to support three management services: device management, Wi-Fi management, and application management. Supporting TR-369 (a standardized remote device management protocol) also allows the Home Controller to work in a multi-vendor environment, managing both Nokia broadband devices and third-party broadband devices. By solving connectivity issues before the end user detects them, and by automatically optimizing Wi-Fi performance, the Home Controller helps deliver excellent customer experiences to millions of users, 24/7. During the five years that Nokia Corteca has been a MongoDB Atlas customer, the Home Controller has successfully scaled from 500,000 devices to over 4.5 million. There are now 75 telecommunications customers of the Home Controller spread across all regions of the globe.

Having the stability, efficiency, and performance to scale

Nokia Corteca's solution is end-to-end, from applications embedded in the device, through the home, and into the cloud. Algorithms assess data extracted from home networks, and performance parameters automatically adjust as needed (changing Wi-Fi channels to avoid network interference, for example), thereby ensuring zero downtime. The Home Controller processes real-time data sent from millions of devices, generating massive volumes of data. With a cloud optimization team tasked with deploying the solution across the globe to ever more customers, the Home Controller needed to store and manage its vast dataset and to onboard new telecommunications organizations more easily without incurring any downtime. Prior to Nokia Corteca moving to MongoDB Atlas, its legacy relational database lacked stability and required both admin and application teams to manage operations.

A flexible model with time series capabilities

That's where MongoDB Atlas came in. Nokia was familiar with the MongoDB Atlas database platform, having already worked with it as part of a previous company acquisition and solution integration. As Nokia's development team had direct experience with the scalability, manageability, and ease of use offered by MongoDB Atlas, they knew it had the potential to address the Home Controller's technical and business requirements. There was another key element: Nokia wanted to store time series data, a sequence of data points in which insights are gained by analyzing changes over time. MongoDB Atlas has the unique ability to store operational and time series data in parallel and provides robust querying capabilities on that data. Other advantages include MongoDB's flexible schema, which helps developers store data to match the application's needs and adapt as data changes over time. MongoDB Atlas also provides features such as Performance Advisor, which monitors the performance of the database and makes intelligent recommendations to optimize and improve performance and resource consumption.

Fast real-time data browsing and scalability made easy

Previously, scaling the database had been time-consuming and manual. With MongoDB Atlas, the team can easily scale up as demand increases with very little effort and no downtime.
This also means it is much more straightforward to add new clients, such as large telecommunications companies. Having started with 100GB of data, the team now has more than 1.3 terabytes, and can increase the disk space in a fraction of a second, positioning the team to scale with the business. As the Home Controller grows and onboards more telcos, the team anticipates a strengthening relationship with MongoDB. "We have a very good relationship with the MongoDB team," said Jaisankar Gunasekaran, Head of Cloud Hosting and Operations at Nokia. "One of the main advantages is their local presence: they're accessible, they're friendly, and they're experts. It makes our lives easier and lets us concentrate on our products and solutions." To learn more about how MongoDB can help drive innovation and capture customer imaginations, check out our MongoDB for Telecommunications page.
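For readers curious about the time series capability mentioned above, here is a minimal sketch of how device telemetry like the Home Controller's Wi-Fi metrics could be modeled in a MongoDB time series collection. It uses pymongo; the connection string, collection, and field names are hypothetical and are not Nokia's actual schema.

```python
# A minimal sketch of storing and querying device telemetry in a time series collection.
from datetime import datetime, timezone
from pymongo import MongoClient

db = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")["corteca_demo"]

# Create the collection once: "ts" is the time field, "device" groups measurements per device.
if "wifi_metrics" not in db.list_collection_names():
    db.create_collection(
        "wifi_metrics",
        timeseries={"timeField": "ts", "metaField": "device", "granularity": "minutes"},
    )

db.wifi_metrics.insert_one({
    "ts": datetime.now(timezone.utc),
    "device": {"serial": "NBG-001234", "model": "Beacon 6"},   # hypothetical device metadata
    "rssi": -52,
    "channel": 36,
    "throughput_mbps": 480.5,
})

# Query the most recent readings for one device.
recent = db.wifi_metrics.find({"device.serial": "NBG-001234"}).sort("ts", -1).limit(10)
for doc in recent:
    print(doc["ts"], doc["rssi"], doc["channel"])
```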

July 2, 2024

AI Apps: What the World Sees vs. What Developers See

Imagine you're in the market for a new home in, say, Atlanta. And you're on vacation in a different city. You see an amazing-looking house, whose design you love. You open up your favorite real estate app, snap a picture of this house, and type: "Find me a home that looks like this in Atlanta, within my budget, that's also next to a park." Seconds later, you're served a list of homes that not only resemble this one, but match all your other specifications. This is what the world, and specifically consumers, expects when it comes to AI-powered applications. But when developers see the possibilities for these hyper-personalized, interactive, and conversational apps, they also see what goes into building them.

A video showing the behind-the-scenes of an AI-powered real estate app.

To make these advanced apps a reality, developers need to be able to unify operational and vector data. They also want to be able to use their preferred tools and popular LLMs. Most of all, developers are looking for a platform that makes their jobs easier while, at the same time, providing a development experience that's both seamless and secure. And it's critical that developers have all of this. Because as in previous tech revolutions (the software revolution, the birth of the World Wide Web, the dawn of the smartphone, etc.), it's developers who are leading the new AI revolution. And it's developers who will use different kinds of data to push the boundaries of what's possible. Take, for instance, audio data. Imagine a diagnostic application that records real-time sounds and turns those sounds into vectors. An AI model then checks those sounds against a database of known issues, pinpointing the specific sound that signals a potential problem that can now be fixed. Until recently, this kind of innovation wasn't possible.

A video showing an AI-powered advanced diagnostics use case.

This is also just the tip of the iceberg when it comes to the types of new applications that developers will build in this new era of AI. Especially when given a platform that not only makes working with operational and vector data easier, but provides an experience that developers actually love. To learn more about how developers are shaping the AI revolution, and how we at MongoDB not only celebrate them, but support them, visit www.mongodb.com/LoveYourDevelopers. There you can explore other AI use cases, see data requirements for building these more intelligent applications, discover developers who are innovating in this space, and get started with MongoDB Atlas for free.
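The real estate example above comes down to combining a semantic (vector) match with ordinary operational filters. Here is a hedged sketch of what such a query could look like with MongoDB Atlas Vector Search and pymongo; the collection, index, and field names are hypothetical, the query vector is a placeholder where an image embedding would go, and the filter fields would need to be indexed as filter fields in the vector index.

```python
# A minimal sketch: one $vectorSearch stage that mixes a vector match with filters.
from pymongo import MongoClient

listings = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")["realestate"]["listings"]

# Placeholder: in practice this would be the photo's embedding from an image model,
# with the same dimensionality as the vectors stored in "image_embedding".
photo_embedding = [0.0] * 512

pipeline = [
    {"$vectorSearch": {
        "index": "listing_image_vectors",   # hypothetical Atlas Vector Search index
        "path": "image_embedding",          # field holding each listing's image vector
        "queryVector": photo_embedding,
        "numCandidates": 200,
        "limit": 10,
        "filter": {                         # operational constraints alongside the vector match
            "city": "Atlanta",
            "price": {"$lte": 650000},
            "near_park": True,
        },
    }},
    {"$project": {"address": 1, "price": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for home in listings.aggregate(pipeline):
    print(home["address"], home["price"], round(home["score"], 3))
```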

July 1, 2024

Building Gen AI-Powered Predictive Maintenance with MongoDB

In today's fast-evolving industrial landscape, digital transformation has become a necessity. From manufacturing plants to connected vehicles, the push towards predictive maintenance excellence is driving organizations to embrace smarter, more efficient ways of managing operations. One of the most compelling advancements in this domain is predictive maintenance powered by generative AI, a cutting-edge approach that will revolutionize how industries maintain and optimize their equipment. For manufacturers seeking maintenance excellence, a unified data store and a developer data platform are key enablers. These tools provide the foundation for integrating AI applications that can analyze sensor data, predict failures, and optimize maintenance schedules. MongoDB Atlas is the only multi-cloud developer data platform available that is designed to streamline and speed up developers' data handling. With MongoDB Atlas, developers can enhance end-to-end value chain optimization through AI/ML, advanced analytics, and real-time data processing, supporting cutting-edge mobile, edge, and IoT applications. In this post, we'll explore the basics of predictive maintenance and how MongoDB can be used for maintenance excellence.

Understanding the need for predictive maintenance

Predictive maintenance is about anticipating and addressing equipment failures before they occur, ensuring minimal disruption to operations. Traditional maintenance strategies, like time-based or usage-based maintenance, are less effective than predictive maintenance because they don't account for the varying conditions and complexities of machinery. Unanticipated equipment breakdown can result in line stoppage and substantial throughput losses, potentially leading to millions of dollars in revenue loss. Since the pandemic, many organizations have begun significant digital transformations to improve efficiency and resilience. However, a concerning gap exists between tech adoption and return on investment. While 89% of organizations have begun digital and AI transformations, only 31% have seen the expected revenue lift, and only 25% have realized the expected cost savings. These numbers highlight the importance of implementing new technologies strategically. Manufacturers need to carefully consider how AI can address their specific challenges and then integrate it into existing processes effectively.

Predictive maintenance boosts efficiency and saves money

Predictive maintenance uses data analysis to identify problems in machines before they fail. This allows organizations to schedule maintenance at the optimal time, maximizing machine reliability and efficiency. Indeed, according to Deloitte, predictive maintenance can lead to a variety of benefits, including:

3-5% reduction in new equipment costs
5-20% increase in labor productivity
15-20% reduction in facility downtime
10-30% reduction in inventory levels
5-20% reduction in carrying costs

Since the concept was introduced, predictive maintenance has constantly evolved. We've moved beyond basic threshold-based monitoring to advanced techniques like machine learning (ML) models. These models can not only predict failures but also diagnose the root cause, allowing for targeted repairs. The latest trend in predictive maintenance is automated strategy creation. This involves using AI to not only predict equipment breakdowns but also to generate repair plans, ensuring the right fixes are made at the right time.
Generative AI in predictive maintenance

To better understand how gen AI can be used to build robust predictive maintenance solutions, let's dig into the characteristics of organizations that have successfully implemented AI. They exhibit common traits across five key areas:

Identifying high-impact value drivers and AI use cases: Efforts should be concentrated on domains where artificial intelligence yields maximal utility rather than employing it arbitrarily.
Aligning AI strategy with data strategy: Organizations must establish a strong data foundation with a data strategy that directly supports their AI goals.
Continuous data enrichment and accessibility: High-quality data, readily available and usable across the organization, is essential for the success of AI initiatives.
Empowering talent and fostering development: By equipping their workforce with training and resources, organizations can empower them to leverage AI effectively.
Enabling scalable AI adoption: Building a strong and scalable infrastructure is key to unlocking the full potential of AI by enabling its smooth and ongoing integration across the organization.

Implementing predictive maintenance using MongoDB Atlas

When combined with a robust data management platform like MongoDB Atlas, gen AI can predict failures with remarkable accuracy and suggest optimal maintenance schedules. MongoDB Atlas is the only multi-cloud developer data platform designed to accelerate and simplify how developers work with data. Developers can power end-to-end value chain optimization with AI/ML, advanced analytics, and real-time data processing for innovative mobile, edge, and IoT applications. MongoDB Atlas offers a suite of features perfectly suited for building a predictive maintenance system, as shown in Figure 1 below. Its ability to handle both structured and unstructured data allows for comprehensive condition monitoring and anomaly detection. Here's how you can build generative AI-powered predictive maintenance software using MongoDB Atlas:

Machine prioritization: This stage prioritizes machines for the maintenance excellence program using a retrieval-augmented generation (RAG) system that takes in structured and unstructured data related to maintenance costs and past failures. Generative AI revolutionizes this process by reducing manual analysis time and minimizing investment risks. At the end of this stage, the organization knows exactly which equipment or assets are well-suited for sensorization. Utilizing MongoDB Atlas, which stores both structured and unstructured data, allows for semantic searches that provide accurate context to AI models. This results in precise machine prioritization and criticality analysis.
Failure prediction: MongoDB Atlas provides the necessary tools to implement failure prediction, offering a unified view of operational data, real-time processing, integrated monitoring, and seamless machine learning integration. Sensors on machines, like milling machines, collect data (e.g., air temperature and torque) and process it through Atlas Stream Processing, allowing continuous, real-time data handling. This data is then analyzed by trained models in MongoDB, with results visualized using Atlas Charts and alerts pushed via Atlas Device Sync to mobile devices, establishing an end-to-end failure prediction system.
Repair plan generation: To implement a comprehensive repair strategy, generating a detailed maintenance work order is crucial.
This involves integrating structured data, such as repair instructions and spare parts, with unstructured data from machine manuals. MongoDB Atlas serves as the operational data layer, seamlessly combining these data types. By leveraging Atlas Vector Search and aggregation pipelines, the system extracts and vectorizes information from manuals and past work orders. This data feeds into a large language model (LLM), which generates the work order template, including inventory and resource details, resulting in an accurate and efficient repair plan.
Maintenance guidance generation: Generative AI is used to integrate service notes and additional information with the repair plan, providing enhanced guidance for technicians. For example, if service notes in another language are found in the maintenance management system, we extract and translate the text to suit our application. This information is then combined with the repair plan using a large language model. The updated plan is pushed to the technician's mobile app via Atlas Device Sync. The system generates step-by-step instructions by analyzing work orders and machine manuals, ensuring comprehensive guidance without manually sifting through extensive documents.

Figure 1: Achieving end-to-end predictive maintenance with MongoDB Atlas Developer Data Platform

In the quest for operational excellence, predictive maintenance powered by generative AI and MongoDB Atlas stands out as a game-changer. This innovative approach not only enhances the reliability and efficiency of industrial operations but also sets the stage for a future where AI-driven insights and actions become the norm. By leveraging the advanced capabilities of MongoDB Atlas, manufacturers can unlock new levels of performance and productivity, heralding a new era of smart manufacturing and connected systems. If you would like to learn more about generative AI-powered predictive maintenance, visit the following resources:

[Video] How to Build a Generative AI-Powered Predictive Maintenance Software
[Whitepaper] Generative AI in Predictive Maintenance Applications
[Whitepaper] Critical AI Use Cases in Manufacturing and Motion: Realizing AI-powered innovation with MongoDB Atlas
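As a complement to those resources, here is a hedged sketch of the retrieval half of the repair plan generation step described above: an Atlas Vector Search query that pulls the most relevant manual chunks and past work orders before they are handed to an LLM. The collection, index, and field names are hypothetical, and the query vector is a placeholder where the output of your embedding model would go.

```python
# A minimal sketch of retrieving repair context with Atlas Vector Search (pymongo).
from pymongo import MongoClient

docs = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")["maintenance"]["manual_chunks"]

failure = "Spindle bearing overheating on milling machine M-204, torque spikes above 60 Nm"

# Placeholder: in practice, embed the failure description with the same model used to
# vectorize the manual chunks, so dimensions and semantics match the stored vectors.
query_vector = [0.0] * 1536

context = list(docs.aggregate([
    {"$vectorSearch": {
        "index": "manual_vector_index",          # hypothetical vector index name
        "path": "embedding",                     # field holding each chunk's vector
        "queryVector": query_vector,
        "numCandidates": 150,
        "limit": 5,
        "filter": {"machine_type": "milling"},   # keep results relevant to this asset class
    }},
    {"$project": {"text": 1, "source": 1, "score": {"$meta": "vectorSearchScore"}}},
]))

# The retrieved chunks, plus structured data such as spare-part inventory, would then be
# assembled into the LLM prompt that drafts the work order template described above.
for chunk in context:
    print(chunk["source"], round(chunk["score"], 3))
```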

June 27, 2024

Unlock PDF Search in Insurance with MongoDB & SuperDuperDB

As industries go, the insurance industry is particularly document-driven. Insurance professionals, including claim adjusters and underwriters, spend considerable time handling documentation, with a significant portion of their workday consumed by paperwork and administrative tasks. This makes solutions that speed up the process of reviewing documents all the more important. Retrieval-augmented generation (RAG) applications are a game-changer for insurance companies, enabling them to harness the power of unstructured data while promoting accessibility and flexibility. This is especially true for PDFs, which despite their prevalence are difficult to search, leading claim adjusters and underwriters to spend hours reviewing contracts, claims, and guidelines in this common format. By combining MongoDB and SuperDuperDB, you can build a RAG-powered system for PDF search, bringing efficiency and accuracy to this cumbersome task. With a PDF search application, users can simply type a question in natural language and the app will sift through company data, provide an answer, summarize the content of the documents, and indicate the source of the information, including the page and paragraph where it was found. In this blog, we will dive into the architecture of how this PDF search application can be created and what it looks like in practice.

Why should insurance companies care about PDF search?

Insurance firms rely heavily on data processing. To make investment decisions or handle claims, they leverage vast amounts of data, mostly unstructured. As previously mentioned, underwriters and claim adjusters need to comb through numerous pages of guidelines, contracts, and reports, typically in PDF format. Manually finding and reviewing every piece of information is time-consuming and can easily lead to expensive mistakes, such as incorrect risk estimations. Quickly finding and accessing relevant content is key. Combining Atlas Vector Search and LLMs to build RAG apps can directly impact the bottom line of an insurance company.

Behind the scenes: System architecture and flow

As mentioned, MongoDB and SuperDuperDB underpin our information retrieval system. Let's break down the process of building it:

The user adds the PDFs that need to be searched.
A script scans them, creates the chunks, and vectorizes them (see Figure 1). The chunking step is carried out using a sliding window methodology, which ensures that potentially important transitional data between chunks is not lost, helping to preserve continuity of context.
Vectors and chunk metadata are stored in MongoDB, and an Atlas Vector Search index is created. The PDFs are now ready to be queried.
The user selects a customer, asks a question, and the system returns an answer, indicates where it was found, and highlights the section with a red frame (see Figure 3).

Figure 1: PDF chunking, embedding creation, and storage orchestrated with SuperDuperDB

Each customer has a guidelines PDF associated with their account based on their residency. When the user selects a customer and asks a question, the system runs a Vector Search query on that particular document, seamlessly filtering out the non-relevant ones. This is made possible by the pre-filtering field included in the search query. Atlas Vector Search also takes advantage of MongoDB's new Search Nodes dedicated architecture, enabling better optimization for the right level of resourcing for specific workload needs.
Search Nodes provide dedicated infrastructure for Atlas Search and Vector Search workloads, allowing you to optimize your compute resources and fully scale your search needs independently of the database. Search Nodes provide better performance at scale, delivering workload isolation, higher availability, and the ability to optimize resource usage.

Figure 2: PDF querying flow, orchestrated with SuperDuperDB

SuperDuperDB

SuperDuperDB is an open-source Python framework for integrating AI models and workflows directly with and across major databases, enabling more flexible and scalable custom enterprise AI solutions. It enables developers to build, deploy, and manage AI on their existing data infrastructure and data, while using their preferred tools, eliminating data migration and duplication. With SuperDuperDB, developers can:

Bring AI to their databases, eliminating data pipelines and data movement, and minimizing engineering effort, time to production, and computation resources.
Implement AI workflows with any open- and closed-source AI models and APIs, on any type of data, with any AI and Python framework, package, class, or function.
Safeguard their data by switching from APIs to hosting and fine-tuning their own models, on their own existing infrastructure, whether on-premises or in the cloud.
Easily switch between embedding models and LLMs, from other API providers to models they host themselves, on Hugging Face or elsewhere, just by changing a small configuration.

Build next-generation AI apps on your existing database

SuperDuperDB provides an array of sample use cases and notebooks that developers can use to get started, including vector search with MongoDB, embedding generation, multimodal search, retrieval-augmented generation (RAG), transfer learning, and many more. The demo showcased in this post is adapted from an app previously developed by SuperDuperDB.

Let's put it into practice

To show you how this could work in practice, let's look at an underwriter handling a specific case. The underwriter is seeking to identify the risk control measures shown in Figure 3 below but needs to look through documentation. Analyzing the guidelines PDF associated with a specific customer helps determine the loss in the event of an accident or the new premium in the case of a policy renewal. The app assists by answering questions and displaying relevant sections of the document.

Figure 3: Screenshot of the UI of the application, showing the question asked, the LLM's answer, and the reference document where the information is found

By integrating MongoDB and SuperDuperDB, you can create a RAG-powered system for efficient and accurate PDF search. This application allows users to type questions in natural language, enabling the app to search through company data, provide answers, summarize document content, and pinpoint the exact source of the information, including the specific page and paragraph. If you would like to learn more about Vector Search-powered apps and SuperDuperDB, visit the following resources:

PDF Search in Insurance GitHub repository
Search PDFs at Scale with MongoDB and Nomic
SuperDuperDB GitHub, including notebooks and examples
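As a small illustration of the sliding-window chunking step described earlier, here is a minimal sketch. The window and overlap sizes are arbitrary, and in the real pipeline each chunk would be embedded and stored in MongoDB together with metadata such as customer, page, and paragraph to enable the pre-filtered vector search.

```python
# A minimal sketch of sliding-window chunking: fixed-size windows with overlap so that
# context spanning a chunk boundary is not lost.
def sliding_window_chunks(text: str, window: int = 800, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks of roughly `window` characters."""
    if overlap >= window:
        raise ValueError("overlap must be smaller than the window size")
    chunks, start, step = [], 0, window - overlap
    while start < len(text):
        chunks.append(text[start:start + window])
        start += step
    return chunks

page_text = "..."  # text extracted from one PDF page (placeholder)
for i, chunk in enumerate(sliding_window_chunks(page_text)):
    # In the full system, each chunk is vectorized and stored with its page/paragraph metadata.
    print(i, chunk[:60])
```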

June 24, 2024

Atlas Vector Search Once Again Voted Most Loved Vector Database

The 2024 Retool State of AI report has just been released, and for the second year in a row, MongoDB Atlas Vector Search was named the most loved vector database. Atlas Vector Search received the highest net promoter score (NPS), a measure of how likely a user is to recommend a solution to their peers. This post is also available in: Deutsch, Français, Español, Português, Italiano, 한국어, 简体中文. The Retool State of AI report is a global annual survey of developers, tech leaders, and IT decision-makers that provides insights into the current and future state of AI, including vector databases, retrieval-augmented generation (RAG), AI adoption, and challenges innovating with AI. MongoDB Atlas Vector Search commanded the highest NPS in Retool's inaugural 2023 report, and it was the second most widely used vector database within just five months of its release. This year, Atlas Vector Search came in a virtual tie for the most popular vector database, with 21.1% of the vote, just a hair behind pgvector (PostgreSQL), which received 21.3%. The survey also points to the increasing adoption of RAG as the preferred approach for generating more accurate answers with up-to-date and relevant context that large language models (LLMs) aren't trained on. Although LLMs are trained on huge corpuses of data, not all of that data is up to date, nor does it reflect proprietary data. And in those areas where blind spots exist, LLMs are notorious for confidently providing inaccurate "hallucinations." Fine-tuning is one way to customize the data that LLMs are trained on, and 29.3% of Retool survey respondents leverage this approach. But among enterprises with more than 5,000 employees, one-third now leverage RAG for accessing time-sensitive data (such as stock market prices) and internal business intelligence, like customer and transaction histories. This is where MongoDB Atlas Vector Search truly shines. Customers can easily utilize their stored data in MongoDB to augment and dramatically improve the performance of their generative AI applications, during both the training and evaluation phases. In the course of one year, vector database utilization among Retool survey respondents rose dramatically, from 20% in 2023 to an eye-popping 63.6% in 2024. Respondents reported that their primary evaluation criteria for choosing a vector database were performance benchmarks (40%), community feedback (39.3%), and proof-of-concept experiments (38%). One of the pain points the report clearly highlights is difficulty with the AI tech stack. More than 50% indicated they were either somewhat satisfied, not very satisfied, or not at all satisfied with their AI stack. Respondents also reported difficulty getting internal buy-in, which is often complicated by procurement efforts when a new solution needs to be onboarded. One way to reduce much of this friction is through an integrated suite of solutions that streamlines the tech stack and eliminates the need to onboard multiple unknown vendors. Vector search is a native feature of MongoDB's developer data platform, Atlas, so there's no need to bolt on a standalone solution. If you're already using MongoDB Atlas, creating AI-powered experiences involves little more than adding vector data into your existing data collections in Atlas.
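Here is a hedged sketch of what "adding vector data into your existing collections" can look like in practice: backfill an embedding field on existing documents and define an Atlas Vector Search index. It assumes a recent PyMongo release that supports create_search_index with the vectorSearch index type; the collection name, embedding helper, and dimension count are placeholders.

```python
# A minimal sketch: store embeddings alongside existing documents, then index them.
from pymongo import MongoClient
from pymongo.operations import SearchIndexModel

articles = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")["content"]["articles"]

def embed(text: str) -> list[float]:
    # Placeholder: call your embedding model of choice here. The vector length must
    # match numDimensions in the index definition below.
    return [0.0] * 1536

# Backfill embeddings for documents that don't have one yet.
for doc in articles.find({"embedding": {"$exists": False}}):
    articles.update_one({"_id": doc["_id"]}, {"$set": {"embedding": embed(doc["body"])}})

# Define the vector index once (this can also be done in the Atlas UI).
articles.create_search_index(SearchIndexModel(
    name="article_vectors",
    type="vectorSearch",
    definition={"fields": [
        {"type": "vector", "path": "embedding", "numDimensions": 1536, "similarity": "cosine"}
    ]},
))
```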
If you're a developer and want to use Atlas Vector Search to start building generative AI-powered apps, we have several helpful resources:

Learn how to build an AI research assistant agent that uses MongoDB as the memory provider, Fireworks AI for function calling, and LangChain for integrating and managing conversational components.
Get an introduction to LangChain and MongoDB Vector Search and learn to create your own chatbot that can read lengthy documents and provide insightful answers to complex queries.
Watch Sachin Smotra of Dataworkz as he delves into the intricacies of scaling RAG (retrieval-augmented generation) applications.
Read our tutorial that shows you how to combine Google Gemini's advanced natural language processing with MongoDB, facilitated by Vertex AI Extensions, to enhance the accessibility and usability of your database.
Browse our Resources Hub for articles, analyst reports, case studies, white papers, and more.

Want to find out more about recent AI trends and adoption? Read the full 2024 Retool State of AI report.

June 21, 2024
