

The Journey of MongoDB with COVESA in the Connected Vehicle Landscape

There's a popular saying: "If you want to go fast, go alone; if you want to go far, go together." I would argue that The Connected Vehicle Systems Alliance (COVESA), in partnership with its extensive member network, turns this saying on its head: it has found a way to go fast, together, and to go far, together. COVESA is an industry alliance focused on enabling the widespread adoption of connected vehicle systems. The group aims to accelerate the development of these technologies through collaboration and standardization. It's made up of various stakeholders in the automotive and technology sectors, including car manufacturers, suppliers, and tech companies. COVESA's collaborative approach allows members to accelerate progress: shared solutions eliminate the need for individual members to reinvent the wheel, freeing up their resources to tackle new challenges as the community collectively builds, tests, and refines foundational components. As vehicles become more connected, the data they generate explodes in volume, variety, and velocity. Cars are no longer just a mode of transportation, but a platform for advanced technology and data-driven services. This is where MongoDB steps in.

MongoDB and COVESA

As the database trusted for mission-critical systems by enterprises such as Cathay Pacific, Volvo Connect, and Cox Automotive, MongoDB has gained expertise in automotive along with many other industries, building cross-industry knowledge in handling large-scale, diverse data sets. This in turn enables us to contribute significantly to vehicle applications and to bring a unique perspective, especially to the data architecture discussions within COVESA. MongoDB solutions support these kinds of innovations, enabling automotive companies to leverage data for advanced features. One of the main features we provide is Atlas Device SDKs: a low-footprint, embedded database living directly on ECUs. It can synchronize data automatically with the cloud using Atlas Device Sync, our data transfer protocol that compresses the data, handles conflict resolution, and syncs only delta changes, making it extremely efficient in terms of operations and maintenance.

VSS: The backbone of connected vehicle data

An important area of COVESA's work is the Vehicle Signal Specification (VSS). VSS is a standardized framework used to describe the data of a vehicle, such as speed, location, and diagnostic information. This standardization is essential for interoperability between different systems and components within a vehicle, as well as for external communication with other vehicles and infrastructure. VSS has been gaining more and more adoption, and it's backed by ongoing contributions from BMW, Volvo Cars, Jaguar Land Rover, Robert Bosch, and Geotab, among others. MongoDB's BSON and our object-oriented Atlas Device SDKs uniquely position us to contribute to VSS implementation. The VSS data structure maps one-to-one to documents in MongoDB and to objects in Atlas Device SDKs, which simplifies development and speeds up applications by skipping any object-relational mapping layer entirely: for every read or write, there is no need to transform the data between a relational shape and VSS.
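To make that one-to-one mapping concrete, here is a minimal, hypothetical sketch of a VSS-style signal branch stored directly as a MongoDB document with PyMongo. The connection string, namespace, and signal fields are illustrative assumptions for the example, not part of the VSS standard or any COVESA reference implementation.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

# Hypothetical Atlas connection string and namespace
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
signals = client["vehicle_data"]["vss_signals"]

# A VSS-style branch stored as-is: nested signals map directly to a nested document,
# so no object-relational mapping layer is needed on read or write.
signals.insert_one({
    "vehicleId": "WBA-DEMO-0001",  # illustrative identifier
    "timestamp": datetime.now(timezone.utc),
    "Vehicle": {
        "Speed": 87.5,
        "CurrentLocation": {"Latitude": 48.137, "Longitude": 11.575},
        "Powertrain": {"TractionBattery": {"StateOfCharge": {"Current": 71.0}}},
    },
})

# Reading the document back returns the same nested shape the application works with.
print(signals.find_one({"vehicleId": "WBA-DEMO-0001"}, {"_id": 0}))
```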
Our insights into data structuring, querying, and management can help optimize the way data is stored and accessed in connected vehicles, making it more efficient and robust.

Where MongoDB contributes

Within COVESA, MongoDB finds its most meaningful contributions where data complexity and community collaboration intersect. First, we can share insights into managing the vast and varied data emerging from connected vehicles, which generate data on everything from engine performance to driver behavior. Second, we have an important role in supporting the standardization efforts that are crucial for ensuring different systems within vehicles can communicate seamlessly. Our input can help ensure these standards are robust and practical, considering the real-world scenarios of data usage in vehicles. Some of our contributions include an over-the-air update architectural review presented at COVESA's AMM in Troy in October 2023, insights shared with BMW about the Data Middleware PoC, and weekly contributions to the Data Expert Group. You can find some of our contributions on COVESA's Wiki page. In essence, MongoDB's role in COVESA is to provide a unique perspective from the database management point of view, offering our understanding from different industries and use cases to support the development of more connected and intelligent vehicles.

MongoDB, COVESA, and AWS together at CES 2024

MongoDB's most recent collaboration with COVESA was at the Consumer Electronics Show (CES) 2024, where MongoDB's connected vehicle solution was showcased. This solution leverages Atlas Device SDKs, such as the SDK for C++, which enables local data storage, in-vehicle data synchronization, and both uni- and bi-directional data transfer with the cloud. Below is a schematic illustrating the integration of MongoDB within the software-defined vehicle:

Schema 1: End-to-end integration for the connected vehicle

At CES 2024, MongoDB also teamed up with AWS for a compelling presentation, "AI-powered Connected Vehicles with MongoDB and AWS," led by Dr. Humza Akhtar and Mohan Yellapantula, Head of Automotive Solutions & Go To Market at AWS. The session delved into the intricacies of building connected vehicle user experiences using MongoDB Atlas. It showcased the combined strengths of MongoDB's expertise and AWS's generative AI tools, emphasizing how Atlas Vector Search unlocks the full lifecycle value of connected vehicle data. During the event, MongoDB also engaged in a conversation with The Six Five, exploring various aspects of mobility, software-defined vehicles (SDVs), and the MongoDB and AWS partnership. The discussion extended to merging IT and OT, generative AI, Atlas Edge Server, and Atlas Device SDK.

Going forward

At the end of the road, it's all about enhancing the end-user experience and providing unique value propositions: defect diagnosis based on the acoustics of the engine, improved crash assistance with mobile and vehicle telemetry data, just-in-time food ordering while on the road, in-vehicle payments, and much, much more. What all these experiences have in common is the combination of interconnected data from different systems. At MongoDB, we are laser-focused on empowering OEMs to create, transform, and disrupt the automotive industry by unleashing the power of software and data. We enable this by:

Partnering with alliances such as COVESA to build a strong ecosystem of collaboration.
Providing one single API for in-vehicle data storage, edge-to-cloud synchronization, time series storage, and more, improving the developer experience.

Focusing on a robust, scalable, and secure suite of services trusted by tens of thousands of customers in more than 100 countries.

Together with COVESA's vision for connected vehicles, we're driving a future where this industry is safer, more efficient, and seamlessly integrated into the digital world. The journey is just beginning. To learn more about MongoDB-connected mobility solutions, visit the MongoDB for Manufacturing & Motion webpage. Achieving fast, reliable, and compressed data exchange is one of the pillars of software-defined vehicles; learn how MongoDB Atlas and Edge Server can help in this short demo.

April 15, 2024

What’s your Developer Persona?

Cool stuff is getting built with MongoDB every day. Behind the scenes are developers, data scientists, and UX designers who use our products in truly innovative ways. We believe developers are doing way more than just writing code, and we've outlined a few personas we commonly see. Pick your archetype for custom backgrounds and other shareable content!

The architect

Ideal team member: The Storyteller. Together you'd be unstoppable.

Description: People are usually intimidated by you at first. You'll start all quiet and casually drop how you created the very board game everyone is playing at the dinner party you're at. You have a knack for strategy and probably have a pretty good poker face. Under your calm and collected demeanor is a perpetual motion machine. Scientists said it couldn't be done, but it's happening inside your brain right now. It's so effortless you probably don't even notice it. You're deeply intuitive, and you have a unique combination of both big ideas and informed action.

The scientist

Ideal team member: The Architect

Description: You love a healthy debate. Not in the devil's advocate way; you genuinely like hearing both sides (though you're likely always right). When starting a project, you're already thinking about things to research so you can hit the ground running. You're always up to date with the newest technology and constantly thinking about how you can implement it in your workflow. You're a lifelong learner and you're super sharp, which makes you an asset to any team.

The storyteller

Ideal team member: The Scientist

Description: You're deeply interested in the human experience. When people enter your house, they are wowed by how beautiful and well thought out everything is. You also probably greet them with a seasonally spiced espresso. You have very niche-specific hobbies, like crafting modern candlesticks for your friends. People come to you when they want someone to use their product or idea: you know how to craft an experience everyone will keep coming back to! Beauty is in the eye of the beholder, but you're the beholder and you always know what's best.

The visionary

Ideal team member: The Sprinter; they can ground you.

Description: Leaders are constantly asking you for opinions, if you aren't one yet. You're interested in the big stuff that will change how people think and interact with the world around them. Your dinner party conversation is strong, and you pride yourself on being able to predict the next big thing. Some people may write you off as arrogant until that thing you swore would happen finally did, and they text you saying, "Did you see the news?!?! You called it!" You're uber confident, and for good reason. You know exactly how to, as they say, strike while the iron is hot!

The sprinter

Ideal team member: The Grasshopper

Description: Your desk is full of half-drunk Red Bulls. Or maybe not, but that's what everyone assumes. You outpace all your colleagues because you simply love getting things done. We imagine you're into low-effort activities in your spare time, like bouldering or Ironman triathlons. You are often the one people call when they want something done right but are too lazy to do it themselves. You love a good checklist, and you're probably the one who does all the work on that new startup venture you and your college friends have. You are deeply passionate about your work and think this meeting could have been an email.

The grasshopper

Ideal team member: The Architect. You'll learn a lot!

Description: No is not in your vocabulary. You're somewhat of a Renaissance person; people are usually in awe of how you can already be so natural at something you just started. You have a unique blend of independent and collaborative energy that touches everything you set your mind to. You get things done in style. Your unique brand of optimism makes everything you do look effortless. Some people may write you off as a newbie. Don't worry though; they're probably just grumpy. The world is truly your oyster, the sky's the limit, and whatever other saying is out there, you've probably thought of 10 already.

Meet other developers at .local NYC!

We are coming to NYC! Check out our event page to sign up for the conference, including a schedule of events, speaker profiles, and attendee resources, like how to justify your trip and travel details. Register today and meet your ideal collaborator!

April 15, 2024

Transforming Industries with MongoDB and AI: Insurance

This is the fifth in a six-part series focusing on critical AI use cases across several industries. The series covers the manufacturing and motion, financial services, retail, telecommunications and media, insurance, and healthcare industries.

With its ability to streamline processes, enhance decision-making, and improve customer experiences with far less time, fewer resources, and less staff than traditional IT systems, artificial intelligence offers insurers great promise. In an inherently information-driven industry, insurance companies ingest, analyze, and process massive amounts of data. Whether it's agents and brokers selling more policies, underwriters adequately pricing, renewing, and steering product portfolios, claim handlers adjudicating claims, or service representatives providing assurance and support, data is at the heart of it all. Given the volumes of data, and the amount of decision-making that needs to occur based on it, insurance companies have a myriad of technologies and IT support staff within their technology investment portfolios. It's no surprise that AI is at the top of the list when it comes to current or prospective IT investments.

Underwriting & risk management

Few roles within insurance are as important as that of the underwriter, who strikes the right balance between profit and risk, brings real-world variables to the actuarial models at the heart of the insurer, and helps steer product portfolios, markets, pricing, and coverages. Achieving equilibrium between exposures and premiums means constantly gathering and analyzing information from a myriad of sources to build a risk profile sufficient and detailed enough to make effective policy decisions. While many well-established insurers have access to a wealth of their own underwriting and claims experience data, integrating newer and real-time sources of information, keeping up with regulatory changes, and modeling out what-if risk scenarios still involve significant manual effort. Perhaps the single greatest advantage of AI will be its ability to quickly analyze more information with fewer people and resources. The long-term impact will likely be profound, and there is tremendous promise within underwriting.

Advanced analytics: Traditional IT systems are slow to respond to changing formats and requirements surrounding data retrieval. The burden falls on the underwriter to summarize data and turn it into information and insight. Large language models are now being leveraged to help speed up the process of wrangling data sources and summarizing the results, helping underwriting teams make quicker decisions from that data.

Workload and triage assistance: AI models are mitigating the seasonal demands, market shifts, and even staff availability that impact the workload and productivity of underwriting teams, saving underwriting time for high-value accounts and customers where their expertise is truly needed. Amid high volumes of new and renewal underwriting, traditional AI models can help classify and triage risk, sending very low-risk policies to 'touchless' automated workflows, low- to moderate-risk policies to trained service center staff, and high-risk and high-value accounts to dedicated underwriters.
Decision-making support: Determining whether a quoted rate needs adjustment before binding and issuing can take significant time and manual effort. So can preparing and issuing renewals of existing policies, another large portion of the underwriter's day-to-day responsibilities. Automated underwriting workflows leveraging AI are being employed to analyze and classify risk with far less manual effort, freeing up significant time and intellectual capital for the underwriter. Check out our machine learning solutions page to learn more about automated digital underwriting.

Vast amounts of the data analyzed by underwriters are kept on the underwriter's desktop rather than in IT-managed databases. MongoDB offers an unparalleled ability to store data from a vast number of sources and formats and to respond quickly to requests to ingest new data. As data and requirements change, the document model allows insurers to simply add more data and fields without the costly change cycle associated with databases that rely on single, fixed structures (a minimal sketch of this appears at the end of this article). For every major business entity found within the underwriting process, such as a broker, policy, account, or claim, there is a wealth of unstructured data waiting to be leveraged by generative AI. MongoDB offers insurers a platform that consolidates complex data from legacy systems, builds new applications, and extends those same data assets to AI-augmented workflows. By eliminating the need for niche databases for these AI-specific workloads, MongoDB reduces technology evaluation and onboarding time, development time, and developer friction.

Claim processing

Efficient claim processing is critical for an insurer. Timely resolution of a claim and good communication and information transparency throughout the process are key to maintaining positive relationships and customer satisfaction. In addition, insurers are on the hook to pay and process claims according to jurisdictional regulations and requirements, which may include penalties for failing to comply with specific timelines and stipulations. To process a claim accurately, a wealth of information is needed. A typical automobile accident may include not only verbal and written descriptions from claimants and damage appraisers but also unstructured content from police reports, traffic and vehicle dashboard cameras, photos, and even vehicle telemetry data. Aligning the right technology and the right amount of your workforce in either single- or multi-claimant scenarios is crucial to meeting the high demands of claim processing.

Taming the flood of data: AI is helping insurers accelerate the process of making sense of a trove of data, and allowing them to do so in real time. From natural language processing to image classification and vector embedding, all the pieces of the puzzle are now on the board for insurers to make a generational leap forward when it comes to transforming their IT systems and business workflows for faster information processing.

Claims experience: Generating accurate impact assessments for catastrophic events in a timely fashion to inform the market of your exposure can now be done with far less time, and with far more accuracy, by cross-referencing real-time and historical claims experience data, thanks to the power of generative AI and vector embedding of unstructured data.
Claim expediter: Using vector embeddings from photo, text, and voice sources, insurers are now able to decorate inbound claims with richer and more insightful metadata so that they can more quickly classify, triage, and route work. In addition, real-time insight into workload, staff skills, and availability is allowing insurers to be even more prescriptive when it comes to work assignments, driving toward higher output and higher customer satisfaction.

Litigation assistance: Claims details are not always black and white, parties do not always act in good faith, and insurers expend significant resources in the pursuit of resolving matters. AI is helping insurers drive to resolution faster, and even avoid litigation and subrogation altogether, thanks to its ability to help analyze more data more effectively and more quickly.

Risk prevention: Many insurers provide risk-assessment services to customers using drones, sensors, or cameras to capture and analyze data. This data offers the promise of preventing losses altogether for customers and lowering exposures, liability, and expenses for the insurer. This is possible thanks to a combination of vector embedding, traditional AI, and generative AI models.

Learn more about AI-enhanced claim adjustment for automotive insurance on our solutions page.

Customer experience

Accessing information consistently during a customer service interaction, and expecting the representative to interpret it quickly, are perennial challenges for any customer service desk. Add in the volume, variety, and complexity of information within insurance, and it's easy to understand why many insurers are investing heavily in transforming their customer experience call center systems and processes.

24/7 virtual assistance: As with many AI-based chat agents, the advantage is that they free up your call center staff to work on more complex and high-touch cases. Handling routine inquiries can now include far more complex scenarios than before, thanks to the power of vector-embedded content and large language models.

Claims assistance: Generative AI can deliver specific claim-handling guidelines to claim-handling staff in real time, while traditional ML models can interrogate real-time streams of collected information to alert either the customer or the claim handler to issues with quality, content, or compliance. AI capabilities allow insurers to process more claims more quickly and significantly reduce errors and incomplete information.

Customer profiles: Every interaction is an opportunity to learn more about your customers. Technologies such as voice-to-text streaming, vector embedding, and generative AI help insurers build out a more robust 'social profile' of their customers in near real time.

Real-time fraud detection: According to estimates from the Coalition Against Insurance Fraud, the U.S. insurance industry lost over $308 billion to fraud in 2022. With vector embedding of unstructured data sources, semantic and similarity searches across both vector and structured metadata, and traditional machine learning models, insurers can detect and prevent fraud in ways that were simply not possible before.

Other notable use cases

Predictive analytics: AI-powered predictive analytics can anticipate customer needs, preferences, and behaviors based on historical data and trends. By leveraging predictive models, insurers can identify at-risk customers, anticipate churn, and proactively engage with customers to prevent issues and enhance satisfaction.
Crop insurance and precision farming: AI is being used in agricultural insurance to assess crop health, predict yields, and mitigate risks associated with weather events and crop diseases, which helps insurers offer more accurate and tailored crop insurance products to farmers.

Predictive maintenance for property insurance: AI-powered predictive maintenance solutions, leveraging IoT sensors installed in buildings and infrastructure, are used in property insurance to prevent losses and minimize damage to insured properties.

Usage-based insurance (UBI) for commercial fleets: AI-enabled telematics devices installed in commercial vehicles collect data on driving behavior, including speed, acceleration, braking, and location. Machine learning algorithms analyze this data to assess risk and determine insurance premiums for commercial fleets, helping promote safer driving practices, reduce accidents, and lower insurance costs for businesses.

Learn more about AI use cases for top industries in our new ebook, How Leading Industries are Transforming with AI and MongoDB Atlas. Read the full ebook here.
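As referenced in the underwriting section above, here is a minimal, hypothetical PyMongo sketch of the document-model point: policy documents with different shapes live in the same collection, and a new field can be added later without any schema migration. The connection string, collection, field names, and values are illustrative assumptions.

```python
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
policies = client["underwriting"]["policies"]

# Two documents with different shapes coexist in the same collection.
policies.insert_one({"policyId": "P-1001", "insured": "Acme Logistics", "premium": 12000})
policies.insert_one({"policyId": "P-1002", "insured": "J. Smith", "premium": 900,
                     "telematics": {"avgSpeed": 62, "hardBrakes": 3}})

# Adding a new data point later is a plain update, not a schema change.
policies.update_one({"policyId": "P-1001"},
                    {"$set": {"riskNotes": "Flood exposure reviewed 2024-03"}})
```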

April 11, 2024

Embracing Neurodiversity During Autism Awareness Month

April is Autism Awareness Month. According to a 2023 study by The Tavistock Institute of Human Relations, nearly half of neurodivergent employees feel impacted by their conditions in the workplace. On top of this, many neurodivergent employees choose not to disclose their conditions due to fear of stigma or reduced career opportunities. It's safe to say that organizations can do more to create supportive, inclusive, and empowering environments for neurodivergent employees. Config.MDB is an employee resource group aiming to raise awareness for and build support around neurodiversity in the workplace at MongoDB. Sarah Lin, Senior Information & Content Architect and Config member, speaks about her experience as a parent to a child with autism and shares her perspective on creating inclusive spaces for neurodivergent employees, customers, and members of our communities.

According to the CDC, one in every 36 children in the U.S. is on the autism spectrum. My child is one of those. Parenting a child with autism is, in many ways, just parenting. There's often awe, wonder, frustration, exhaustion, and fountains of love all mixed up in the same day. Parenting is hard, period. As a parent of one neurodiverse and one neurotypical child, I'm able to see a little bit of both worlds, and while there are highs and lows for both, they're usually different. People with autism often have a particular area of interest, and my child has opened up a world to me I was only vaguely familiar with. Learning, enjoying, and being part of their world is a gift. I'm so grateful to experience it with them every day. Parenting children with different abilities can be lonely and isolating, though. I get support where I can, but many folks we encounter are actively unsupportive, assuring me "they'll grow out of it" or expecting I can make them behave in a certain way. My child needs lower levels of support to navigate the world, which means that people often just dismiss their diagnosis entirely as misbehavior or even say that their challenges don't exist.

Understanding autism spectrum disorder

I learned pretty early in my journey that if you've met one person with autism, you've met one person with autism. Autism Spectrum Disorder is just that, a spectrum, with individuals needing more or less support, so interacting with one person on this spectrum isn't generalizable to everyone else. I've found that each person presents a unique constellation of abilities and challenges, and the best way forward is to ask and get to know each person I meet. Learning about autism and how my child experiences the world has made me a better person, which I hope has carried over into work as well. The most significant impact is raising awareness of sensory needs in the workplace. We all have sensory needs to varying degrees, and understanding that they are genuine makes me more empathetic, accommodating, and patient.

Considering the needs of the neurodivergent community

If my child grows up wanting to work in the tech industry like I do, I'd advise them to remember their needs and seek a role at a company that seems like the best fit. For example, does working from home help them avoid auditory overstimulation? Awareness of their legal rights is paramount, though; they should ask for the accommodations they need to succeed. I'm sure it's the parent in me, but the most essential advice is that they take care of themselves. Having time off, structuring their workday and environment to meet their needs, and having a healthy sensory diet are all foundational for doing their best work.

To workplaces and colleagues, I encourage you to consider that creating inclusive spaces can be both physical and metaphorical. I'm reminded of a previous employer that set out a specific neurodivergent space at its customer conferences for attendees. I can only speak for what has helped my family, but having alternative options and authority figures willing to compromise when an accommodation is outside the norm is impactful. As I teach my child to advocate for their own needs, I rely on the rest of us to meet them with understanding and flexibility.

The impact of increasing awareness

When talking about impact, there's no denying that millions of individuals and families benefit from increased awareness, understanding, and inclusivity in our society. Speaking from my own experience as a parent, even if we don't take advantage of specifically autism-friendly events or spaces, just knowing they are available takes some stress off of daily "how am I going to make this work?" questions. Taking advantage of sensory-friendly activities, for example, allows my child to have the same experiences as everyone else, making them feel included and accepted. For kids, shared experiences are essential for social belonging, and when you're already experiencing the world differently, that can be a very challenging area to begin with. As I reflect on Autism Awareness Month, I encourage my friends and colleagues to work towards learning about and supporting neurodiversity in their communities. Some of the best ways to do this are simply through asking and researching. Information abounds, and so does the work of learning rather than expecting to be taught. You can also participate in (or start) a community group or employee resource group, like Config at MongoDB, a global employee resource group focused on disability and neurodiversity at MongoDB. The more we strive to educate ourselves and create inclusive environments for everyone, the better our workplaces and communities will become.

Learn more about Diversity & Inclusion at MongoDB.

April 10, 2024

Enabling Commerce Innovation with the Power of MongoDB and Google Cloud

Across all industries, business leaders are grappling with economic uncertainty, cost concerns, disruption to supply chains, and pressure to embrace new technologies like generative AI. In this dynamic landscape, having a performant and future-proofed technology foundation is critical to your business's success. Kin + Carta, a Premier Google Cloud Partner and MongoDB Systems Integrator Partner, recently launched the Integrated Commerce Network, an Accelerator that enables clients to modernize to a composable commerce platform and create value with their commerce data on Google Cloud with a pre-integrated solution in as little as six weeks. This article explains the concept of composable commerce and explores how MongoDB and Google Cloud form a powerful combination that enables innovation in commerce. Finally, it explains how Kin + Carta can help you navigate the complexity facing businesses today with their approach to digital decoupling.

Unraveling the complexity: What is composable commerce?

Why microservices and APIs? The evolution of commerce architecture

Traditional monolithic architectures, once the cornerstone of commerce platforms, are facing challenges in meeting the demands of today's fast-paced digital environment. Microservices, a paradigm that breaks down applications into small, independent services, offer a solution to the limitations of monoliths. This architectural shift allows for improved agility, scalability, and maintainability.

Defining composable commerce

Composable commerce is a component-based, API-driven design approach that gives businesses the flexibility to build and run outstanding buying experiences free of the constraints found in legacy platforms. To be truly composable, the platform must support key tenets:

Support continuous delivery without downtime at the component level.

Have APIs as the contract of implementation between services, with open, industry-standard protocols providing the glue between components.

Be SaaS-based, or portable to run on any modern public cloud environment.

Allow the open egress and ingress of data, with no black boxes of vendor data ownership.

Defining APIs and microservices

APIs play a pivotal role in connecting microservices, enabling seamless communication and data exchange. This modular approach empowers businesses to adapt quickly to market changes, launch new features efficiently, and scale resources as needed.

Enhanced scalability, resilience, and agility

Taking a microservices approach provides businesses with options and now represents a mature and battle-tested approach, with commoditized architectures, infrastructure-as-code, and open-source design patterns that enable robust, resilient, and scalable commerce workloads at lower cost and risk. Additionally, the decoupled nature of microservices facilitates faster development cycles. Development teams can work on isolated components, allowing for parallel development and quicker releases. This agility is a game-changer in the competitive e-commerce landscape, where rapid innovation is essential for staying ahead. Microservices- and API-based commerce solutions (like commercetools, which is powered by MongoDB) have begun to dominate the market with their composable approach, and for good reason.
These solutions remove the dead end of legacy commerce suite software and enable a brand to pick and choose how to enhance its environment on its own terms and schedule.

MongoDB Atlas: The backbone of intelligent, generative AI-driven experiences

As e-commerce has developed, customers have come to expect more from their interactions: flat, unsophisticated experiences just don't cut it anymore, and brands need to deliver on the expectation of immediacy and contextual relevance. Taking a microservices approach enables richer and more granular data to be surfaced, analyzed, and fed back into the loop, perhaps leveraging generative AI to synthesize information that previously would have been difficult or impossible to produce without huge computing capabilities. However, to do this well you need core data infrastructure that underpins the platform and provides the performance, resilience, and advanced features required. MongoDB Atlas on Google Cloud can play a pivotal role in this enablement.

Flexible data models: Microservices often require diverse data models. MongoDB Atlas, a fully managed database service, accommodates these varying needs with its flexible schema design, which allows businesses to adapt their data structures without compromising performance.

Horizontal scalability: Modern commerce moves a lot of data. MongoDB Atlas excels at distributing data across multiple nodes, ensuring that the database can handle increased loads effortlessly.

Real-time data access: Delivering on expectations relies on real-time data access. MongoDB Atlas supports real-time, event-driven data updates, ensuring you are using the most up-to-date information about your customers.

Serverless deployment: Rather than spend time and money managing complex database infrastructure, teams can use MongoDB Atlas's serverless deployment, allowing developers to focus on building features that delight customers and impact the bottom line.

Unleashing generative AI with MongoDB and Google Cloud

Generative AI applications thrive on massive datasets and require robust data management. MongoDB effortlessly handles the complex and ever-evolving nature of gen AI data. This includes text, code, images, and more, allowing you to train your models on a richer data tapestry.

MongoDB Atlas: Streamlined gen AI development on Google Cloud

MongoDB Atlas, the cloud-based deployment option for MongoDB, integrates seamlessly with Google Cloud. Atlas offers scalability and manageability, letting you focus on building groundbreaking gen AI applications. Here's how this powerful duo functions together:

Data ingestion and storage: Effortlessly ingest your training data, regardless of format, into MongoDB Atlas on Google Cloud. This data can include text for natural language processing, code for programming tasks, or images for creative generation.

AI model training: Leverage Google Cloud's AI services, such as Vertex AI, to train your gen AI models using the data stored in MongoDB Atlas. Vertex AI provides pre-built algorithms and tools to streamline model development.

Operationalization and serving: Once trained, deploy your gen AI model seamlessly within your application. MongoDB Atlas ensures smooth data flow to and from your model, enabling real-time generation.

Vector search with MongoDB Atlas: MongoDB Atlas Vector Search allows for efficient retrieval of similar data points within your gen AI dataset. This is crucial for tasks like image generation or recommendation systems.
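To illustrate the vector search point above, here is a minimal, hypothetical PyMongo aggregation that retrieves the items most similar to a pre-computed query embedding with Atlas Vector Search. The connection string, index name, field names, and the way the query embedding is produced are illustrative assumptions rather than a prescribed setup.

```python
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
products = client["commerce"]["products"]

# Assume each product document already stores an "embedding" array produced by an
# embedding model (for example via Vertex AI), and that an Atlas Vector Search
# index named "product_embeddings" has been defined on that field.
query_embedding = [0.012, -0.083, 0.144]  # stand-in for a real, full-length vector

similar = products.aggregate([
    {"$vectorSearch": {
        "index": "product_embeddings",
        "path": "embedding",
        "queryVector": query_embedding,
        "numCandidates": 150,
        "limit": 10,
    }},
    {"$project": {"_id": 0, "name": 1, "score": {"$meta": "vectorSearchScore"}}},
])

for doc in similar:
    print(doc)
```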
Advantages of this open approach

By leveraging a microservices architecture, APIs, and the scalability and flexibility of Atlas, businesses can build agile and adaptable composable platforms. Atlas seamlessly integrates with Google Cloud, providing a streamlined environment for developing and deploying generative AI models. This integrated approach offers several benefits:

Simplified development: The combined power of MongoDB Atlas and Google Cloud streamlines the development process, allowing you to focus on core gen AI functionalities.

Scalability and flexibility: Both MongoDB Atlas and Google Cloud offer on-demand scalability, ensuring your infrastructure adapts to your gen AI application's growing needs.

Faster time to market: The ease of integration and development offered by this combination helps you get your gen AI applications to market quickly.

Cost-effectiveness: Both MongoDB Atlas and Google Cloud offer flexible pricing models, allowing you to optimize costs based on your specific gen AI project requirements.

Digital decoupling, a legacy modernization approach

With so much digital disruption, technology leaders are constantly being challenged. Existing legacy architectures and infrastructure can be extremely rigid and hard to unravel. Over 94% of senior leaders reported experiencing tech anxiety. So how do you manage this noise, meet the needs of the business, stay relevant, and evolve your technology so that you can deliver the kinds of experiences audiences expect? Digital decoupling is a legacy modernization approach that enables large, often well-established organizations to present a unified online experience to their users, take full advantage of their data, innovate safely, and compete effectively with digital natives. Technology evolves rapidly, and an effective microservices solution should be designed with future scalability and adaptability in mind. Kin + Carta helps to ensure that your solution is not only robust for current requirements but also capable of evolving with emerging technologies and business needs. It all starts with a clear modernization strategy that allows you to iteratively untangle from legacy systems, while also meeting the needs of business stakeholders seeking innovation.

Navigating commerce complexity with Kin + Carta on Google Cloud

Commerce is undergoing a significant transformation, and businesses need a future-proof technology foundation to handle the demands of complex models and massive datasets. That's why Kin + Carta launched their Integrated Commerce Network, the first commerce-related solution that's part of Google's Industry Value Network. With the right tools and partners, your business can be at the forefront of innovation with generative AI, automating tasks in revolutionary new ways, creating entirely new content formats, and delivering more personalized customer experiences. The complexities of commerce transformation can be daunting. But you can master the art of digital decoupling and leverage the strengths of the Integrated Commerce Network to unlock limitless possibilities and gain an edge over your competition. Check out Kin + Carta's guide: Flipping the script — A new vision of legacy modernization enabled by digital decoupling. Get started with MongoDB Atlas on Google Cloud today.

April 9, 2024

A Smarter Factory Floor with MongoDB Atlas and Google Cloud's Manufacturing Data Engine

The manufacturing industry is undergoing a transformative shift from traditional to digital, propelled by data-driven insights, intelligent automation, and artificial intelligence. Traditional methods of data collection and analysis are no longer sufficient to keep pace with the demands of today's competitive landscape. This is precisely where Google Cloud's Manufacturing Data Engine (MDE) and MongoDB Atlas come into play, offering a powerful combination for optimizing your factory floor.

Unlock the power of your factory data

MDE is positioned to transform the factory floor with the power of cloud-driven insights. MDE simplifies communication with your factory floor, regardless of the diverse protocols your machines might use. It effortlessly connects legacy equipment with modern systems, ensuring a comprehensive data stream. MDE doesn't just collect data, it transforms it: by intelligently processing and contextualizing the information, you gain a clearer picture of your production processes in real time, with historical context. It offers pre-built analytics and AI tools directly addressing common manufacturing pain points. This means you can start finding solutions faster, whether it's identifying bottlenecks, reducing downtime, or optimizing resource utilization. Conveniently, it also offers strong support for integrations that can further enhance the potential of the data (e.g., additional data sinks).

The MongoDB Atlas developer data platform enhances MDE by providing scalability and flexibility through automated scaling to adapt to evolving data requirements. This makes it particularly suitable for dynamic manufacturing environments. MongoDB's document model can handle diverse data types and structures effortlessly while being incredibly flexible because of its native JSON format. This allows MDE data to be enriched with other relevant data, such as supply chain logistics, for a deeper understanding of the factory business. You can gain immediate insights into your operations through real-time analytics, enabling informed decision-making based on up-to-date data. While MDE offers a robust solution for collecting, contextualizing, and managing industrial data, leveraging it alongside MongoDB Atlas offers tremendous advantages.

Inside the MDE integration

Google Cloud's Manufacturing Data Engine (MDE) acts as a central hub for your factory data. It not only processes and enriches the data with context, but also offers flexible storage options like BigQuery and Cloud Storage. Now, customers already using MongoDB Atlas can skip the hassle of application re-integration and make this data readily accessible to applications. Through this joint solution developed by Google Cloud and MongoDB, you can seamlessly move the processed streaming data from MDE to MongoDB Atlas using Dataflow jobs. MDE publishes the data via a Pub/Sub subscription, which is then picked up by a custom Dataflow job built by MongoDB. This job transforms the data into the desired format and writes it to your MongoDB Atlas cluster. Google Cloud's MDE and MongoDB Atlas use compatible data structures, simplifying data integration through a shared semantic configuration. Once the data resides in MongoDB Atlas, your existing applications can access it seamlessly without any code modifications, saving you time and effort.
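For a rough sense of the data path, here is a minimal, hypothetical Python sketch that consumes MDE messages from a Pub/Sub subscription and writes them to an Atlas collection. It is a simplified stand-in for the actual Dataflow job referenced above, and the project ID, subscription name, connection string, and namespace are illustrative assumptions.

```python
import json

from google.cloud import pubsub_v1
from pymongo import MongoClient

# Illustrative Atlas connection string and target namespace
mongo = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
messages = mongo["factory"]["mde_messages"]

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-gcp-project", "mde-output-sub")

def handle(message):
    # MDE publishes JSON payloads; store each one as a document and acknowledge it.
    doc = json.loads(message.data.decode("utf-8"))
    messages.insert_one(doc)
    message.ack()

# Block and process messages as they arrive on the subscription.
streaming_pull = subscriber.subscribe(subscription_path, callback=handle)
streaming_pull.result()
```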
The flexibility of MDE, combined with the scalability and ease of use of MongoDB Atlas, makes this a powerful and versatile solution for various data-driven use cases, such as predictive maintenance and quality control, while still giving the factory ownership of the data. Instructions on how to set up the Dataflow job are available in the MongoDB GitHub repository.

Conclusion

If you want to level up your manufacturing data analytics, pairing MDE with MongoDB Atlas provides a proven, easy-to-implement solution. It's easy to get started with MDE and MongoDB Atlas.

April 9, 2024

Unleashing Developer Potential–and Managing Costs–with MongoDB Atlas

In today's business landscape, where unpredictability has become the norm, engineering leaders have to balance the dual challenges of managing uncertainty while optimizing IT costs. Indeed, the 2024 MarginPLUS Deloitte survey, which draws on insights from over 300 business leaders, emphasizes a collective pivot towards growth initiatives and cost transformations amid a fluctuating global economic climate.

MongoDB Atlas: A developer's ally for cost-effective productivity

Executives across industries want to cut costs without impacting innovation; based on the Deloitte survey, 83% of companies are looking to change how they run their business margin improvement efforts. This is where MongoDB Atlas, the most advanced cloud database service on the market, comes in. An integrated suite of data services that simplifies how developers build with data, MongoDB Atlas helps teams enhance their productivity without compromising on cost efficiency by offering visibility and control over spending, balancing developer freedom with data governance and cost management. This helps organizations escape the modernization hamster wheel: the vicious cycle of continuously updating technology without making real progress, draining resources while failing to deliver meaningful improvements. Put another way, MongoDB gives teams more time to innovate instead of just maintaining the status quo. Outlined below are the built-in features of MongoDB Atlas that enable customers to get the most out of their data while also focusing on budget optimization.

Strategic features for cost optimization with MongoDB Atlas

Right-sizing your cluster: Use MongoDB Atlas's Cluster Sizing Guide or auto-scaling to match your cluster with your workload, optimizing resource use with options for every requirement, including Low CPU options for lighter workloads.

Pausing clusters and global distribution: Save costs by pausing your cluster, securely storing data for up to 30 days with auto-resume. Furthermore, Global Clusters improve performance across regions while maintaining cost efficiency and compliance.

Index and storage management: Enhance performance and reduce costs with MongoDB Atlas's Performance Advisor, which provides tailored index and schema optimizations for better query execution and potential reductions in cluster size.

Strategic data management: Reduce storage expenses using Online Archive for infrequently accessed data and TTL indexes for efficient time series data management, ensuring only essential data is stored (see the sketch after this list). Securely back up data with mongodump before deletion.

Enhanced spend management: Use spend analysis, billing alerts, and usage insights via the Billing Cost Explorer for detailed financial management and optimization. Resource tagging and customizable dashboards provide in-depth financial reporting and visual expense tracking, supporting effective budgeting and cost optimization. Additionally, opt for serverless instances to adjust to your workload's scale, offering a pay-for-what-you-use model that eliminates overprovisioning concerns.
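As a small illustration of the TTL-index point above, the following hypothetical PyMongo snippet creates an index that expires event documents roughly 30 days after their timestamp. The connection string, collection, and field names are assumptions made for the example.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
events = client["telemetry"]["device_events"]

# Documents are removed automatically ~30 days after their "createdAt" timestamp,
# so only recent, essential data stays on the cluster.
events.create_index("createdAt", expireAfterSeconds=30 * 24 * 60 * 60)

events.insert_one({"deviceId": "sensor-42", "reading": 21.7,
                   "createdAt": datetime.now(timezone.utc)})
```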
Transforming uncertainty into advancement

MongoDB Atlas equips IT decision-makers and developers with the features and tools to balance developer productivity with strategic cost management, transforming economic uncertainty into a platform for strategic advancement. MongoDB Atlas is more than a database management solution; it's a strategic partner in optimizing your IT spending, ensuring that your organization remains agile, efficient, and cost-effective in the face of change. Need expert assistance in taking control of your MongoDB Atlas costs? MongoDB's Professional Services team can provide a deep-dive assessment of your environment to build a tailored optimization plan, and to help you execute it. Reach out to learn how we can support your cost optimization goals! If you haven't yet set up your free cluster on MongoDB Atlas, now is a great time to do so. You have all the instructions in this DevCenter article.

April 8, 2024

Transforming Industries with MongoDB and AI: Financial Services

This is the fourth in a six-part series focusing on critical AI use cases across several industries. The series covers the manufacturing and motion, financial services, retail, telecommunications and media, insurance, and healthcare industries. In the dynamic world of financial services, the partnership between artificial intelligence (AI) and banking services is reshaping traditional practices, offering innovative solutions across critical functions.

Relationship management support with chatbots

One key service that relationship managers provide to their private banking customers is aggregating and condensing information. Because banks typically operate on fragmented infrastructure, with information spread across different departments, solutions, and applications, this can require a lot of detailed knowledge about that infrastructure and how to source information such as:

When are the next coupon dates for bonds in the portfolio?

What has been the cost of transactions for a given portfolio?

What would be a summary of our latest research?

Please generate a summary of my conversation with the client.

Until now, these activities have been highly manual and exploratory. For example, a relationship manager (RM) looking for the next coupon dates would likely have to go into each of the client's individual positions and manually look up the coupon dates. If this is a frequent enough activity, the RM could raise a request for change with the product manager of the portfolio management software to add this as a standardized report. But even if such a standardized report existed, the RM might struggle to find it quickly. Overall, the process is time-consuming. Generative AI systems can facilitate such tasks. Even without specifically trained models, retrieval-augmented generation (RAG) can be used to have the AI generate the correct answers, provide the inquirer with a detailed explanation of how to get to the data, and, in some cases, directly execute the query against the system and report back the results. As with a human, it is critical that the algorithm has access not only to the primary business data, e.g., the portfolio data of the customer, but also to user manuals and static data. Detailed customer data, in machine-readable format and as text documents, is used to personalize the output for the individual customer. In an interactive process, the RM can instruct the AI to add more information about specific topics, tweak the text, or make any other necessary changes. Ultimately, the RM acts as the quality control for the AI's output to mitigate hallucinations or information gaps. As outlined above, not only will the AI need highly heterogeneous data, from highly structured portfolio information to text documents and system manuals, to provide a flexible natural language interface for RMs, it will also need timely processing information about a customer's transactions, positions, and investment objectives. Providing transactional database capabilities as well as vector search makes it easy to build RAG-based applications using MongoDB's developer data platform.
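A minimal, hypothetical sketch of that RAG flow is shown below: a relationship manager's question is embedded, related documents are retrieved from Atlas with $vectorSearch, and the results are assembled into a prompt for a language model. The embed() and generate() helpers, index name, and collection are stand-ins for whatever embedding model and LLM a bank actually uses.

```python
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
docs = client["private_banking"]["rm_documents"]

def embed(text):
    # Placeholder: call your embedding model of choice here.
    raise NotImplementedError

def generate(prompt):
    # Placeholder: call your large language model of choice here.
    raise NotImplementedError

def answer_rm_question(question):
    # Retrieve the most relevant document chunks for the question.
    hits = docs.aggregate([
        {"$vectorSearch": {"index": "rm_docs_index", "path": "embedding",
                           "queryVector": embed(question),
                           "numCandidates": 200, "limit": 5}},
        {"$project": {"_id": 0, "text": 1}},
    ])
    context = "\n\n".join(hit["text"] for hit in hits)
    prompt = ("Answer the question using only the context below.\n\n"
              f"Context:\n{context}\n\nQuestion: {question}")
    return generate(prompt)
```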
Risk management and regulatory compliance

Risk and fraud prevention

Banks are tasked with safeguarding customer assets and detecting fraud, verifying customer identities, supporting sanctions regimes, and preventing various illegal activities such as money laundering (AML). The challenge is magnified by the sheer volume and complexity of regulations, making the integration of new rules into bank infrastructure costly, time-consuming, and often inadequate. For instance, when the EU's Fifth Anti-Money Laundering Directive was implemented, it broadened regulations to cover virtual currencies and prepaid cards. Banks had to swiftly update their onboarding processes and software, train staff, and possibly update their customer interfaces to comply with these new requirements. AI offers a transformative approach to fraud detection and risk management by automating the interpretation of regulations, supporting data cleansing, and enhancing the efficacy of surveillance systems. Unlike static, rules-based frameworks that may miss or misidentify fraud due to narrow scope or limited data, AI can adaptively learn and analyze vast datasets to identify suspicious activities more accurately. Machine learning, in particular, has shown promise in trade surveillance, offering a more dynamic and comprehensive approach to fraud prevention.

Regulatory compliance and code change assistance

The regulatory landscape for banks has grown increasingly complex, demanding significant resources for the implementation of numerous regulations. Traditionally, adapting to new regulations has required the manual translation of legal text into code, provisioning of data, and thorough quality control, a process that is both costly and time-consuming, often leading to incomplete or insufficient compliance. For instance, to comply with the Basel III international banking regulations, developers must undertake extensive coding changes to accommodate the requirements laid out in thousands of pages of documentation. AI has the capacity to revolutionize compliance by automating the translation of regulatory texts into actionable data requirements and validating compliance through intelligent analysis. This approach is not without its challenges, as AI-based systems may produce non-deterministic outcomes and unexpected errors. However, the ability to rapidly adapt to new regulations and provide detailed records of compliance processes can significantly enhance regulatory adherence.

Financial document search and summarization

Financial institutions, encompassing both retail banks and capital market firms, handle a broad spectrum of documents critical to their operations. Retail banks focus on contracts, policies, credit memos, underwriting documents, and regulatory filings, which are pivotal for daily banking services. Capital market firms, on the other hand, delve into company filings, transcripts, reports, and intricate data sets to grasp global market dynamics and risk assessments. These documents often arrive in unstructured formats, presenting challenges in efficiently locating and synthesizing the necessary information. While retail banks aim to streamline customer and internal operations, capital market firms prioritize the rapid and effective analysis of diverse data to inform their investment strategies. Both retail banks and capital market firms allocate considerable time to searching for and condensing information from documents internally, resulting in reduced direct engagement with their clients.
Generative AI can streamline the process of finding and integrating information from documents by using NLP and machine learning to understand and summarize content. This reduces the need for manual searches, allowing bank staff to access relevant information more quickly. MongoDB can store vast amounts of both live and historical data, regardless of its format, which is typically needed for AI applications. It offers the Vector Search capabilities essential for retrieval-augmented generation (RAG). MongoDB supports transactions, ensuring data accuracy and consistency for AI model retraining with live data. It facilitates data access for both deterministic algorithms and AI-driven rules through a single interface. MongoDB also boasts a strong partnership ecosystem, including companies like Radiant AI and Mistral AI, to speed solution development.

ESG analysis

Environmental, social, and governance (ESG) considerations can have a profound impact on organizations. For example, regulatory changes, especially in Europe, have compelled financial institutions to integrate ESG into investment and lending decisions. Regulations such as the EU Sustainable Finance Disclosure Regulation (SFDR) and the EU Taxonomy Regulation are examples of directives that require financial institutions to consider environmental sustainability in their operations and investment products. Investors' demand for sustainable options has surged, leading to an increase in ESG-focused funds. These regulatory and commercial requirements, in turn, drive banks to improve their green lending practices. This shift is strategic for financial institutions, attracting clients, managing risks, and creating long-term value. However, financial institutions face many challenges in improving their ESG analysis. The key challenges include defining and aligning standards and processes, and managing the flood of rapidly changing and varied data to be included for ESG analysis purposes. AI can help address these key challenges in not only an automatic but also an adaptive manner via techniques like machine learning. Financial institutions and ESG solution providers have already leveraged AI to extract insights from corporate reports, social media, and environmental data, improving the accuracy and depth of ESG analysis. As the market demands a more sustainable and equitable society, predictive AI combined with generative AI can also help reduce bias in lending, creating fairer and more inclusive financing while improving predictive power. The power of AI can help facilitate the development of sophisticated sustainability models and strategies, marking a leap forward in integrating ESG into broader financial and corporate practices.

Credit scoring

The convergence of alternative data, artificial intelligence, and generative AI is reshaping the foundations of credit scoring, marking a pivotal moment in the financial industry. The challenges of traditional models are being overcome by adopting alternative credit scoring methods, offering a more inclusive and nuanced assessment. Generative AI, while introducing the potential challenge of hallucination, represents the forefront of innovation, not only revolutionizing technological capabilities but fundamentally redefining how credit is evaluated, fostering a new era of financial inclusivity, efficiency, and fairness.
ESG analysis

Environmental, social, and governance (ESG) considerations can have a profound impact on organizations. For example, regulatory changes, especially in Europe, have compelled financial institutions to integrate ESG into investment and lending decisions. The EU Sustainable Finance Disclosure Regulation (SFDR) and the EU Taxonomy Regulation are examples of directives that require financial institutions to consider environmental sustainability in their operations and investment products. Investors' demand for sustainable options has surged, leading to an increase in ESG-focused funds. These regulatory and commercial requirements, in turn, drive banks to improve their green lending practices. This shift is strategic for financial institutions, attracting clients, managing risks, and creating long-term value.

However, financial institutions face many challenges in improving their ESG analysis. The key challenges include defining and aligning standards and processes, and managing the flood of rapidly changing and varied data that must be incorporated into ESG analysis. AI can help address these challenges in a manner that is not only automated but also adaptive, via techniques like machine learning. Financial institutions and ESG solution providers have already leveraged AI to extract insights from corporate reports, social media, and environmental data, improving the accuracy and depth of ESG analysis. As the market demands a more sustainable and equitable society, predictive AI combined with generative AI can also help reduce bias in lending, creating fairer and more inclusive financing while improving predictive power. The power of AI can facilitate the development of sophisticated sustainability models and strategies, marking a leap forward in integrating ESG into broader financial and corporate practices.

Credit scoring

The convergence of alternative data, artificial intelligence, and generative AI is reshaping the foundations of credit scoring, marking a pivotal moment in the financial industry. The challenges of traditional models are being overcome by adopting alternative credit scoring methods that offer a more inclusive and nuanced assessment. Generative AI, while introducing the potential challenge of hallucination, represents the forefront of innovation, not only revolutionizing technological capabilities but fundamentally redefining how credit is evaluated, fostering a new era of financial inclusivity, efficiency, and fairness.

The use of artificial intelligence, and generative AI in particular, as an alternative approach to credit scoring has emerged as a transformative force that addresses the challenges of traditional credit scoring methods for several reasons:

Alternative data analysis: AI models can process a myriad of information, including alternative data such as utility payments and rental history, to create a more comprehensive assessment of an individual's creditworthiness.

Adaptability: As economic conditions change and consumer behaviors evolve, AI-powered models can quickly adjust.

Fraud detection: AI algorithms can detect fraudulent behavior by identifying anomalies and suspicious patterns in credit applications and transaction data.

Predictive analytics: AI algorithms, particularly ML techniques, can be used to build predictive models that identify patterns and correlations in historical credit data (a minimal sketch appears at the end of this article).

Behavioral analysis: AI algorithms can analyze behavioral data sets to understand financial habits and risk propensity.

By harnessing the power of artificial intelligence, lenders can make more informed lending decisions, expand access to credit, and better serve consumers, especially those with limited credit history. However, to mitigate potential biases and ensure consumer trust, it is crucial to ensure transparency, fairness, and regulatory compliance when deploying artificial intelligence in credit scoring.

AI in payments

A lack of developer capacity is one of the biggest challenges for banks when delivering payment product innovation. Banks believe the product enhancements they could not deliver in the past two years due to resource constraints would have supported a 5.3% growth in payments revenues. With this in mind, and given the revolutionary transformation that AI integration brings, it is imperative to consider how to free up developer resources to make the most of these opportunities. There are several areas in which banks can apply AI to unlock new revenue streams and efficiency gains. The image below provides a high-level view of eight of the principal themes and areas. It is not an exhaustive view, but it does demonstrate the depth and breadth of current opportunities. In each area, there are already banks that have begun to bring AI-based services or enhancements to market, or that are experimenting with the technology.

Learn more about AI use cases for top industries in our new ebook, How Leading Industries are Transforming with AI and MongoDB Atlas.
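Returning to the predictive analytics point in the credit scoring list above, here is a minimal, hypothetical sketch of training a simple scoring model on alternative data pulled from MongoDB. The collection layout, field names, and feature set are illustrative assumptions, and a real deployment would also need the transparency, fairness, and regulatory safeguards discussed above.

```python
# Hypothetical sketch: a simple predictive credit-scoring model trained on
# alternative data (utility payments, rental history) stored in MongoDB.
# Field and collection names are illustrative, not a production schema.
import pandas as pd
from pymongo import MongoClient
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
applicants = client["lending"]["applicants"]

# Pull labeled historical applications, including alternative data signals.
rows = list(applicants.find(
    {"outcome": {"$in": ["repaid", "defaulted"]}},
    {"_id": 0, "on_time_utility_payments": 1, "months_of_rental_history": 1,
     "avg_monthly_balance": 1, "outcome": 1},
))
df = pd.DataFrame(rows).dropna()

features = ["on_time_utility_payments", "months_of_rental_history", "avg_monthly_balance"]
X, y = df[features], (df["outcome"] == "repaid").astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# Score a new applicant: probability of repayment as a simple creditworthiness signal.
new_applicant = pd.DataFrame([[23, 36, 1450.0]], columns=features)
print(f"estimated repayment probability: {model.predict_proba(new_applicant)[0][1]:.2f}")
```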

April 4, 2024

MongoDB Announces Support for Punch Cards, Arrakis

MongoDB is excited to announce a groundbreaking expansion of our capabilities. Beginning today, April 1, MongoDB is expanding its support of programming languages like INTERCAL, FORTRAN, and even UNITYPER punch cards, as well as support for ultra-remote regions of the known universe with new subspace, faster-than-light (FTL) data transfer and networking protocols.

Bringing vintage programming languages into the modern age

Support for legacy programming languages presents a unique opportunity for our developers. INTERCAL, known for its quirkiness and obscurity, challenges developers with its unconventional syntax. Meanwhile, the experience many pioneers in computing gained by encoding data onto UNITYPER punch cards paved the way for the modern computing we know today, and supporting punch card languages underscores MongoDB's long-standing commitment to honoring the rich heritage of computing history. We are thrilled to resurrect these relics of computing and, by doing so, to highlight the importance of developers in the past, present, and future.

Expanding availability of MongoDB to the earth's crust, the planet Arrakis (Dune), and beyond

Following MongoDB's long-standing tradition of pushing boundaries, we are also excited to announce the expanded availability of MongoDB Atlas to regions that were once thought unreachable. Beginning today, MongoDB Atlas will support even the most remote regions of the known universe. Whether you find yourself trapped in the Earth's crust, or you're riding on a Guild heighliner to harvest spice on the planet Arrakis, MongoDB will be there to support your data needs every step of the way. As part of this planned expansion, MongoDB will provide network connections to the center of the Earth using new types of heat-resistant polymers that can withstand the extreme temperatures of Earth's molten core. Additionally, new subspace data transfer and networking protocols derived from advancements in quantum computing will allow low-latency applications to function across solar systems, galaxies, and parallel dimensions.

...APRIL FOOLS! Alright, alright, that's enough, we'll let you in on the joke: we won't be supporting any vintage coding languages any time soon. While bringing back punch cards and riding sandworms might not be on MongoDB's roadmap, we remain as committed as ever to supporting our developer community. No matter the future, developers will build it. Happy April Fools' Day! For more developer updates, head to our Developer Center.

April 1, 2024

Transforming Industries with MongoDB and AI: Retail

This is the third in a six-part series focusing on critical AI use cases across several industries. The series covers the manufacturing and motion, financial services, retail, telecommunications and media, insurance, and healthcare industries.

With generative AI, retailers can create new products and offerings, define and implement upsell strategies, generate marketing materials based on market conditions, and enhance customer experiences. One of the most creative uses of gen AI is helping retailers understand customer needs and choices that change continually with seasons, trends, and socio-economic shifts. By analyzing customer data and behavior, gen AI can also create personalized product recommendations, customized marketing materials, and unique shopping experiences that are tailored to individual preferences.

AI plays a critical role in decision-making at retail enterprises; product decisions such as design, pricing, demand forecasting, and distribution strategies require a complex understanding of a vast array of information from across the organization. To ensure that the right products in the right quantities are in the right place at the right time, back-office teams leverage machine learning algorithms. As technology has advanced and the barrier to adopting AI has lowered, retailers are moving towards data-driven decision-making in which AI is leveraged in real time, and generative AI is used to consolidate information and provide insights that can be immediately utilized across the enterprise.

MongoDB.local NYC Join us in person on May 2, 2024 for our keynote address, announcements, and technical sessions to help you build and deploy mission-critical applications at scale. Use Code Web50 for 50% off your ticket! Learn More

AI-augmented search and vector search

Modern retail is a customer-centric business, and customers have more choice than ever in where they purchase a product. To retain and grow their customer base, retailers are working to offer compelling, personalized experiences to customers. To do this, it is necessary to capture a large amount of data on the customers themselves, such as their buying patterns, interests, and interactions, and to quickly use that data to make complex decisions.

One of the key interactions in an ecommerce experience is search. With full-text search engines, customers can easily find items that match their query, and retailers can rank those results in a way that gives the customer the best option. In previous iterations of personalization, decisions on how to rank search results in a personalized way were made by segmenting customers: data was acquired from various operational systems, moved into a data warehouse, and then run through machine learning algorithms. Typically, this would run in batches every 24 hours or every few days, so that the next time a customer logged in, they would have a personalized experience. This did not, however, capture customer intent in real time, even though intent evolves as the customer gathers more information. These days, modern retailers augment search ranking with real-time signals and analytics from AI algorithms. It is also now possible to incorporate factors like the current shopping cart or basket, the customer's clickstream, and trending purchases across shoppers.
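As one illustration of this kind of real-time augmentation, the sketch below runs a full-text product search and boosts results that share a category with the shopper's current cart. The collection, index, and field names are assumptions made for the example, not a prescribed schema; clickstream or trending-product signals could be folded in the same way as additional boost clauses.

```python
# Sketch: personalize full-text search ranking with real-time cart signals.
# Assumes a "products" collection with an Atlas Search index named "default"
# covering the "name", "description", and "category" fields (illustrative names).
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
products = client["shop"]["products"]

def personalized_search(query: str, cart_categories: list[str], limit: int = 20):
    """Rank text matches higher when they share a category with the current cart."""
    boost_clauses = [
        {
            "text": {
                "query": category,
                "path": "category",
                "score": {"boost": {"value": 3}},  # nudge relevance, don't override it
            }
        }
        for category in cart_categories
    ]
    pipeline = [
        {
            "$search": {
                "index": "default",
                "compound": {
                    "must": [{"text": {"query": query, "path": ["name", "description"]}}],
                    "should": boost_clauses,
                },
            }
        },
        {"$limit": limit},
        {"$project": {"name": 1, "category": 1, "price": 1,
                      "score": {"$meta": "searchScore"}}},
    ]
    return list(products.aggregate(pipeline))

# Example: a shopper with running gear in the cart searching for "socks"
# results = personalized_search("socks", cart_categories=["running"])
```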
The first step in truly understanding the customer is to build a customer data platform that combines data from disparate systems and silos across the organization: support, ecommerce transactions, in-store interactions, wish lists, reviews, and more. MongoDB's flexible document model allows data of different types and formats to be combined easily, with the ability to embed sub-documents to get a clear view of the customer in one place. As the retailer captures more data points about the customer, they can easily add fields without downtime for schema changes.

Next comes the capability to run analytics in real time rather than retroactively in a separate system. MongoDB's architecture allows for workload isolation, meaning the operational workload (the customer's actions on the ecommerce site) and the analytical or AI workload (calculating what the next best offer should be) can run simultaneously without interrupting each other. Retailers can then use MongoDB's aggregation framework for advanced analytical queries, or trigger an AI model in real time, to produce an answer that can be embedded into the search ranking. Then comes the ability to easily update the search indexing to incorporate the AI augmentation. Because MongoDB has Search built in, this whole flow can be completed in one data platform: as the data is augmented with AI results, the search index syncs to match.

MongoDB Atlas Vector Search brings the next generation of search capability. By using LLMs to create vector embeddings for each product and then turning on a vector index, retailers can offer semantic search to their customers. AI calculates the complex similarities between items in vector space and gives the customer a unique set of results matched to their true desire.

Figure 1: The architecture of an AI-enhanced search engine, showing the different MongoDB Atlas components and the Databricks notebooks and workflows used for data cleaning and preparation, product scoring, dynamic pricing, and vector search

Figure 2: The architecture of a vector search solution, showing how data flows through the integrated components of MongoDB Atlas and Databricks
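To make the semantic search step described above concrete, here is a minimal sketch of a vector query over product embeddings, with a pre-filter so only in-stock items are considered. The index name, field names, and the embed() helper are assumptions for illustration; any embedding model that matches the index dimensions would work.

```python
# Sketch: semantic product search with MongoDB Atlas Vector Search.
# Assumes each product document carries an "embedding" vector produced by an
# embedding model and indexed by a vector index named "product_vectors" that
# also indexes "in_stock" as a filter field. All names are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
products = client["shop"]["products"]

def embed(text: str) -> list[float]:
    """Placeholder for the embedding model used to build the product index."""
    raise NotImplementedError("call your embedding model here")

def semantic_product_search(phrase: str, limit: int = 10):
    """Return in-stock products whose embeddings are closest to the phrase."""
    pipeline = [
        {
            "$vectorSearch": {
                "index": "product_vectors",
                "path": "embedding",
                "queryVector": embed(phrase),
                "numCandidates": 150,
                "limit": limit,
                "filter": {"in_stock": True},  # pre-filter before vector ranking
            }
        },
        {"$project": {"name": 1, "price": 1,
                      "score": {"$meta": "vectorSearchScore"}}},
    ]
    return list(products.aggregate(pipeline))
```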
Demand forecasting and predictive analytics

Retailers either develop homegrown applications for demand prediction using traditional machine learning models or buy specialized products designed to provide demand prediction and forecasting insights across segments. The homegrown systems require significant infrastructure for data and machine learning implementation, plus dedicated technical expertise to develop, manage, and maintain them. More often than not, these systems require constant care to ensure optimal performance and provide value to the business.

Generative AI already delivers several solutions for demand prediction by enhancing the accuracy and granularity of forecasts. Applying retrieval-augmented generation with large language models (LLMs) enables retailers to forecast demand for specific products and to drill down to product categories and individual store levels. This not only streamlines distribution but also contributes to more tailored fulfillment at the store level. The integration of gen AI in demand forecasting not only optimizes inventory management but also fosters a more dynamic and customer-centric approach in the retail industry.

Generative AI can be used to enhance supply chain efficiency by accurately predicting demand for products, optimizing and coordinating production schedules, and ensuring adequate inventory levels in warehouses and distribution centers. Data requirements for such endeavors include historical sales data, customer orders, and current multichannel sales data and trends. This information can be integrated with external datasets, such as weather patterns and events that could impact demand. The data must be consolidated in an operational data layer and cleansed to avoid feeding the model inputs that lead to wrong predictions. Feature engineering then extracts signals such as seasonality, promotion impact, and general economic indicators (a minimal sketch of this step appears at the end of this article). A retrieval-augmented generation model can be incorporated to improve demand forecasting predictions and reduce hallucinations, and the same historical datasets can be used to train and fine-tune the model for improved accuracy. Such efforts lead to the following business benefits:

Precision in demand forecasting

Optimized product and supply planning

Efficiency improvement

Enhanced customer satisfaction

Across the retail industry, AI has captured the imaginations of executives and consumers alike. Whether you're a customer of a grocer, an ecommerce site, or a retail conglomerate, AI has transformed, and will continue to transform, how you do business with these companies. For the retailers that matter most globally, AI has created opportunities to minimize risk and fraud, perfect user experiences, and save companies from wasting labor and resources. From creation to launch, MongoDB Atlas ensures that AI applications are grounded in accurate operational data and that they deliver the scalability, security, and performance demanded by developers and consumers alike.

Learn more about AI use cases for top industries in our new ebook, Enhancing Retail Operations with AI and Vector Search: The Business Case for Adoption.
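Returning to the feature-engineering step mentioned above, the sketch below uses MongoDB's aggregation framework to roll raw sales events up into per-product monthly totals, a simple seasonality feature that a forecasting model (or a RAG prompt) could consume. The collection layout and field names are assumptions for illustration.

```python
# Sketch: derive simple seasonality features (monthly sales per product)
# from a raw "sales" event collection. Collection and field names are
# illustrative; adapt them to your own schema.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
sales = client["shop"]["sales"]

pipeline = [
    # Bucket each sale into its calendar month per product.
    {"$group": {
        "_id": {
            "product_id": "$product_id",
            "month": {"$dateTrunc": {"date": "$sold_at", "unit": "month"}},
        },
        "units_sold": {"$sum": "$quantity"},
        "revenue": {"$sum": {"$multiply": ["$quantity", "$unit_price"]}},
    }},
    {"$sort": {"_id.product_id": 1, "_id.month": 1}},
    # Flatten into one feature row per product per month.
    {"$project": {
        "_id": 0,
        "product_id": "$_id.product_id",
        "month": "$_id.month",
        "units_sold": 1,
        "revenue": 1,
    }},
]

feature_rows = list(sales.aggregate(pipeline))
# feature_rows can now be joined with promotion calendars or economic
# indicators and fed to a forecasting model, or used to ground a RAG prompt.
```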

March 29, 2024

Workload Isolation for More Scalability and Availability: Search Nodes Now on Google Cloud

Today we're excited to take the next step in bringing scalable, dedicated architecture to your search experiences with the introduction of Atlas Search Nodes, now in general availability for Google Cloud. This post is also available in: Deutsch, Français, Español, Português, Italiano, 한국어, 简体中文.

Since our initial announcement of Search Nodes in June of 2023, we've been rapidly accelerating access to the most scalable dedicated architecture, starting with general availability on AWS and now expanding to general availability on Google Cloud. We'd like to give you a bit more context on what Search Nodes are and why they're important to any search experience running at scale.

Search Nodes provide dedicated infrastructure for Atlas Search and Vector Search workloads, enabling even greater control over search workloads. They also allow you to isolate and optimize compute resources so you can scale search and database needs independently, delivering better performance at scale and higher availability.

One of the last things developers want to deal with when building and scaling apps is infrastructure problems. Any downtime or poor user experience can result in lost users or revenue, especially when it comes to your database and search experience. This is one of the reasons developers turn to MongoDB: the ease of having one unified system for your database and search solution.

With the introduction of Atlas Search Nodes, we've taken the next step in providing our builders with ultimate control, giving them the flexibility to scale search workloads without over-provisioning the database. By isolating your search and database workloads while automatically keeping your search cluster data synchronized with operational data, Atlas Search and Atlas Vector Search eliminate the need to run a separate ETL tool, which takes time and effort to set up and is yet another failure point for your scaling app. This provides superior performance and higher availability while reducing architectural complexity and the engineering time wasted recovering from sync failures. In fact, we've seen a 40% to 60% decrease in query time for many complex queries, while eliminating the chances of any resource contention or downtime.

With just a quick button click, Search Nodes on Google Cloud offer our existing Atlas Search and Vector Search users the following benefits:

Higher availability

Increased scalability

Workload isolation

Better performance at scale

Improved query performance

We offer compute-heavy, search-specific nodes for relevance-based text search, as well as a memory-optimized option that is ideal for semantic and retrieval-augmented generation (RAG) production use cases with Atlas Vector Search. This makes resource contention or availability issues a thing of the past.

Search Nodes are easy to opt into and set up. To start, jump into the MongoDB UI and follow these steps:

1. Navigate to the "Database Deployments" section in the MongoDB UI

2. Click the green "+Create" button

3. On the "Create New Cluster" page, enable the "Multi-cloud, multi-region & workload isolation" radio button for Google Cloud

4. Toggle the radio button for "Search Nodes for workload isolation" to enable it, and select the number of nodes in the text box

5. Check the agreement box

6. Click "Create cluster"

For existing Atlas Search users, click "Edit Configuration" in the MongoDB Atlas Search UI and enable the toggle for workload isolation.
Then the steps are the same as noted above. Jump straight into our docs to learn more!

MongoDB.local NYC Join us in person on May 2, 2024 for our keynote address, announcements, and technical sessions to help you build and deploy mission-critical applications at scale. Use Code Web50 for 50% off your ticket! Learn More

March 28, 2024

Improving Scalability and Availability with Workload Isolation: Search Nodes Now Available on Google Cloud

Today, we're excited to announce that Atlas Search Nodes are now available in public preview on Google Cloud, taking us one step closer to our goal of providing scalable, dedicated architecture for your search experiences.

Since first announcing Search Nodes in June 2023, we have been accelerating access to this most scalable dedicated architecture, starting with general availability on AWS and now with a public preview on Google Cloud. Here is a brief overview of what Search Nodes are and why they matter to any search experience running at scale.

Search Nodes provide dedicated infrastructure for Atlas Search and Vector Search workloads, giving you greater control over search workloads. By isolating and optimizing compute resources, you can scale search and database needs independently, improving performance at scale and delivering higher availability.

When building and scaling applications, one of the last things developers want to deal with is infrastructure problems. Any downtime or poor user experience can lead to lost users or revenue, and this is especially true for your database and search experience. This is one of the reasons developers turn to MongoDB: it lets them use a single, unified system for their database and search solution.

With the launch of Atlas Search Nodes, we have taken another important step toward giving builders maximum control. Builders can now scale search workloads without over-provisioning the database, and therefore stay flexible. With Atlas Search and Atlas Vector Search, you can isolate search and database workloads while automatically keeping search cluster data synchronized with operational data. This eliminates the need to run a separate ETL tool, so you don't have to spend time and effort on extra setup, and you avoid another point of failure as your application scales. It improves performance and availability while reducing architectural complexity and the engineering time spent recovering from sync failures. In fact, we have already seen query times drop by 40% to 60% for many complex queries, while resource contention and downtime issues have been eliminated.

With just the flip of a switch, Search Nodes on Google Cloud give Atlas Search and Vector Search users the following benefits:

Higher availability

Increased scalability

Workload isolation

Better performance at scale

Improved query performance

We offer compute-intensive, search-specific nodes for relevance-based text search, as well as a memory-optimized option that is best suited for semantic and RAG production use cases with Atlas Vector Search. This puts long-standing resource contention and availability issues to rest.

Enabling and setting up Search Nodes is simple. Just go to the MongoDB UI and do the following:

1. Go to the "Database Deployments" section of the MongoDB UI

2. Click the green "+Create" button

3. On the "Create New Cluster" page, toggle the "Multi-cloud, multi-region & workload isolation" radio button for Google Cloud to enabled

4. Toggle the "Search Nodes for workload isolation" radio button to enabled, and select the number of nodes in the text box

5. Check the agreement box

6. Click "Create cluster"

For existing Atlas Search users, click "Edit Configuration" in the MongoDB Atlas Search UI and enable the workload isolation toggle. The remaining steps are the same as described above.

Jump straight into our documentation to learn more!

March 28, 2024