MongoDB Applied
Customer stories, use cases and experience
Accelerating to T+1 - Have You Got the Speed and Agility Required to Meet the Deadline?
On May 28, 2024, the Securities and Exchange Commission (SEC) will implement a move to T+1 settlement for standard securities trades, shortening the settlement period from two business days after the trade date to one. The change aims to address market volatility and reduce credit and settlement risk. The shortened T+1 settlement cycle can potentially decrease market risks, but most firms' current back-office operations cannot handle this change. This is due to several challenges with existing systems, including:

- Manual processes will be under pressure due to the shortened settlement cycle
- Batch data processing will not be feasible

To prepare for T+1, firms should take urgent action to address these challenges:

- Automate manual processes to streamline them and improve operational efficiency
- Replace batch processing with event-based, real-time processing for faster settlement

In this blog, we will explore how MongoDB can be leveraged to accelerate manual process automation and replace batch processes to enable faster settlement.

What is a T+1 and T+2 settlement?

T+1 settlement refers to the practice of settling, on the following trading day, transactions executed before 4:30 pm. For example, if a transaction is executed on Monday before 4:30 pm, the settlement will occur on Tuesday. This settlement process involves the transfer of securities and/or funds from the seller's account to the buyer's account. This contrasts with T+2 settlement, where trades are settled two trading days after the trade date. According to SEC Chair Gary Gensler, "T+1 is designed to benefit investors and reduce the credit, market, and liquidity risks in securities transactions faced by market participants."

Overcoming T+1 transition challenges with MongoDB: Two unique solutions

1.
The multi-cloud developer data platform accelerates manual process automation

Legacy settlement systems may involve manual intervention for various tasks, including manual matching of trades, manual input of settlement instructions, allocation emails to brokers, reconciliation of trade and settlement details, and manual processing of paper-based documents. These manual processes can be time-consuming and prone to errors. MongoDB (Figure 1 below) can help accelerate developer productivity in several ways:

- Easy to use: MongoDB is designed to be easy to use, which can reduce the learning curve for developers who are new to the database.
- Flexible data model: Allows developers to store data in a way that makes sense for their application. This can help accelerate development by reducing the need for complex data transformations or ORM mapping.
- Scalability: MongoDB is highly scalable, which means it can handle large volumes of trade data and support high levels of concurrency.
- Rich query language: Allows developers to perform complex queries without writing much code. MongoDB's Apache Lucene-based search can also help screen large volumes of data against sanctions and watch lists in real time.

Figure 1: MongoDB's developer data platform

Discover the developer productivity calculator. Developers spend 42% of their work week on maintenance and technical debt. How much does this cost your organization? Calculate how much you can save by working with MongoDB.

2. An operational trade store to replace slow batch processing

Back-office technology teams face numerous challenges when consolidating transaction data due to the complexity of legacy batch ETL and integration jobs. Legacy relational databases have long been the industry standard but are not optimal for post-trade management due to limitations such as rigid schemas, difficulty in horizontal scaling, and slow performance.
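To make the contrast with a rigid relational schema concrete, here is a sketch of how a trade might be modeled as a single self-describing JSON document. The field names and values are invented for illustration; they are not a Temenos or MongoDB schema.

```python
import json

# An illustrative trade document: settlement details, allocations, and
# lifecycle state live together in one nested structure instead of being
# spread across normalized relational tables.
trade = {
    "tradeId": "T-20240528-0001",
    "instrument": {"isin": "US0378331005", "assetClass": "EQUITY"},
    "quantity": 1_000,
    "price": 189.25,
    "currency": "USD",
    "tradeDate": "2024-05-28",
    "settlementDate": "2024-05-29",  # T+1
    "allocations": [
        {"account": "ACC-1", "quantity": 600},
        {"account": "ACC-2", "quantity": 400},
    ],
    "status": "PENDING_SETTLEMENT",
}

# The same structure serializes directly to JSON -- no ORM mapping layer.
print(json.dumps(trade, indent=2))
```

Because the document is already JSON, adding a new field (say, a sanctions-screening flag) is a document-level change rather than a schema migration.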
For T+1 settlement, it is crucial to have real-time availability of consolidated positions across assets, geographies, and business lines; the end of a batch cycle will not meet this requirement. As a solution, MongoDB customers use an operational trade data store (ODS) to overcome these challenges with real-time data sharing. By using an ODS, financial firms can improve their operational efficiency by consolidating transaction data in real time. This allows them to streamline their back-office operations, reduce the complexity of ETL and integration processes, and avoid the limitations of relational databases. As a result, firms can make faster, more informed decisions and gain a competitive edge in the market.

Using MongoDB (Figure 2 below), trade desk data is copied into an ODS in real time through change data capture (CDC), creating a centralized trade store that acts as a live source for downstream trade settlement and compliance systems. This enables faster settlement times, improves data quality and accuracy, and supports full transactionality. As the ODS evolves, it becomes a "system of record" or "golden source" for many back-office and middle-office applications, and powers AI/ML-based real-time fraud prevention and settlement failure risk applications.

Figure 2: Centralized Trade Data Store (ODS)

Managing trade settlement failure risk is critical in driving efficiency across the entire securities market ecosystem. Luckily, MongoDB's integration capabilities (Figure 3 below) with modern AI and ML platforms enable banks to develop AI/ML models that make managing potential trade settlement fails much more efficient from a cost, time, and quality perspective. Additionally, predictive analytics allow firms to project availability and demand and optimize inventories for lending and borrowing.
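The CDC flow into an ODS can be sketched in miniature. The event shape below loosely mirrors a MongoDB change stream document (`operationType`, `documentKey`, `fullDocument`), but the handler and the in-memory store are simplified stand-ins, not the actual change streams API:

```python
# In-memory stand-in for the operational trade data store (ODS).
ods: dict[str, dict] = {}

def apply_change(event: dict) -> None:
    """Apply one CDC event to the ODS, keyed by trade id."""
    key = event["documentKey"]["_id"]
    if event["operationType"] in ("insert", "update", "replace"):
        ods[key] = event["fullDocument"]
    elif event["operationType"] == "delete":
        ods.pop(key, None)

# Trade desk activity arriving as a stream of change events.
events = [
    {"operationType": "insert", "documentKey": {"_id": "T1"},
     "fullDocument": {"_id": "T1", "status": "NEW", "notional": 5_000_000}},
    {"operationType": "update", "documentKey": {"_id": "T1"},
     "fullDocument": {"_id": "T1", "status": "SETTLED", "notional": 5_000_000}},
]
for e in events:
    apply_change(e)

print(ods["T1"]["status"])  # -> SETTLED
```

The key property is that downstream settlement and compliance systems read the ODS as events arrive, rather than waiting for an end-of-day batch window.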
Figure 3: Event-driven application for real-time monitoring

Summary

Financial institutions face significant challenges in reducing settlement duration from two business days (T+2) to one (T+1), particularly when it comes to addressing existing back-office issues. However, it's crucial for them to achieve this goal within a year, as required by the SEC. This blog highlights how MongoDB's developer data platform can help financial institutions automate manual processes and adopt a best-practice approach that replaces batch processes with a real-time data store (ODS). With the help of MongoDB's developer data platform and best practices, financial institutions can achieve operational excellence and meet the SEC's T+1 settlement deadline of May 28, 2024. And should T+0 settlement cycles become a reality, institutions with the most flexible data platform will be best equipped to adjust. Top banks in the industry are already adopting MongoDB's developer data platform to modernize their infrastructure, leading to reduced time-to-market, lower total cost of ownership, and improved developer productivity.

Looking to learn more about how you can modernize or what MongoDB can do for you?

- Zero downtime migrations using MongoDB's flexible schema
- Accelerate your digital transformation with these 5 Phases of Banking Modernization
- Reduce time-to-market for your customer lifecycle management applications
- MongoDB's financial services hub
4 Ways MongoDB Solves Healthcare's Interoperability Puzzle
Picture this: You're on a road trip, driving across the country, taking in the beautiful scenery, and enjoying the freedom of the open road. But suddenly, the journey comes to a screeching halt as you fall seriously ill and need emergency surgery. The local hospital rushes you into the operating room, but how will they know what medications you're allergic to, or what conditions you've been treated for in the past?

Figure 1: Before and after interoperability

In a perfect world, the hospital staff would have access to all of your medical records, seamlessly integrated into one interoperable electronic health record (EHR) system. This would enable them to treat you quickly and accurately, as seen in Figure 1. Unfortunately, the reality is that data is often siloed, fragmented, and difficult to access, making it nearly impossible for healthcare providers to get a complete picture of their patients' health. That's where interoperability comes in, enabling seamless integration of data from different sources and formats and giving healthcare providers easy access to the information they need, even across different health providers. And at the heart of solving the interoperability challenge is MongoDB, the ideal solution for building a truly interoperable data repository. In this blog post, we'll explore four ways MongoDB stands out in the interoperability software space. We'll show you how our unique capabilities make us the fundamental missing piece in the interoperability puzzle for healthcare. Let's get started!

1. Document flexibility

MongoDB's document data model is perfect for managing healthcare data. It allows you to work with the data in JSON format, eliminating the need to flatten or transform it into a string. This simplifies the implementation of common interoperability standards for clinical and terminology data, such as HL7 FHIR and openEHR, as well as SNOMED and LOINC, because all of these standards also support JSON.
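As a concrete illustration, a (heavily trimmed) HL7 FHIR Patient resource is itself JSON, so it can be stored as a single document without any flattening step. The values below follow the shape of FHIR's public example patient but are used purely for illustration:

```python
import json

# A trimmed FHIR R4 Patient resource; nested arrays and objects map
# directly onto the document model.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
    "birthDate": "1974-12-25",
}

# The resource round-trips through JSON unchanged -- storage format and
# interchange format are the same thing.
doc = json.loads(json.dumps(patient))
print(doc["name"][0]["family"])  # -> Chalmers
```

A relational design would split the repeating `name` array into child tables; here the resource stays in the shape the FHIR specification defines.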
The document model also supports nested and hierarchical data structures, making it easier to represent complex clinical data with varying levels of detail and granularity. MongoDB's document model also provides flexibility in managing healthcare data, allowing for dynamic and self-describing schemas. With no need to pre-define the schema, fields can vary from document to document and can be modified at any time without requiring disruptive schema migrations. This makes it easy for healthcare providers to add or update information in clinical documents, such as when new interoperability standards are released, ensuring that healthcare data is kept accurate and up to date without requiring database reconfiguration or downtime.

2. Scalability

Dealing with large healthcare datasets can be challenging for traditional relational database systems, but MongoDB's horizontal scaling offers a solution. With horizontal scaling, healthcare providers can easily distribute their data across multiple servers and cloud providers (AWS, GCP, and Azure), resulting in increased processing power and faster query times. It also makes storage more cost-efficient, as scaling vertically is more expensive than scaling horizontally. This allows healthcare providers to scale their systems seamlessly as their data volumes grow while maintaining performance and reliability. MongoDB's reliability is ensured through its replication architecture: each database replica set consists of three nodes, providing fault tolerance and automatic failover in the event of node failure. Horizontal scaling also improves reliability by adding more servers or nodes to the system, reducing the risk of a single point of failure.

3. Performance

When it comes to healthcare data, query performance can make all the difference in delivering timely and accurate care. And that's another aspect where MongoDB shines.
MongoDB holds data in a format that is optimized for storage and retrieval, allowing it to read and write data quickly and efficiently. MongoDB's advanced querying capabilities, backed by compound and wildcard indexes, make it a standout solution for healthcare applications. MongoDB Atlas Search, using Apache Lucene indexing, also enables efficient querying across vast data sets, handling complex queries with multiple fields. This is especially useful for Clinical Data Repositories (CDRs), which permit almost unlimited querying flexibility. Atlas Search indexing also allows for advanced search features, enabling medical professionals to quickly and accurately access the information they need from any device.

4. Security

Figure 2: Fine-grained access control

The security of sensitive clinical data is paramount in the healthcare industry. That's why MongoDB provides an array of robust security features, including fine-grained access control and auditing, as seen in Figure 2. With Client-Side Field Level Encryption (CSFLE) and Queryable Encryption, MongoDB is the only data platform that allows the processing of randomly encrypted patient data, providing the highest level of data security with minimal impact on performance. Additionally, MongoDB Atlas supports VPC peering and private links that permit secure connections to healthcare applications, wherever they are hosted. By implementing strong security measures from the start, organizations can ensure privacy by design.

Partner ecosystem

MongoDB is the only non-relational database and modern data platform that directly collaborates with clinical data repository (CDR) vendors like Smile, Exafluence, Better, Firely, and others. While some vendors offer MongoDB as an alternative to a relational database, others have built their solutions exclusively on MongoDB; one example is the Kodjin FHIR server.
MongoDB has extended its capabilities to integrate with FHIR Works on AWS, enabling healthcare providers and payers to deploy a FHIR server with MongoDB Atlas through the AWS Marketplace. With MongoDB's unique approach to data storage and retrieval and its ability to work with CDR vendors, millions of patients worldwide are already benefiting from its use.

Beyond interoperability with MongoDB

Access to complete medical records is often limited by data silos and fragmentation, leaving healthcare providers with an incomplete picture of their patients' health. That's where MongoDB's interoperability solution comes in as the missing puzzle piece the healthcare industry needs. With MongoDB's unmatched document flexibility, scalability, performance, and security features, healthcare providers can access accurate and up-to-date patient information in real time. But MongoDB's solution goes beyond that. Radical interoperability with MongoDB means that healthcare providers own the data layer and are thus able to put the stored data to any use and connect to any existing applications or APIs. They're free to work with any healthcare data standard, including custom schemas, and to leverage the data for use cases beyond storage and interoperability. The future of healthcare is here, and with MongoDB leading the way, we can expect to see more innovative solutions that put patients first. If you're interested in learning more about radical interoperability with MongoDB, check out our brochure.
Aerofiler Brings Breakthrough Automation to the Legal Profession
Don Nguyen is the perfect person to solve a technology problem in the legal space. Don spent several years in software engineering before eventually becoming a lawyer, where he discovered just how much manual, administrative work legal professionals have to do. The company he co-founded, Aerofiler, takes the different parts of the contract lifecycle and digitises them to eliminate manual work, allowing lawyers to focus on things that require their expertise. Don says the legal profession has always been behind industries like accounting, marketing, and finance when it comes to leveraging technology to increase productivity. Both Don and his co-founder, Stuart Loh, thought they could automate a lot of manual tasks for legal professionals through an AI-powered contract lifecycle management solution.

Turning mountains into automation

Law firms generate mountains of paperwork that must be digitised and filed. Searching contracts post-execution can be an arduous task using the legacy systems most firms are running on today. Initially, Don, Stuart, and Jarrod Mirabito (co-founder and CTO) set out to make searching contracts and tracking obligations easier. As the service became more popular, customers started asking for more capabilities, like digitising and automating the approval process. Aerofiler's solution now manages the entire contract lifecycle, from drafting and negotiations to approvals, signing, and filing. Don says the difficulty with running AI to extract data is that you usually can't see where the data is coming from, and you can't train your models to, for example, extract a concept that might be specific to your industry. Aerofiler supports custom extraction so firms can crawl for and find exactly the results they're looking for, and it highlights exactly where in the contract the data is found. Aerofiler is unique as a modern, cloud-based contract lifecycle management solution that streamlines contract management processes and enhances workflow efficiency.
It features AI-powered analytics, smart templates, and real-time collaboration tools, and it is highly configurable to fit the unique needs of different companies. Aerofiler's user interface is also highly intuitive and user-friendly, leading to greater user adoption and overall efficiency.

The startup stack

Don has over 10 years of experience working with MongoDB and describes it as very robust. When it was time to choose a database for their startup, MongoDB Atlas was an easy choice. One of the big reasons Don chose Atlas is so they don't have to manage their own infrastructure. Atlas provides the functionality for text search, storage, and metadata retrieval, making it easy to hit the ground running. On top of MongoDB, the system runs Express.js, Vue.js, and Node.js, also known as a MEVN stack. In choosing a database, Don points out that every assumption you make will have exceptions, and no matter what your requirements are now, they will inevitably change. So one of the key factors in making a decision is how the database will handle those changes when they come. In his experience, NoSQL databases like MongoDB are easy to deploy and maintain. And, with MongoDB offering ACID transactions, they get a lot of the functionality that they would otherwise look for in a relational database stack.

How startups grow up

Aerofiler is part of the MongoDB for Startups program, which helps early-stage, high-growth startups build faster and scale further. MongoDB for Startups offers access to a wide range of resources, including free credits to our best-in-class developer data platform, MongoDB Atlas, personalized technical advice, co-marketing opportunities, and access to our robust developer community. Don says the free credits helped the startup at a time when resources were tight. The key to their success, Don says, is in solving problems their customers have.
In terms of the road ahead, Don is excited about ChatGPT and says there are some very interesting applications for generative AI in the legal space. If anyone would like to talk about what generative AI is and how it could work in the legal space, he's happy to take those calls and emails. Are you part of a startup and interested in joining the MongoDB for Startups program? Apply now.
Temenos Banking Cloud Scales to Record High Transactions with MongoDB Atlas and Microsoft Azure
Banking used to be a somewhat staid, hyper-conservative industry, seemingly evolving over eons. But the emergence of fintech and pure digital players in the market, paired with alternatives in technology, is transforming the industry. The combination of MACH, BIAN, and composable designs enables true innovation and collaboration within the banking sector, and the introduction of cloud services makes these approaches even easier to implement. Just ask Temenos, the world's largest financial services application provider, providing banking for more than 1.2 billion people. Temenos is leading the way in banking software innovation and offers a seamless experience for its client community in over 150 countries. Temenos embraces a cloud-first, microservices-based infrastructure built with MongoDB, giving customers flexibility while also delivering significant performance improvements. Financial institutions can embed Temenos components, like Pay-as-you-go, which delivers new functionality to their existing on-premises environments, on their own cloud deployments, or through a full banking-as-a-service experience with Temenos Transact powered by MongoDB on various cloud platforms. This new MongoDB-based infrastructure enables Temenos to rapidly innovate on its customers' behalf while improving security, performance, and scalability.

Fintech, payments and core banking

Temenos and MongoDB joined forces in 2019 to investigate the path toward data in a componentized world. Over the past few years, our teams have collaborated on a number of new, innovative component services to enhance the Temenos product family, and several banking clients are now using those components in production. However, the approach we've taken allows banks to upgrade on their own terms. By putting components "in front" of the Temenos Transact platform, banks can start using a componentization solution without disrupting their ability to serve existing customer requirements.
From May 2023 onwards, banks will be able to deploy Temenos Infinity microservices as well as the core banking Temenos Transact exclusively on MongoDB's developer data platform and derive even more value. Making the composable approach even more valuable, Temenos implemented its new data backend firmly based on JSON and the document model. MongoDB allows fully transparent access to data and the exploitation of additional features of the developer data platform. These features include Atlas Search, application-driven analytics, and AI through workload isolation. Customers also benefit from geographic distribution of data based solely on customer requirements, be it in a single country driven by sovereignty requirements or distributed across continents to ensure always-on operation and the best possible data access and speed for trading.

Improved performance and scale

In contrast to the retail-centric benchmark last year, the approach this time was to test broader functionality and include more diverse business areas, all while increasing the transaction volume by 50%. The benchmark scenario simulated a client with 50 million retail customers, 100 million accounts, and a Banking-as-a-Service (BaaS) offering for 10 brands and 50 million embedded finance customers on a single cloud instance. In the test, Temenos Banking Cloud processed 200 million embedded finance loans and 100 million retail accounts at a record-breaking 150,000 transactions per second. In doing so, Temenos proved its robust and scalable platform can support banks' business models for growth, whether through BaaS or distributing their products themselves. The benchmark included not just core transaction processing, but a composed solution combining payments, financial crime mitigation (FCM), a data hub, and digital channels.

"No other banking technology vendor comes close to the performance and scalability of Temenos Banking Cloud.
We consistently invest more in cloud technologies and have more banks live with core banking in the cloud than any of our peers. With global non-cash transaction volumes skyrocketing in response to fast-emerging trends like BaaS, banks need a platform that allows them to elastically scale based on business demand and provide composable capabilities on demand at a low cost, while reducing their environmental impact. This benchmark with Microsoft and MongoDB proves the capability of Temenos' platform to power the world's biggest banks and their BaaS offerings with hundreds of millions of customers, efficiently and sustainably in the cloud."

Tony Coleman, Chief Technology Officer, Temenos

This solution landscape reflects an environment where everyone on the planet runs two banking transactions a day on a single bank. This throughput should cater to any Tier 1 banking deployment, in size and performance, and cover any future growth plans. Below are the transaction details that comprise the actual benchmark mix. As mentioned above, it is a broad mix of different functionalities, behaving like a retail bank and a fintech institute that provides multiple product brands, e.g. cards for different retailers. Beyond the sheer performance of the benchmark, the ESG footprint of the overall landscape shrank again versus last year's configuration, as the MongoDB Atlas environment was the sole database and no secondary systems were required.

Temenos Transact optimized with MongoDB

The JSON advantage

Temenos made significant engineering efforts to decapsulate the data layer, which was previously stored as PIC, and make JSON-formatted data available to its user community. MongoDB was designed from its inception to be a database focused on delivering a great development experience. JSON's ubiquity made it the obvious choice for representing data structures in MongoDB's document data model. Below you can see how Temenos Transact stores data in Oracle or MSSQL vs MongoDB.
Temenos and MongoDB have an aligned data store: Temenos Transact application code operates on documents (JSON) and MongoDB stores documents as JSON in one place, making it the perfect partnership. Through its concept of additional nodes in the replica set, MongoDB lets the user community align further secondary applications integrated into the same database without interrupting or disturbing the transactional workload of Temenos Transact. The recurring challenge with legacy relational database management systems (RDBMS), where secondary applications suddenly have unexpected consequences for the primary application, is a problem of the past with MongoDB.

Workload Isolation with MongoDB

MongoDB Atlas will in most cases operate in three availability zones, where two zones are located in the same region for pure availability and a single node is located in a remote region for disaster recovery. This environment provides the often-required RPO/RTO of zero while delivering unprecedented performance. Two nodes in each of the first two availability zones provision the transactional replica set and ensure the consistency and operation of the Temenos Transact application. In each availability zone, a third isolated workload node is co-located with the same data set as the other two nodes but is excluded from transactional processing. These isolated workload nodes provide capacity for additional functionality. In the example above, one node provides access to MongoDB Atlas Data Federation and a second node provides the interface for MongoDB Atlas Search. As the nodes store data in near real time (replication is measured in sub-milliseconds, as they are in the same availability zone), this allows exciting new capabilities like real-time large language models (LLMs), e.g. ChatGPT, or machine learning connecting to a Databricks lakehouse. The design is discussed in more detail in this article.
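Conceptually, steering a secondary application to an isolated workload node works like tag-aware read routing. The stand-alone sketch below is a simplified stand-in, not the Atlas configuration format or a driver API; the host names and tag keys are invented:

```python
# Replica set members with illustrative workload tags.
nodes = [
    {"host": "zurich-1", "tags": {"workload": "transactional"}},
    {"host": "zurich-2", "tags": {"workload": "transactional"}},
    {"host": "zurich-3", "tags": {"workload": "analytics"}},
    {"host": "geneva-3", "tags": {"workload": "search"}},
]

def pick_node(workload: str) -> str:
    """Return the first member tagged for the requested workload."""
    for node in nodes:
        if node["tags"].get("workload") == workload:
            return node["host"]
    raise LookupError(f"no node tagged for workload {workload!r}")

print(pick_node("analytics"))  # -> zurich-3
```

Because analytics and search traffic is resolved to dedicated members, it never competes with the transactional nodes for resources, which is the point of workload isolation.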
The below diagram shows a typical configuration for such a cluster setup in the European market on Microsoft Azure: one availability zone in Zurich, one availability zone in Geneva, and an additional node outside both in Ireland. Additionally, we configured isolated workloads in Zurich and Geneva. MongoDB Atlas allows the creation of such a cluster within seconds, configured to the specific requirements of the solution deployed.

Typical configuration for a cluster setup for the European market on Microsoft Azure

Should the need arise, MongoDB can have up to 50 nodes in a single replica set, so for each additional isolated workload, one or more nodes can be made available when and where needed, even at locations beyond the initial three chosen. For this benchmark, a MongoDB Atlas M600 cluster was used, which proved oversized given CPU utilization of 20-60% depending on the node type. In hindsight, a smaller MongoDB Atlas M200 would easily have been sufficient. Nonetheless, MongoDB Atlas delivered the needed database performance with one third of the resources of last year's configuration while delivering 50% more throughput. Additionally, MongoDB Atlas performed twice as fast per transaction (measured in milliseconds).

Signed, sealed, and delivered

This benchmark gives clients peace of mind that the combination of core banking with Temenos Transact and MongoDB is ready to support the needs of even the largest global banks. While thousands of banks rely on MongoDB for many parts of their operations, ranging from login management and online banking to risk and treasury management systems, Temenos' adoption of MongoDB is a milestone. It shows that there is significant value in moving from a legacy database technology to MongoDB, allowing faster innovation, eliminating technical debt along the way, and simplifying the landscape for financial institutions, their software vendors, and service providers.
PS: We know benchmarks can be deceiving, and every scenario in each organization is different. Having been in the benchmark business for a long time, I can tell you that you should never trust just ANY benchmark. In fact, my colleague, MongoDB Distinguished Engineer John Page, wrote a great blog about how to benchmark a database. If you would like to learn more about how you can use MongoDB to move towards a composable system, architecting for real-time adaptability, scalability, and resilience, take a look at the resources below:

- Componentized core banking built upon MongoDB
- Tony Coleman, CTO at Temenos, and Boris Bialek, Global Head, Industry Solutions at MongoDB, discuss the partnership at MongoDB World 2022
- Remodel your core banking systems with MongoDB
Application-Driven Analytics: Why are Operational and Analytical Workloads Converging?
Fifteen years ago, our vision was to provide developers with a new approach to databases. As industry change is constant, we are working to bring you another shift so you can stay ahead of the curve: application-driven analytics. Application-driven analytics isn't about replacing your centralized data warehouse or data lakehouse. Rather, it's about augmenting them, bringing a new class of analytics directly into applications, where they are built by developers. In his recent Analyst Perspective, Matt Aslett, VP & Research Director at Ventana Research, was very clear that there remain different functional requirements for dedicated operational and analytical systems. However, he noted the growth in intelligent applications infused with the results of analytic processing. This in turn is driving operational/OLTP data platforms such as MongoDB to integrate native analytics functionality. In the Perspective, Aslett goes on to describe some of the recent product enhancements introduced by MongoDB to support analytics, and wraps up with this advice: "I recommend that organizations evaluating potential database providers for new, intelligent operational applications include MongoDB Atlas in their evaluations." If you are interested in learning more about Ventana Research's insights, take a look at the company's Analyst Perspective: MongoDB's Atlas Delivers Data-Driven Applications.

From manufacturing to retail and finance

Beyond the research from industry analysts, organizations are increasingly working to capture the opportunities presented by application-driven analytics. Bosch Global Software uses MongoDB at the core of its IoT systems powering automotive, industrial, and smart home use cases. Being able to generate analytics in real time is a key capability of the company's applications.
As discussed in his recent article in The New Stack, Kai Hackbarth, senior technology evangelist at Bosch, talked about the value MongoDB provides: "From my history [of doing this for] 22 years, we never had the capabilities to do this before." Global retailer Marks and Spencer rebuilt its Sparks rewards program, moving from a packaged app to an in-house solution with MongoDB. The company reduced its time to build one million personalized customer offers from one hour to just five minutes. It is now able to serve those offers at 10x lower latency, with the loyalty program driving 8x higher customer spend. As part of its digital transformation initiative, Toyota Financial Services built a new operational data layer powered by MongoDB Atlas. The data layer connects the company's internal mainframe backend systems with customers engaging the company through new digital channels. MongoDB handles customer onboarding along with fraud prevention. The native OLTP and analytics capabilities provided by MongoDB Atlas were key: they eliminated the need for Toyota Financial Services to integrate and build against separate database, cache, object store, and data warehouse technologies, dramatically simplifying the company's technology estate. "MongoDB helps us make better decisions and build better products." Ken Schuelke, Division Information Officer, Toyota Financial Services

Enabling developers for application-driven analytics

How is MongoDB helping developers make the shift to smarter apps and intelligent software? MongoDB Atlas unifies the core transactional and analytical data services needed to deliver app-driven analytics. It puts powerful analytics capabilities directly into the hands of developers in ways that fit their workflows. With Atlas, they can land data of any structure; index, query, and analyze it in any way they want; and then archive it, all while working with a unified API and without having to build their own data pipelines or duplicate data.
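In-application analytics of this kind is typically expressed with MongoDB's aggregation pipeline (stages such as $match and $group). The sketch below mimics a $group stage in plain Python so it runs without a cluster; the documents and field names are invented for illustration:

```python
from collections import defaultdict

# Invented operational events, as they might land in a collection.
orders = [
    {"customer": "A", "region": "EU", "amount": 120.0},
    {"customer": "B", "region": "US", "amount": 80.0},
    {"customer": "C", "region": "EU", "amount": 50.0},
]

# Equivalent in spirit to the aggregation stage:
#   {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}}
totals: dict[str, float] = defaultdict(float)
for order in orders:
    totals[order["region"]] += order["amount"]

print(dict(totals))  # -> {'EU': 170.0, 'US': 80.0}
```

In Atlas the same grouping runs server-side over the live operational data, which is what lets an application surface analytics without a separate pipeline into a warehouse.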
MongoDB Atlas supports any level of application intelligence, from querying and searching records to aggregating and transforming data through to feeding rules-based engines and machine learning models. Figure 1: MongoDB Atlas combines transactional and analytical processing in a multi-cloud data platform. Atlas automatically optimizes how data is ingested, processed, and stored, maximizing the efficiency of the application’s operational and analytical workloads. These capabilities are packaged in an elegant and integrated multi-cloud data architecture. Getting started There are many ways you can get started in building more intelligent apps. If you want to read more about the use cases and business drivers, download our App-Driven Analytics whitepaper. Alternatively, if you want to dive straight in, sign up for an account on MongoDB Atlas. From there, you can create a free database cluster, load your own data or our sample data sets, and explore what’s possible within the platform. The MongoDB Developer Center hosts an array of resources including tutorials, sample code, videos, and documentation organized by programming language and product.
Tag, You're It: Using MongoDB Labels to Drive DataDog Business Logic
MongoDB Atlas and DataDog work better together to enable you to take advantage of the automation, elasticity, and scalability of modern infrastructure and act on real-time information to make well-informed decisions. With the latest release of MongoDB’s DataDog integration, we’ve added the ability to send custom defined labels in Atlas, Cloud Manager, and Ops Manager to tags on the DataDog metrics object. Applying tags to DataDog's metrics allows users to organize and categorize their metrics in various ways. Here's how the new DataDog metrics tagging works: Define key-value labels in MongoDB at the cluster level that get automatically sent to DataDog as metrics tags. You can define tags based on the attributes you want to track and then assign those tags to the relevant metrics. For example, a tag named "region" could be assigned to metrics representing servers in different geographic regions. Filter and group metrics in DataDog by tags. You can then use these tags to filter, group, and aggregate metrics in various ways. For example, you can group metrics by tags such as "application," "environment," or "owner" to gain insights into how different parts of your infrastructure are performing. You can also filter metrics by tags to view just the metrics you're interested in. Use tags to set alerts and dashboards. You can create alerts and dashboards in DataDog based on tags. This allows you to monitor specific aspects of your infrastructure, such as servers in a particular region or metrics related to a specific application. DataDog's metric tagging feature is highly flexible and customizable, allowing you to adapt it to your specific organizational needs. Effectively using metrics tagging will grant you a better understanding of your application’s performance and help your organization make data-driven decisions. Read on to see an example of how MongoDB’s DataDog tagging integration can help organize your metrics and how you can use it to drive your business’ monitoring requirements.
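The "labels" shape that Atlas clusters carry is an array of key/value pairs. As a minimal sketch, assuming a helper of our own naming, a plain dict of labels can be converted into that shape before sending it to the cluster update endpoint (treat the exact endpoint and workflow as something to confirm in the Atlas documentation):

```python
# Sketch: converting a flat dict of labels into the key/value array shape
# that Atlas clusters use. build_labels_payload is our own illustrative
# helper, not part of any MongoDB SDK; sorting keeps the output stable.

def build_labels_payload(labels: dict) -> dict:
    """Return a request body fragment with labels as [{key, value}, ...]."""
    return {"labels": [{"key": k, "value": v} for k, v in sorted(labels.items())]}

payload = build_labels_payload({"app_name": "leafy", "usage": "production"})
# payload == {"labels": [{"key": "app_name", "value": "leafy"},
#                        {"key": "usage", "value": "production"}]}
```

A payload built this way would be merged into the cluster update request; once the integration syncs, each key/value pair surfaces in DataDog as a `key:value` tag.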
Setting cluster labels in Atlas In the example below, I’ll use MongoDB’s Data API to set new labels on my MongoDB Atlas clusters. To explore the new MongoDB Data API, you can use the public Postman workspace provided by the MongoDB developer relations team. You can use the cluster configuration API endpoint to add labels to a cluster in MongoDB. You can then use the get all clusters endpoint to verify that your labels were applied correctly. Once you’ve labeled your MongoDB clusters, set up your DataDog integration as you would normally. For more information on configuring MongoDB’s DataDog integration, refer to the documentation here. Using DataDog's metrics tagging to drive business logic After setting up your DataDog integration, you’ll automatically start seeing the labels you previously defined in MongoDB show up as tags on your DataDog metrics. You can now use these tags to filter, organize, and define logic based on the needs of your business. As an example, you can filter results in the DataDog Metrics Explorer. If I only want to see metrics for clusters associated with the “leafy” application, I can use tags to filter the metrics I get back. You can also use tags to define custom logic for your DataDog monitors. In this example, I’m configuring monitors to trigger different priority alerts based on the metrics tags. When query targeting exceeds the threshold for production clusters, I’ll get a P2 alert. However, the same alert for non-production environments wouldn’t have the same priority. In this example, I can use the “usage” metrics tag to define different alerting priorities. And finally, DataDog metrics tagging can help you organize and filter data in dashboards. Metrics tagging enables you to easily group related data together and create more focused and specific dashboards. For example, you can use the app_name tag here to filter on just the leafy application.
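Under the hood, the filters above boil down to DataDog's metric query syntax, `agg:metric.name{tag:value,...}`. As a sketch, a small helper (our own, illustrative) can assemble such query strings; the metric name below is a placeholder, not necessarily one the MongoDB integration emits:

```python
# Sketch: assembling a DataDog-style metric query string with tag filters.
# datadog_query is an illustrative helper; the metric name is a placeholder.

def datadog_query(metric: str, tags: dict, agg: str = "avg") -> str:
    """Build 'agg:metric{tag:value,...}'; '*' matches everything when no tags."""
    filt = ",".join(f"{k}:{v}" for k, v in sorted(tags.items())) or "*"
    return f"{agg}:{metric}{{{filt}}}"

q = datadog_query("mongodb.atlas.connections.current",
                  {"app_name": "leafy", "usage": "production"})
# q == "avg:mongodb.atlas.connections.current{app_name:leafy,usage:production}"
```

The same `{usage:production}` style of scoping is what lets a monitor fire a P2 alert for production clusters while leaving non-production ones at a lower priority.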
This will help you quickly identify and troubleshoot issues. Using tags effectively can grant deeper insights into your data and help your organization make more informed decisions to improve the performance and availability of your applications and infrastructure. This new enhancement to MongoDB’s DataDog integration provides significantly more flexibility in how you use DataDog and helps you get the most out of your investment. The DataDog integration is available on MongoDB Atlas on M10 cluster tiers and higher. Learn more about DataDog’s powerful tagging capabilities. If you’d like to see MongoDB’s DataDog integration in action, the easiest way to get started is to sign up for MongoDB Atlas , our cloud database service.
Building an Industrial Unified Namespace Architecture with MongoDB and Arcstone
The fourth industrial revolution, also known as Industry 4.0, is rapidly transforming the manufacturing industry. Leveraging I4.0 reference architectures and Industrial IoT technologies, factories generate more data than ever. Market analyst reports tell us that the global number of Industrial IoT connections will increase to 36.8 billion in 2025. As factories become more connected and data-driven, it is essential to have a unified and standardized approach for manufacturing data management. In this article, we explain how MongoDB helps create an Industrial Unified Namespace (IUN) architecture that can act as a contextualized repository for data and information for all manufacturing assets. Manufacturing companies have been leveraging the International Society of Automation’s standard 95 (ISA-95) to develop automated interfaces between industrial control systems and enterprise systems. ISA-95 provides a hierarchical model for interfacing and integration, also known as the automation pyramid. Figure 1 shows the five levels of the automation pyramid. Figure 1: ISA-95 Automation Pyramid. ISA-95 was introduced in 2000 to improve communication and data exchange between different levels of the manufacturing industry. With the advent of Industrial IoT (IIoT), the limitations of the ISA-95 model have become increasingly apparent. Lack of Interoperability: The model was developed for a more traditional, hierarchical approach to manufacturing, where there is a clear separation between operational technology (OT) and information technology (IT). In contrast, IIoT tries to blur the lines between OT and IT, with a greater emphasis on data interoperability and real-time analytics. Limited Flexibility and Agility: The rigid and hierarchical structure imposed by the automation pyramid goes against Industry 4.0 concepts of flexibility and agility. The data captured by sensors must go through the SCADA and MES layers to reach the top level.
This makes it difficult for manufacturers to adapt to changing production requirements and integrate IIoT technology into their existing systems. Limited Scalability: The ISA-95 model was designed for a traditional manufacturing environment with a limited number of production lines and machines. However, with the growth of Industry 4.0, the number of connected devices and the amount of data generated has increased dramatically. The automation pyramid does not easily scale to handle this increased volume of data and devices, leading to potential bottlenecks and inefficiencies in the manufacturing process. For example, if a new machine is added to the production line, ISA-95 requires significant changes to the factory IT and OT architecture, which can be time-consuming and costly. Industrial unified namespace (IUN) architecture with MongoDB In order to overcome these challenges, we propose that manufacturers adopt an Industrial Unified Namespace (IUN) architecture leveraging MongoDB technology. Such an architecture will provide a single view of all manufacturing processes and equipment and will enable data interoperability between different layers of the ISA-95 automation pyramid. Figure 2 shows a conceptual diagram of the IUN architecture. Figure 2: Event driven industrial unified namespace IUN follows an event-driven architecture topology where different manufacturing applications publish events in real-time (publishers) to the central MongoDB Atlas database. Application services subscribe asynchronously to the event types or topics of interest and consume them at their own speed (consumers). This results in a decoupled ecosystem allowing applications and services to act interchangeably to provide and consume data when and where needed in real-time. It is understood that many applications and services may produce and consume data at the same time. MongoDB Atlas database plays a central role in the IUN architecture.
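The consumer side of this publish/subscribe topology can be sketched in miniature: a routing function that delivers one change event (shaped like the change stream documents MongoDB emits, with `operationType`, `ns`, and `fullDocument` fields) to every subscriber of its collection. The routing logic and names here are illustrative, not part of any MongoDB API:

```python
# Minimal sketch of an IUN consumer: route a change-stream-shaped event to
# the services subscribed to its collection. route_event is our own helper.

def route_event(event: dict, subscribers: dict) -> list:
    """Deliver one change event to every subscriber of its collection."""
    coll = event["ns"]["coll"]          # namespace: which collection changed
    delivered = []
    for name, callback in subscribers.get(coll, []):
        callback(event.get("fullDocument", {}))
        delivered.append(name)
    return delivered

alerts = []
subs = {"edge_events": [("alerting", lambda doc: alerts.append(doc))]}
event = {"operationType": "insert",
         "ns": {"db": "iun", "coll": "edge_events"},
         "fullDocument": {"machine": "press-01", "alert": "overheat"}}
route_event(event, subs)   # the alerting service receives the new document
```

In a real deployment the events would arrive via a change stream or Kafka topic rather than an in-memory call, but the decoupling is the same: publishers never know which consumers exist.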
The events can flow in through MongoDB Kafka Connector or Atlas Device Sync and MongoDB Atlas can aggregate, persist and serve them to consuming manufacturing applications. The core MongoDB Atlas database in this scenario provides a central repository for multiple independent event streams and the developer data platform helps to drive operational and analytical apps providing a complete end-to-end view of the production process. Data modeling for industrial unified namespace The document model is the most natural way to work with data stored in the database. It is simple for any developer to learn how to code against MongoDB, and as a result, industry surveys show it is wildly popular amongst developers. MongoDB provides flexible data modeling options to create a central repository for all factory production data. Asset-centric data model: Focuses on the assets, for example machines, equipment, tools in the manufacturing process. This data model is useful for tracking the performance, maintenance, and utilization of assets. Process-centric data model: Focuses on the day-to-day production processes. Such a data model is useful in optimizing the process flow and reducing bottlenecks. Product-centric data model: Focuses on the products produced in the manufacturing process. This data model is useful for tracking the production and quality of individual products. It is possible for a factory to have all three models at the same time. In fact, it is common for factories to use multiple data models and integrate them as needed to gain a complete view of their operations. For example, a factory may use an asset-centric model to track its equipment, and a product-centric model to track its finished goods, while also using a process-centric model to optimize its manufacturing processes. Let us take an example of a bicycle factory and look at example asset-, process-, and product-centric data models.
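As a sketch of what an asset-centric document might look like, the snippet below combines master data, telemetry, and job context into one document. Every field name, value, and source attribution here is an illustration, not the actual schema from the article's figures:

```python
from datetime import datetime, timezone

# Illustrative asset-centric document: one machine, with data from three
# source systems merged into a single document. All field names are
# assumptions made for this sketch.
equipment_doc = {
    "_id": "welding-robot-07",
    "source_systems": ["MES", "IIoT", "ERP"],
    "asset": {                      # from ERP: equipment master data
        "manufacturer": "Example Robotics",
        "model": "WR-2000",
        "commissioned": datetime(2021, 3, 1, tzinfo=timezone.utc),
    },
    "telemetry": {                  # from the IIoT platform: latest readings
        "temperature_c": 41.7,
        "vibration_mm_s": 2.3,
        "updated_at": datetime(2023, 5, 4, 12, 0, tzinfo=timezone.utc),
    },
    "production": {                 # from MES: current job context
        "current_job": "frame-batch-118",
        "oee_percent": 87.5,
    },
}
```

Because the document model imposes no fixed schema, adding a new sensor or a new source system means adding fields to documents like this one, not migrating tables.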
At a minimum, the following collections (Figure 3) will need to be created in the database. Figure 3: MongoDB collections for different IUN data models Each collection will have data coming from different sources such as Manufacturing Execution System (MES), IIoT Platform, and Enterprise Resource Planning (ERP) systems. An example document from the production equipment collection is shown in Figure 4. As can be seen, the data comes from various sources, and the MongoDB document model makes it very easy to combine this data in one document, generating a digital twin prototype of the machine. Figure 4: A sample document from the Production Equipment collection Architecture for industrial unified namespace Let us take our bicycle factory and create a solution architecture for the Industrial Unified Namespace. First, let us list all the event producers and consumers. All these systems both consume and publish events: IoT Gateways / Edge Server Supervisory Control and Data Acquisition (SCADA) / Shop Floor Connectivity Platform (SCP) Manufacturing Execution System (MES) Enterprise Resource Planning (ERP) Arcstone toolsets for smart manufacturing Arcstone is a Singapore/US-based Industry 4.0 solutions company providing modular-based, next-generation MES alongside hardware integration and process orchestration toolsets. Arcstone delivers success to companies from diverse industries, including Global Fortune 500 companies, manufacturing companies, emerging facility management firms, and SMEs, globally. Arcstone arc.ops MES contains 15+ modules for full operational management that can be custom-tailored to specific requirements, and is built to be end-user configurable for easy, intuitive use. Arcstone understands that extracting data from legacy equipment is a challenging task. Therefore, they have created a low-code solution named arc.quire to handle the collection of raw data and streaming into a database for storage.
arc.quire is used in tandem with a process orchestration tool called arc.flow to establish connectivity between arc.quire and the database, for example, MongoDB Enterprise Advanced (EA). Depending on the connectivity interface exposed by the production equipment, SCADA or SCP software can connect to the equipment and push the raw events and alerts to arc.quire running in the edge server. MongoDB’s Enterprise Operator for Kubernetes gives the flexibility to run MongoDB as a container in resource-constrained environments such as our IoT edge server. Figure 5 shows how the edge server can be connected with the SCADA and IoT gateways on the production shop floor. Figure 5: Edge Server with MongoDB and Arcstone toolsets The edge server performs the following functions: Aggregation of IIoT events and alerts via arc.quire Real-time analytics such as machine fault detection, process optimization, and process control via the MongoDB aggregation framework Transmitting control instructions back to the equipment via arc.quire Raw data and analytical results storage in MongoDB Edge servers act as one of the event producers for IUN. Using the MongoDB Kafka connector, events can be transmitted from the edge server to a centralized data repository in MongoDB Atlas. Figure 6: MongoDB can serve as both a Sink and a Source for Apache Kafka Bringing it all together Figure 7 shows the complete technical architecture of the Industrial Unified Namespace with MongoDB Atlas Developer Data Platform and Arcstone. Figure 7: In this architecture, arc.ops MES, ERP, and edge server publish data to the message stream in Apache Kafka where the event queue makes the data available for MongoDB Atlas to consume via Kafka connectors [1 and 2]. Depending on the factory requirements around batch processing and scalability, Kafka can be replaced by an MQTT broker. There are multiple community-backed and commercial libraries to push MQTT data into MongoDB.
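The Kafka-to-Atlas leg of this architecture is typically wired up with a sink connector configuration. The sketch below uses the property names documented for the MongoDB Kafka sink connector; the connector name, topic, connection string, and namespace values are placeholders to adapt:

```python
# Sketch of a MongoDB Kafka sink connector configuration that lands edge
# events from a Kafka topic into an Atlas collection. The property keys
# follow the MongoDB Kafka connector; all values are placeholders.
sink_config = {
    "name": "iun-edge-events-sink",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
        "topics": "factory.edge.events",                      # source topic
        "connection.uri": "mongodb+srv://<user>:<password>@cluster0.example.mongodb.net",
        "database": "iun",                                    # target database
        "collection": "edge_events",                          # target collection
    },
}
```

Posted to a Kafka Connect worker, a configuration like this drains each event on the topic into the central `edge_events` collection; a source connector configured the other way around lets Atlas feed events back out to Kafka.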
The centralized database aggregates and persists events, enriches event streams with data from all sources, including historical data, and provides a central repository for multiple event streams. This enables applications and users to benefit from all data across all microservices and provides a unified view of the state across the factory. Atlas also provides Atlas Charts for event visualization as well as Atlas Search for full-text search of events [3 and 4]. MongoDB’s Atlas Triggers provide a serverless way of consuming change stream events [5]. With Triggers, the manufacturer doesn't have to set up its own application server to run its change data capture process. Change streams flow change data to Atlas Triggers to create responsive, event-driven pipelines. Finally, Atlas Device Sync and the Realm SDK can be leveraged to push real-time notifications and alerts to shop floor applications for users to consume. Use cases Predictive maintenance IUN can be deployed as the foundation for predictive maintenance applications. The edge server streams time-series event data from the production equipment into MongoDB to drive machine-learning models that will detect equipment health and performance degradation trends. The data is enriched using data streams about production jobs from MES. The factory can either repair equipment or swap it out for replacement parts before shutting down production lines. Atlas Device Sync can alert engineers on the shop floor to potential equipment failures, and help the company optimize the equipment maintenance strategy. Operational data layer The IUN architecture can be used to create a manufacturing Operational Data Layer (ODL). An ODL strives to centrally integrate and organize all siloed manufacturing IT/OT data and makes it accessible to stakeholders across the factory floor.
This ODL will combine data from both OT and IT sources into a single MongoDB Atlas database where Atlas Search and Charts can be used to analyze this data and drive actions on the shop floor. IUN captures any changes in source systems and streams them into MongoDB to keep the ODL fresh, and helps to update the source systems in real-time. Conclusion In conclusion, the ISA-95 Automation Pyramid presents significant challenges for the manufacturing industry, including a lack of flexibility, limited scalability, and difficulty integrating new technologies. By adopting an Industrial Unified Namespace architecture with Arcstone and MongoDB, manufacturers can overcome these challenges and achieve real-time visibility and control over their operations, leading to increased efficiency and improved business outcomes. To learn more about MongoDB’s role in the manufacturing industry, please visit our Manufacturing and Industrial IoT page.
Connected Devices - How GE HealthCare Uses MongoDB to Manage IoT Device Lifecycle
GE HealthCare, a global leader in medical technology, has turned to MongoDB to manage the lifecycle of its IoT devices, from deployment (Beginning of Life or BoL) to retirement (End of Life or EoL). At GE HealthCare, MongoDB Atlas is used to persist device and customer data. These related data layers are utilized by the organization to develop customer experience strategies by providing greater efficiency, improving patient outcomes, and increasing access to care. The MongoDB document model easily combines data from diverse source systems while preserving its full fidelity. This flexibility allows seamless onboarding of new customers and related data sources without requiring time-consuming schema modifications. According to Emir Biser, Senior Data Architect at GE HealthCare, MongoDB Atlas is very appealing to the team because of its effective management, built-in monitoring and backup, global vertical and horizontal scalability, built-in security, and multi-cloud support. MongoDB Atlas is a gamechanger. This technology stack is helping us streamline commercialization and bring market-ready solutions to deliver advanced healthcare. Some of the recent tests resulted in an 83%* decrease in retrieval time for critical data elements. When all these features are put together, the tech stack is designed to help healthcare providers enhance productivity by reducing the complexity and time required to manage databases, enabling faster deployment of IoT devices. Enhancing the IoT life cycle with MongoDB GE HealthCare’s tech stack is designed to accelerate the integration of healthcare applications by connecting IoT devices together with additional data sources into an aggregated clinical data layer. As the IoT device connections are established, multiple services are applied on the platform to support analytic and clinical applications.
Beginning of life - Device provisioning and configuration As the device is being manufactured, the device parameters such as MAC and serial number are stored in MongoDB as a device digital representation. When the device is turned on, the GEHC team gets information about the device usage and the customer information. This information is used to validate the device. MongoDB plays a crucial role in device provisioning by persisting the configuration information and making sure that the device is set up with the right configuration parameters. MongoDB change streams are used at this stage to make sure that the device gets the right parameters at the BoL stage. Middle of life - Device usage and maintenance Once the device comes online, it transmits both clinical and non-clinical information. The team at GE HealthCare uses MongoDB Atlas to help ensure clear separation between clinical and non-clinical data, as permissions, sensitivity, and access differ, and to understand how the device is being used compared to its standard configuration parameters. MongoDB’s real-time analytics capabilities help track key device performance metrics, such as battery life, and identify trends and patterns in device usage. This enables the team to proactively address device issues and improve overall device performance and reliability for customers. GEHC is able to share these insights with customers to help optimize use of devices within their enterprise. MongoDB Atlas Search is used to retrieve information about the status of connected devices and usage patterns. Compound GeoJSON queries in Atlas Search are used to look at products in a certain geographic region. Horizontal scalability with automatic sharding across clusters ensures Edison applications can continue to be cost effective while delivering real-time results.
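The BoL provisioning step described above, creating a device's digital representation with the configuration parameters it must receive, can be sketched as follows. The helper and every field name are illustrative assumptions, not GE HealthCare's actual schema:

```python
# Sketch of the "digital representation" document created at BoL, keyed by
# serial number and carrying the configuration the device must receive.
# provision_device and all field names are illustrative assumptions.

def provision_device(serial: str, mac: str, config: dict) -> dict:
    """Build the initial document for a newly manufactured device."""
    return {
        "_id": serial,
        "mac": mac,
        "lifecycle_stage": "BoL",
        "provisioned": False,       # flipped once the device confirms its config
        "config": config,
    }

doc = provision_device("SN-0042", "00:1B:44:11:3A:B7",
                       {"firmware": "1.4.2", "report_interval_s": 30})
```

In the workflow the article describes, a change stream watching this collection would pick up the insert and push the configuration parameters down to the device when it first comes online.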
MongoDB’s security features, including authorization, authentication, and encryption, work with GEHC processes to support the teams working to protect device data from unauthorized access. End of life - Device decommissioning and archiving When the IoT device reaches the end of its lifecycle, GE HealthCare needs to decommission it and ensure that any data associated with the device is securely archived. By using MongoDB’s TTL (time-to-live) indexes, the team automates the process of data deletion, reducing the data footprint. In addition, Atlas Online Archive helps to ensure that the data is always backed up and securely archived, reducing the risk of data loss and corruption. The authentication and authorization mechanisms help to ensure that decommissioned devices' data can only be accessed by authorized personnel. The future of GE HealthCare According to Emir, the teams using MongoDB Atlas are excited about the benefits it brings, and they are looking forward to exciting new developments in the Atlas platform. We are helping teams achieve business goals across Imaging, Ultrasound Digital Solutions, and Patient Care Solutions. Our current strategy focuses on building solid pipelines to further help our medical device engineering teams deliver interoperability resulting in better care for our customers. More on managing massive numbers of IoT devices The Internet of Things (IoT) is transforming the healthcare industry by providing real-time, actionable insights that improve patient outcomes and drive operational efficiencies. According to market analyst reports , the global IoT healthcare market is projected to reach around USD 446.52 billion by 2028 while exhibiting a CAGR of 25.9% between 2021 and 2028. In hospitals, IoT-enabled medical devices help improve patient safety and clinical experience by transmitting real-time monitoring and alerts in the event of device malfunctions or irregularities.
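MongoDB's TTL mechanism works through an index on a date field with an `expireAfterSeconds` option: documents are deleted automatically once the indexed timestamp is older than the window. A minimal sketch of such an index specification, with an assumed field name and retention period:

```python
# Sketch of a TTL index specification: MongoDB deletes a document once the
# value in the indexed date field is older than expireAfterSeconds.
# The field name and 30-day retention window are assumptions for this sketch.

def ttl_index_spec(field: str, retention_days: int) -> dict:
    """Build the index document for a TTL index on a date field."""
    return {
        "key": {field: 1},                       # single-field ascending index
        "name": f"{field}_ttl",
        "expireAfterSeconds": retention_days * 86400,
    }

spec = ttl_index_spec("decommissionedAt", 30)
# spec["expireAfterSeconds"] == 2592000  (30 days in seconds)
```

Passed to `createIndexes` (or a driver's `create_index`), a spec like this means that stamping a device document with `decommissionedAt` at EoL is enough to schedule its deletion, with no batch cleanup job to run.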
The life of an IoT device can be divided into three main stages: Beginning of Life (BoL), Middle of Life (MoL) and End of Life (EoL). During the BoL stage, the key activities are deployment design and provisioning. In this stage the device may be preloaded with default credentials and configuration files. Once the device is installed and comes online, the focus in the MoL is to maintain its basic functional purpose as well as regularly updating firmware for reliability and security purposes. Over time, as new versions of devices are manufactured, the deployed devices need to be decommissioned by revoking the device certificate, archiving device data and disabling the model of device in the cloud as part of the EoL stage. Figure 1: Three stages of IoT device lifecycle management In each of the stages, the device has to be maintained to stay reliable, efficient, persistent and secure. Setting up telemetry from device to cloud/back end is just the tip of the iceberg. As the number of IoT devices deployed in healthcare continues to grow, so does the challenge of managing them efficiently. The large amount of data generated creates scalability challenges for IoT device management systems, which need to be able to handle large amounts of data and support the increased traffic. Different communication protocols make it challenging to integrate these devices into a unified system. Maintaining standard communication protocols and interoperability is critical to ensure seamless communication between devices and cloud backend. Finally, with the increasing number of cyber-attacks targeting IoT devices, it is critical to have robust security measures in place to protect against threats. To learn more about GEHC digital offerings, please visit https://apps.gehealthcare.com/ Test performed internally by GE HealthCare on company datasets and may not be replicable.
To learn more about MongoDB’s role in the healthcare and manufacturing industry, please visit our Manufacturing and Industrial IoT and Healthcare pages.
Three Major IoT Data-Related Challenges and How to Address Them
IoT has become a crucial component of future-oriented solutions and holds massive potential economic value. McKinsey & Company estimates that by 2030, IoT (Internet of Things) will enable $5.5 trillion to $12.6 trillion in value worldwide, including the value captured by consumers and customers. For proof of its growing popularity and consumers’ dependency on it, you likely don't need to look any further than your own wrist. From fitness bands to connected vehicles, smart homes, and fleet-management solutions in manufacturing and retail, IoT already connects billions of devices worldwide, with many more to come. As more IoT-enabled devices come online, with increasingly sophisticated sensors, choosing the right underlying technology to make IoT solutions easier to implement and help companies seize new innovative opportunities is essential. In this blog, we will discuss how MongoDB has successfully addressed three major IoT data-related challenges across various industries, including Manufacturing, Retail, Telecommunications, and Healthcare. The challenges are the following: Data Management Real-Time Analytics Supply Chain Optimization Figure 1: MongoDB Atlas for IoT Let's dive right in! Data management Storing, transmitting, and processing the large amount of data that IoT devices produce is a significant challenge. Additionally, the data produced by IoT devices often comes in variable structures. This data must be carefully timestamped, indexed, and correlated with other data sources to provide the context required for effective decision-making. This combination of data volume and complexity makes it difficult to effectively and efficiently process data from IoT devices. Bosch Consider Bosch IoT Suite, a family of products and services in IoT device management, IoT data management, and IoT edge by Bosch Digital. These products and services span over 250 international IoT projects and over 10 million connected devices.
Bosch implemented MongoDB to store, manage, and analyze data in real time. MongoDB’s ability to handle structured, semi-structured, and unstructured data, and efficient data modeling with JSON make it easy to map the information model of each device to its associated document in the database. In addition, dynamic schemas support agile development methodologies and make it simple to develop apps and software. Adding new devices, sensors, and assets is easy, which means the team can focus on creating better software. ThingSpace Another example is that of ThingSpace , Verizon’s market-leading IoT connectivity management platform, which provides the network access required to deliver various IoT products and services. Verizon works with companies that purchase network access from it to connect their devices, bundled together with their own solutions, which they sell to end users. ThingSpace’s customers each sell an IoT product that needs reliable connectivity to ensure the devices always work, which WiFi cannot offer. Verizon’s monolithic RDBMS-based system would not be able to scale to handle both transactional and time-series workloads, so Verizon decided it needed a distributed database architecture going forward. MongoDB proved to be the only solution that scaled to meet Verizon’s requirements across different use cases and combinations of workload types. The immense processing needs resulting from the high number of devices and high velocity of messages coming in were addressed by MongoDB’s highly available, scalable architecture. Native MongoDB time series collections allow for improved performance through optimized storage with clustered indexes and optimized time series query operators. MongoDB's advanced capabilities, such as flexible data modeling, powerful indexing, and time series collections, provide an effective solution for managing the complex and diverse data generated by IoT devices.
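Time series collections are created with a dedicated set of options: `timeField` (the measurement timestamp), `metaField` (the per-device metadata used for grouping), and `granularity`. The option names below are MongoDB's; the collection layout, field names, and retention window are placeholders for this sketch:

```python
# Sketch of the options for creating a MongoDB time series collection for
# sensor readings. timeField/metaField/granularity/expireAfterSeconds are
# MongoDB's option names; the values chosen here are illustrative.
timeseries_options = {
    "timeseries": {
        "timeField": "ts",          # timestamp of each measurement
        "metaField": "device",      # per-device metadata used for grouping
        "granularity": "seconds",   # expected interval between measurements
    },
    "expireAfterSeconds": 90 * 86400,  # optional retention for raw readings
}
```

Passed to a driver's `create_collection` call (for example, `db.create_collection("sensor_data", **timeseries_options)` in PyMongo), options like these give the optimized storage and query behavior described above without any application-side changes to how documents are inserted.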
Real-time analytics Real-time data analytics, one of the most crucial parts of big data analytics today, enables businesses to make more data-driven decisions in real time. However, despite its importance, very few organizations can respond to changes in data minute by minute or second by second. Many challenges arise when it comes to the implementation of real-time analytics for enterprises. Storing such a huge volume of data and analyzing it in real time is an entirely different story. Thermo Fisher Cloud Let’s consider the Thermo Fisher Cloud, one of the largest cloud platforms for the scientific community on AWS. MS Instrument Connect allows Thermo Fisher customers to see live experiment results from any mobile device or browser. Each experiment produced millions of "rows" of data, which led to suboptimal performance with existing databases. Internal developers needed a database that could easily handle a wide variety of fast-changing data. MongoDB's expressive query language and rich secondary indexes provided the flexibility to support both ad-hoc and predefined queries customers needed for their scientific experiments. Anytime I can use a service like MongoDB Atlas, I’m going to take that so that we at Thermo Fisher can focus on what we’re good at, which is being the leader in serving science. Joseph Fluckiger, Sr. Software Architect @Thermo Fisher MongoDB Atlas scales seamlessly and is capable of ingesting enormous amounts of sensor and event data to support real-time analysis for catching any critical events or changes as they happen. That gives organizations new capabilities, including: Capturing streaming or batch data of all types without excessive data mapping Analyzing data easily and intuitively with a built-in aggregation framework Delivering data insights rapidly and at scale with ease With MongoDB organizations can optimize queries to quickly deliver results to improve operations and drive business growth.
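The built-in aggregation framework mentioned above expresses rollups as a pipeline of stages. The sketch below shows such a pipeline (as you would pass it to a MongoDB driver) alongside a plain-Python equivalent so the computed result is visible; all field names are illustrative:

```python
# A $match + $group/$avg pipeline as it would be passed to a MongoDB driver,
# plus a plain-Python equivalent of the same rollup. Field names are
# illustrative assumptions for this sketch.
pipeline = [
    {"$match": {"sensor": "temperature"}},
    {"$group": {"_id": "$device", "avg_value": {"$avg": "$value"}}},
]

def average_by_device(readings):
    """Plain-Python equivalent of the pipeline above: per-device average."""
    sums = {}
    for r in readings:
        if r["sensor"] != "temperature":   # $match stage
            continue
        total, count = sums.get(r["device"], (0.0, 0))
        sums[r["device"]] = (total + r["value"], count + 1)
    return {dev: total / count for dev, (total, count) in sums.items()}

readings = [
    {"device": "oven-1", "sensor": "temperature", "value": 180.0},
    {"device": "oven-1", "sensor": "temperature", "value": 184.0},
    {"device": "oven-1", "sensor": "humidity", "value": 40.0},
]
average_by_device(readings)   # {"oven-1": 182.0}
```

In production the database evaluates the pipeline server-side over the full collection, so results like these arrive without shipping millions of raw readings to the application.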
Supply chain optimization

Items move through many locations in the supply chain, making it hard to maintain end-to-end visibility throughout their journey. A lack of control at any stage can dramatically harm planning efficiency, slow down the entire supply chain, and ultimately lower the return on investment. From optimizing warehouse space by sourcing raw materials as needed to delivering real-time supply chain insights, IoT-enabled supply chains can significantly optimize these processes by eliminating blind spots and inefficiencies.

Longbow Advantage

Longbow Advantage delivers substantial business results by enabling clients to optimize their supply chains. Millions of shipments move through multiple warehouses every day, generating massive quantities of data that must be analyzed for real-time visibility and reporting. Its flagship warehouse visibility platform, Rebus, combines real-time performance reporting with end-to-end warehouse visibility and intelligent labor management. Longbow needed a database solution that could process data at that scale and deliver the real-time warehouse visibility and reporting at the heart of Rebus, and it knew it could not rely on monolithic, time-consuming spreadsheets to do so. It became clear that MongoDB's document database model was a good match and would allow Rebus to gather, store, and build visibility into disparate data in near real time.

Another key component of smart supply chain solutions is IoT-enabled mobile apps that provide real-time visibility and facilitate on-the-spot, data-driven decisions. In such situations, the offline-first paradigm becomes crucial, since staff need access to data in areas where connectivity is poor or nonexistent. Realm by MongoDB is a lightweight, object-oriented embedded database technology for resource-constrained environments, making it an ideal solution for storing data on mobile devices.
By using MongoDB's Realm SDKs, which wrap the Realm database, and Atlas Device Sync, which enables seamless data synchronization between MongoDB and Realm on a mobile device with minimal developer effort, businesses can rapidly develop mobile applications and drive innovation. MongoDB provides a powerful solution for IoT-enabled supply chains, optimizing processes and eliminating inefficiencies so organizations can make data-driven decisions and improve supply chain efficiency.

Conclusion

The IoT industry is rapidly evolving, and as the number of connected devices grows, so do the challenges faced by businesses leveraging these solutions. Through a range of real-world use cases, we have seen how MongoDB has helped businesses manage IoT data, perform real-time analytics, and optimize their supply chains, driving innovation in a variety of industries. With its unique features and capabilities designed to handle the heavy lifting for you, MongoDB is well positioned to continue playing a crucial role in the ongoing digital transformation of the IoT landscape.

Want to learn more or get started with MongoDB? Check out our IoT resources:

MongoDB IoT Reference Architecture
Migrate existing applications with Relational Migrator
MongoDB & IIoT ebook
IoT webpage
Bosch IoT and the Importance of Application-Driven Analytics
"Without data, it's just trial and error." So says Kai Hackbarth. He's a senior technology evangelist at Bosch Global Software with over 22 years of industry experience in the Internet of Things. "Be it automotive, be it industrial, be it smart-home, we've done it all," Hackbarth said. Except for tires. That's his latest focus and technology challenge. "Sounds maybe a bit simple," Hackbarth said, "but if you think about it more deeply, [it's really complex]." Because, as it turns out, tires can collect many different data points that tell a system a great deal about what's going on with the car at any given moment. "Pressure, temperature, accelerometer," Hackbarth said. "And then you also have other data from the car that's critical for safety and sustainability." But to be of any value, that data needs to be analyzed as close to the source as possible and in real time. Why? "It's safety-critical," Hackbarth said. "If you send all the raw data to the cloud, this consumes a lot of costs." Chief among those costs: time. To react to an issue, that data cannot be historical, because historical data about a tire that's losing pressure or hydroplaning isn't helpful to the applications inside the car that need to respond to these developments as they happen. And thankfully for Hackbarth and his team at Bosch, there now exists the ability to bring real-time analytics into applications that can handle lots and lots of data.

Smarter applications start with built-in analytics

Traditionally, applications and analytics have existed as two separate classes of workloads with different requirements, for example: read and write access patterns, as well as concurrency and latency.
As a result, businesses have usually deployed purpose-built data stores, including both databases for applications and data warehouses for analytics, and piped or duplicated the data between them. And that's been fine when analytics don't need to affect how an application responds in real time. But most customers now expect applications to take intelligent actions in the moment, rather than after the fact. The same principle applies to Bosch's tire project: the applications inside the car that can autonomously brake when approaching another vehicle too fast, or slow down if the tire senses that it's hydroplaning, also need to analyze all the data from all the sensors in real time. This process of building real-time analytics into applications is known as "application-driven analytics." And it's how applications get smarter, be they e-commerce apps on your phone, safety apps in your car, or those that monitor rocket launches. The question for many development teams, though, is how to build this capability into applications easily. For a long time, that's been a difficult question to answer.

A platform for building real-time analytics into your apps

"From my history [of doing this for] 22 years," Hackbarth says, "we never had the capabilities to do this before." Previously, teams everywhere, not just at Bosch, had to do a lot of custom engineering work to run real-time analytics close to the source, including:

Stitching together multiple databases to handle different data structures (documents, tables, time series measurements, key values, graph), each accessed with its own unique query API.
Building ETL data pipelines to transform data into required analytics formats and tier it from the live database to lower-cost object storage.
Spinning up a federated query engine to work across each data tier, again using its own unique query API.
Integrating serverless functions to react to real-time data changes.
Standing up their own API layers to expose data to consuming applications.

All of this resulted in a multitude of operational and security models to deal with, a ton of data integration work, and lots of data duplication. But that's no longer the case. There are now data platforms that bring operational and analytical workloads together and, in turn, let you combine live operational data with real-time analysis. MongoDB Atlas, the platform with which I am most familiar (since I work at MongoDB), allows developers to do this by providing an integrated set of data and application services that fit their workflows. Developers can land data of any structure, index, query, and analyze it in any way they want, and then archive it. And they can do all of this with a unified API and without having to build their own data pipelines or duplicate data. This is the platform on which the Bosch team continues to build its solutions. "Without data," Hackbarth says, "it's just trial and error." But now, with data and with a single platform to build real-time analytics into their applications, it's something concrete, something responsive, and something actionable. It's something smart. Especially in the most critical of use cases that Bosch is working on. Like tires.

If you'd like to learn more about how to build application-driven analytics into your applications, check out our three-part livestream demo showing how to build application-driven analytical applications in MongoDB Atlas. During this three-part series, we build real-time analytical capabilities for a simulated system for managing rocket launches. Part One covers the basics: building complex analytical queries using MongoDB's aggregation framework and building visualizations using Charts.
Part Two highlights additional Atlas capabilities that are often invaluable when building app-driven analytics: Atlas Search, Triggers, and embedding Charts-generated visualizations into an application UI. Part Three focuses on how to use Atlas Data Federation, Atlas Data Lake, and the Atlas SQL Interface to perform analytics using large sets of historical data and federated queries across multiple data sources.
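The core idea of application-driven analytics, serving an operational view and an analytical rollup from the same live data in one round trip, can be sketched with a $facet pipeline. The collection and field names below (a "launches" collection with "vehicle", "launchedAt", and "success" fields) are invented for illustration and are not the demo's actual schema:

```python
# One aggregate() call that returns both the operational view
# (five most recent launches) and an analytical rollup
# (success rate per vehicle), using $facet sub-pipelines.
pipeline = [
    {"$facet": {
        "latest": [
            {"$sort": {"launchedAt": -1}},
            {"$limit": 5},
        ],
        "successRateByVehicle": [
            {"$group": {
                "_id": "$vehicle",
                "launches": {"$sum": 1},
                # Count a success when the boolean "success" field is true.
                "successes": {"$sum": {"$cond": ["$success", 1, 0]}},
            }},
            {"$addFields": {
                "successRate": {"$divide": ["$successes", "$launches"]},
            }},
        ],
    }},
]
# With a live cluster: result = db.launches.aggregate(pipeline)
```

Because both facets run against the same live collection, the application UI and its embedded analytics never disagree about the state of the data, which is the point of collapsing the two workload classes onto one platform.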
Operationalize Digital Transformation with a Robust Analytical Data Store
In today's rapidly evolving technological landscape, digital transformation has become a critical business imperative. The key drivers and objectives of transformation efforts are to enhance decision-making, maximize efficiencies, and create compelling customer experiences. To achieve these goals, organizations require an analytical data store with modern data storage, analytical capabilities, high scalability, and fast querying speed. Data is growing at an exponential rate: the total amount of data generated annually is projected to reach 175 zettabytes by 2025, and the global big data market is expected to reach $273.4 billion by 2026, highlighting the growing importance of data analytics in today's business landscape. As data becomes more critical for organizations, it is necessary to store, manage, and analyze it effectively to gain the insights that drive business success.

Industry drivers

Stronger decisioning capabilities: to remain competitive and agile in an increasingly crowded marketplace, identify opportunities, optimize processes, and pivot quickly while driving long-term business success and sustainability.
Improved operational efficiency: to optimize processes and resources, and create a competitive business model that can adapt to changing market conditions and customer needs.
Enhanced customer experience: to deliver superior digital experiences that simplify access to information whenever needed and provide seamless interaction across multiple channels.

IT drivers

Modern data store
Analytical capabilities
High scalability
Faster querying speed

Modern data store vs. traditional data store

Modern data stores enable businesses to store and process both structured and unstructured data in a flexible and efficient manner.
Such data stores can analyze large volumes of data to extract valuable insights, while their high scalability helps them handle massive data volumes and maintain optimal performance and productivity. With the fast querying speeds of modern data stores, businesses get access to insights quickly and can make informed decisions. Traditional data stores, by contrast, are limited in scalability and flexibility: as data volumes increase, they struggle to scale, leading to decreased performance and productivity. The MongoDB Analytical Data Store, powered by Exafluence, stores both structured and unstructured data, performs faster and more accurate data analysis, and scales efficiently to unearth valuable insights that are difficult to identify with traditional data stores. Unlike traditional data stores, the MongoDB Analytical Data Store provides optimal performance by leveraging its distributed architecture, automatic data sharding, and horizontal scaling.

Benefits of using the MongoDB Analytical Data Store

The key benefit of the MongoDB Analytical Data Store is its ability to handle large volumes of data without compromising on performance. MongoDB's distributed architecture allows data to be partitioned across multiple servers, providing efficient data processing and faster query response times. Additionally, MongoDB's automatic data sharding enables data to be partitioned and distributed across multiple nodes, ensuring that data is evenly distributed across the cluster. The data store's ability to hold both structured and unstructured data is particularly important for organizations that deal with large data volumes from diverse sources. MongoDB can also handle a variety of data types, including text, geospatial data, time-series data, and binary data.
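The automatic sharding and horizontal scaling described above are configured by choosing a shard key per collection. The sketch below shows the shape of the admin command involved; the namespace and shard key are hypothetical examples, not a recommendation for any particular workload:

```python
# Admin command to shard a high-throughput collection on a hashed key,
# so writes distribute evenly across shards rather than hot-spotting
# on one range. The namespace and key field are illustrative.
shard_collection_cmd = {
    "shardCollection": "analytics.events",
    "key": {"deviceId": "hashed"},
}
# With a live sharded cluster (e.g. via pymongo):
#   client.admin.command(shard_collection_cmd)
```

A hashed shard key trades range-query locality for even write distribution; a range-based key on a compound field is often the better choice when queries target contiguous slices of data.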
Other benefits include:

Flexible document schemas: a schema-flexible database gives the freedom to store data of different types
Easy horizontal scale-out with sharding: shards high-throughput collections across multiple clusters to sustain performance and scale horizontally
Powerful querying and analytics: simplifies access to data and runs complex analytics pipelines
High-performance capabilities: enables faster querying and returns all the necessary information in a single call to the database

Real-world implementation

Case study 1: Utility sector | AI platform for a network of IoT devices

The challenge: The client had over 300K+ connected IoT sensors across their city and needed detailed information on the sensors' communication data to improve decision-making and optimize field visits. The client wanted to use machine-learning models to identify cluster-wise unconnected devices, sensor patterns for choosing gateways, sensors connected across different spreading factors and regions, and the capacity of the gateways.

The solution: Exafluence customized its IoT Devices AI Platform for the client, storing the sensor data in MongoDB, delivering insights on sensor connectivity and behavior, and enabling faster decision-making.

The outcome: The platform helped monitor the performance, availability, and health of a network of IoT devices through a single, all-in-one platform powered by AI.

Case study 2: Healthcare | Modernizing technology architecture

The challenge: The client's data volume and complexity had increased multifold, affecting the performance of applications, databases, and reporting tools. Reports generated with the existing architecture were based on 10-day-old stale data, which was a major concern for the client.

The solution: Exafluence modernized the client's technology architecture and made the necessary programming changes, helping the client generate reports with near real-time data. Leveraging a Spark-based solution, Exafluence completed
the migration from MS SQL Server to MongoDB Atlas, creating an Analytical Data Store on the Atlas platform.

The outcome: Over 850 million records were processed with speed and accuracy, and the load was completed within just two days.

Bottom line

In today's hyper-competitive business environment, the ability to harness the power of data is essential for success. That's where the MongoDB Analytical Data Store comes in. This game-changing solution empowers businesses to manage and analyze massive amounts of data with ease. With MongoDB's cutting-edge analytics, unmatched scalability, and lightning-fast querying abilities, businesses can unlock valuable insights, make informed decisions, and take their operational efficiency up a notch.
Three Ways Retailers Use MongoDB for Their Mobile Strategy
Mobile experiences are a crucial aspect of a retail omnichannel strategy. Retailers strive to create a consistent customer experience as consumers switch between online, in-store, and mobile channels. This presents a complex data management challenge, as views across customer and workforce mobile applications need real-time access to the same data sets, both online and offline. Let's dive into three ways retailers are tackling omnichannel data challenges with MongoDB mobile solutions.

Mobile solutions to omnichannel challenges

Achieving data centricity across channels

Before building any mobile omnichannel solution, you first have to solve the data-centricity problem. Established retailers tend to have fragmented and siloed data both in-store and online, which needs to be combined in real time to facilitate omnichannel experiences. Consider Marks & Spencer's loyalty program, which was part of a key strategic initiative to increase customer retention and drive multichannel sales. This required a data-centric solution to gain deep insight into customer behavior. As data size and traffic grew, the legacy solution couldn't scale. The company addressed this problem by re-platforming the Sparks mobile application backend onto MongoDB Atlas, a high-performance data platform capable of expanding vertically and horizontally to deal with the heavy read/write throughput of a data-driven enterprise. The Sparks customer mobile app caters to more than 8 million unique customers and can calculate more than 15 million unique offers a day. The flexibility of the document model allowed the company to respond to market trends or new user behavior and update its analytical framework accordingly. Taking advantage of the platform's translytical capabilities, business teams could classify and track customers, products, content, and promotions across any stage of the value chain, unlocking new revenue streams, all in real time.
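The document-model flexibility just described might look like this in practice: a customer document carrying embedded offers of different shapes, with no migration needed when a new offer type appears. All names and values below are invented for illustration and are not M&S's actual schema:

```python
# A loyalty-program customer with embedded offers. The two offers have
# different shapes ("percent" vs. "venue") yet live in the same array;
# adding a new offer type requires no schema migration.
customer = {
    "_id": "cust-10234",
    "name": "A. Shopper",
    "channels": ["web", "mobile", "in-store"],
    "offers": [
        {"offerId": "of-1", "type": "discount", "percent": 10},
        {"offerId": "of-2", "type": "event", "venue": "Oxford Street"},
    ],
}
```

Because the offers travel inside the customer document, a single read serves the mobile app's personalization view, and a single index on, say, "offers.type" supports the analytical queries business teams run across the whole customer base.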
No matter the channel their customers engage with, the device they're browsing on, or their geographical location, Marks & Spencer is able to cater to its customers' needs and use data to keep improving what the brand has to offer.

Delivering a cohesive omnichannel retail brand experience

It has become increasingly difficult for retailers to deliver a consistent experience across multiple channels. Think of the complexity of capturing and serving the right data at the right time, with extensive product catalogs, complex and changing categorization, regional nuances and language challenges for a global footprint, diverse seasonal sales, promotions, and more. Because customers are engaging in "phygital" behavior, browsing the online product catalog while also walking through the store, enabling your workforce to answer customer questions becomes crucial to delivering the expected brand experience. Retailers are creating workforce-enablement mobile apps for complex store management operations or for browsing global inventory, which requires synced data to achieve a connected store. For a company like 7-Eleven, whose value proposition is "Be the first choice for convenience. Anytime. Anywhere," enabling its workforce became critical to maintaining brand value. Using an omnichannel approach, 7-Eleven deployed a custom mobile device using MongoDB Realm, MongoDB's unified mobile platform, to manage its in-store inventory system. Leveraging the power of Atlas Device Sync, MongoDB's mobile database service for syncing data across devices, users, and backends, 7-Eleven's front-line staff can start using devices immediately, without waiting minutes to download the data on initial startup, increasing data accuracy, especially around real-time stock management.
Easy access to correct, real-time product information boosts 7-Eleven's convenience-centered offering and secures a cohesive brand experience across both digital and brick-and-mortar stores. Brand cohesion depends on efficient order management and visual merchandising. As customers window shop, employees need to analyze real-time stock data from the retailer's supply chain as it passes from the warehouse to in-store processes like stock delivery, visual merchandising, and product returns management, to optimize store operations and create seamless experiences. Imagine the case of a fashion retailer with data gathered from RFID product tags scanned by mobile devices in the store. It can reduce total operating costs through optimized order processing, enabling logistics managers to discover pain points in the supply chain, forecast demand, and avoid stock-outs thanks to real-time triggers and alerts with auto-replenishment capabilities. Retailers can also optimize in-store merchandising based on customer shopping behavior data gathered from garment SKUs scanned on store racks or in fitting rooms with beacon-like RFID scanners. By viewing the movement of items and measuring, for example, how many times items are tried on relative to how often they're purchased, companies can understand how product assortment and movement affect purchase intent, and relocate clothes based on that information. Thanks to MongoDB's real-time architecture, combined with Kafka managing the event streaming and MongoDB Realm providing a simple, fully integrated way to sync real-time inventory data to MongoDB Atlas, companies gain a deeper understanding of customer behavior as a competitive advantage and a reduction in total operational costs.
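Real-time triggers and alerts of the kind described above are typically built on MongoDB change streams. Below is a sketch of the filter pipeline an application might pass to watch() so it is notified only when a stock update drops below a threshold; the "inventory" collection and "stock" field name are assumptions for illustration:

```python
LOW_STOCK_THRESHOLD = 5

# Pipeline passed to collection.watch(): only update events where the
# changed "stock" value fell below the threshold reach the application.
watch_pipeline = [
    {"$match": {
        "operationType": "update",
        "updateDescription.updatedFields.stock": {"$lt": LOW_STOCK_THRESHOLD},
    }},
]
# With a live replica set:
#   for change in db.inventory.watch(watch_pipeline):
#       # kick off auto-replenishment for the affected SKU
#       handle_low_stock(change["documentKey"]["_id"])
```

Filtering server-side like this means the auto-replenishment logic only wakes up for the events it cares about, instead of scanning every inventory write.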
Data collection and its associated change events often occur in variable-latency, low-network-availability scenarios, such as warehouses, delivery trucks, or store buildings, which creates the third challenge we address next: network consistency.

Dealing with network consistency issues

Networks with variable latency across store floors, whether due to server distance, applications dealing with heavy content, or simply network congestion over sales periods, can generate the unwanted byproduct of data inconsistencies. Apps can restart or shut down at any time due to bugs, excessive memory use, or other apps working in the background. To address these issues, businesses need an on-device database with offline synchronization, an offline-first approach. The luxury market in particular expects perfection, where an item might range from $20,000 to more than $100,000. Customers expect more than just a purchase; they expect personalized experiences. One of our customers in luxury retail regularly holds pop-up events, often in destinations without network signal, which creates the need for a reliable mobile app with access to customer data, like purchase histories, to provide that personalized experience. For example, providing fast and easy check-in for customers at these events is critical to their experience-centric business. Thanks to MongoDB Realm, their event tablets always work, even when internet connections are poor. Capitalizing on MongoDB Realm's local data persistence, which stores user data on the device, and combining it with Atlas Device Sync, the retailer has a top-performing mobile app that keeps working offline, storing user data locally and syncing it back to the database once connectivity is restored. This approach allows the company to build an uninterrupted, unified 360-degree customer management platform spanning web and mobile touchpoints, with relevant data always up to date.
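Conceptually, the offline-first pattern behind Realm and Atlas Device Sync works like a local store plus a pending-change queue that flushes when connectivity returns. The toy sketch below illustrates that idea only; it is not the Realm API, and real sync also handles conflict resolution, which this sketch omits:

```python
class OfflineFirstStore:
    """Toy illustration of offline-first writes: persist locally first,
    queue the change, and drain the queue when the network is back."""

    def __init__(self):
        self.local = {}    # on-device data (the role Realm plays)
        self.pending = []  # changes awaiting sync to the backend
        self.online = False

    def write(self, key, value):
        self.local[key] = value           # reads always hit local data
        self.pending.append((key, value))
        if self.online:
            self.flush()

    def flush(self):
        while self.pending:
            key, _value = self.pending.pop(0)
            # In a real app this would be a network call to the backend.
            print(f"synced {key}")

store = OfflineFirstStore()
store.write("checkin:cust-10234", {"status": "arrived"})  # works offline
store.online = True
store.flush()  # drains the queue once connectivity is restored
```

The key property is that writes succeed and reads stay consistent on-device regardless of connectivity, which is exactly what the pop-up-event tablets described above need.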
Figure: Bi-directional device-to-cloud sync, with the Realm database and Atlas Device Sync working together.

Conclusion

Mobile experiences are crucial in today's retail landscape, but integrating them can be challenging. By leveraging mobile channels, retailers can differentiate their brands, increase customer loyalty, and expand their reach in a fiercely competitive industry, staying ahead of the game and thriving in the omnichannel era. Learn how to quickly launch and scale secure mobile apps on our MongoDB for Mobile site. Want to learn more about modernizing retail experiences for your customers? Join one of two webinar sessions on May 17: European Time Zone | Americas Time Zone