MongoDB Applied

Customer stories, use cases, and experiences

Utilizing AI and MongoDB to Power Content Personalization

Content is everywhere. Whatever a consumer is looking for, and whatever industry that person might be in, it’s not hard to find content. The problem, however, is finding the right content. That’s where Concured comes into play. Concured, an AI startup based in Montreal, helps marketing teams align their website and sales content to their audiences, and helps content marketing teams differentiate themselves and accelerate insight-driven content personalization. Concured was founded in 2015 by CEO Tom Salvat and built for content marketers to better understand their audience's interests in order to deliver more impactful content.

Built with MongoDB spoke with CTO Tom Wilson, who joined Concured roughly a year after it was founded. We discussed Concured’s use of artificial intelligence, how Wilson joined the company, and what he sees as its future.

Built with MongoDB: What does Concured do?

Tom Wilson: Concured is a software company that leverages artificial intelligence techniques developed in the last five to 10 years. It is designed to help marketers know what to write about on a per-topic basis, and to see what is working well in their own content, in their competitors’ content, and within their industry. Finally, it maximizes the return on investment they make on any content they create by personalizing the web visitor’s journey through the client’s website. Concured has developed a recommender system that is personalized to the individual visitor. It is a privacy-friendly approach that doesn’t use third-party cookies or spy on the user; it is based purely on the web visitor’s behavior on the site, and it builds up an interest profile on that visitor as they click through successive web pages. As it builds a more precise picture of the interests and intent of the user, it recommends content for them to read next, whether it’s a blog post, a product description, or any other kind of content.

Built with MongoDB: You mentioned artificial intelligence. How does Concured use artificial intelligence behind the scenes?

Wilson: We use artificial intelligence in a number of places. One of our unique selling propositions is that, unlike other personalization systems, Concured doesn’t require a lengthy integration period, nor does it require ongoing maintenance on the client’s side. Our approach is to scrape the content of the client’s website using AI-powered bots to find the content that is relevant, to extract the text, the title, and any other relevant metadata, and to index it all automatically. Our system leverages recent advances in natural language processing (NLP) to generate semantic metadata for each document, corresponding to a unique point within a multi-dimensional space. Another aspect is to understand the key entities in each document, and also to understand the relationship of that particular article to the others within the same website. We create a lot of metadata automatically on each piece of content that we find using AI-powered web crawlers and scrapers.

Built with MongoDB: Since AI isn’t always 100% accurate, what is the accuracy of the NLP that you’ve created at Concured?

Wilson: With recommender systems, it’s very hard to say what the best recommendation is, because it could be very different for the same person depending on the day, or the web session.
If you think of some of the most famous recommender systems, such as Netflix, Amazon, or Spotify, they try to show us what we want to see next, but there’s actually no single right answer. Because of that, it’s very hard to measure performance. The approach we use is not to say that there is a 100% correct answer, but rather to ask: if we make changes to the algorithms, do visitors on these websites click on more articles, and do they reach the target pages defined by the website’s owner, e.g., a product page or a sign-up form? The higher the proportion of people performing that action at the end of their journey, relative to the number of people who arrived on the website, the more successful the recommender system is, and we can compare the success rate our clients see on their websites before and after they activate Concured’s personalization system. So far we’re seeing a two to three times uplift, and the algorithms are improving all the time.

Built with MongoDB: When did you join the team at Concured?

Wilson: At a time when the company had raised its first serious money from external investors, and one of the requirements was that they bring on a professional CTO. This often happens in early-stage companies: the investors want to impose some structure, they want to know that their money is going to be spent wisely, with a little less shooting from the hip. In some companies they joke about bringing in adult supervision. I don’t know if I’d say that about my role, since the team was already nicely established, but I was able to provide a lot more structure to ensure that we would meet our next milestones, as well as longer-term strategic planning and a technical vision.

Built with MongoDB: How did your team decide to build with MongoDB?

Wilson: The team that I joined was already using MongoDB at the time. Within a few months of my joining, there was some discussion about whether to move to a structured database. That was a decision that had to be made, so that’s where I got involved, and it became a conscious decision not to move away from MongoDB. It was the right fit for what we wanted going forward, and we absolutely made the right decision. We are also going to move from our Community edition hosted on Google Cloud Platform to MongoDB Atlas Serverless. We’ll be happy not to manage machines any more, thanks to the serverless aspect, and we’re also excited to try out the text search features available on Atlas, since this could potentially simplify our tech stack. For where we are as a business today, and where we want to be in five years, MongoDB is and will continue to be the right choice.

Built with MongoDB: What does the future of Concured look like?

Wilson: The future is being written as we speak. It’s bringing on more clients with needs similar to those of our largest clients today, namely enterprises that have a huge amount of content already in their archives that they would like to keep squeezing value out of, and that still publish a lot. Whether they’re big companies, such as consultancies or financial services firms, or traditional publishers, we want to make sure they’re promoting the right content, the content that will have the biggest uplift in whatever KPIs they’re measuring.

Built with MongoDB: What is the best piece of feedback that you’ve received?

Wilson: One nice piece of feedback I’ve received from my team is that I’ve always been there for them.
If they have a problem, I will either fix it or remove obstacles for them so they can do the best that they can. It’s part of my philosophy that if you take care of the team, a lot of other things will take care of themselves.

For any business that invests in content as part of its marketing strategy, it only makes good business sense to maximize the return on that investment. Learn more about how to turn readers into customers via Content Personalization on Concured’s website. Interested in learning more about MongoDB for Startups? Learn more about us here.

January 19, 2022
Applied

Manufacturing at Scale: MongoDB & IIoT

In recent years, we’ve seen a massive shift in digital transformation. As people, we’re all “connected” by our smart devices, smart homes, smart cars, smart cities, and so on. We interact with smart devices because of the convenience they offer us: automating daily tasks and giving insight into daily movements, from how much pressure is in each car tire to whether we left the stove on before leaving the house. The reasons we use smart devices are mirrored in why businesses are adopting IoT and IIoT (Industrial IoT) on a much larger scale: convenience, insight, predictive maintenance, automation, and production efficiency.

IoT is becoming increasingly critical in manufacturing and engineering, connecting thousands of sensors and actuators in the processes before, during, and after fabrication. The implementation of IoT within manufacturing processes, from raw materials to finished or smart products, has only just begun and is destined to evolve into a key differentiator for successful manufacturing companies throughout the entire supply chain.

The digital transformation in IIoT comes down to data at its core: how data is generated, stored, and analyzed. IIoT requires data to be collected and processed at massive volumes in real or near-real time to provide accurate, live business insights for better decision making. Shop floors are continuously optimized as new components, sensors, and actuators are introduced to improve OEE (Overall Equipment Effectiveness), increase quality, and reduce waste. With almost every new device, additional data variety is introduced, which requires a flexible, highly available, and scalable data platform to store and analyze this data. Furthermore, with the increasing convergence of IT and OT, even more diverse data needs to be integrated and processed, adding further complexity to the picture. MongoDB’s general-purpose data platform allows manufacturers to store OT and time series data sets in MongoDB together with recipes or digital twins for a complete real-time, end-to-end picture from edge to cloud, and onto mobile devices for insights and control anytime and anywhere, online or offline.

A connected factory model

The Industry Solutions team at MongoDB set out to demonstrate how easily MongoDB can be integrated to solve the digital transformation challenges of IIoT in manufacturing with its flexible, scalable application data platform. Using a small-scale model of a smart fabrication factory from Fischertechnik, the team collects and sends data via MQTT and processes it in MongoDB Atlas and Realm. Similar to a full-scale physical factory, the smart factory model demonstrates how easily IIoT use cases can be built on top of MongoDB’s application data platform to enable and accelerate the digitalization of manufacturing processes in any industry. The Fischertechnik model factory is often used to train engineering students on what a manufacturing facility looks like, as well as by manufacturing companies to plan the very real setup, construction, and investment of their factories. So, what initially looks like a toy robotics kit gets serious quite quickly.

The model factory operates as the foundation of an IIoT use case. It is made up of several components: a warehouse, a multi-processing station, and a sorting area. The warehouse is where raw material is stacked and stored; when triggered, the raw material is retrieved and moved to processing by a mobile crane.
From there, the items are sorted by color (red, white, or blue) and sent to the correct outbound destination. The process covers everything from the ordering and storing of raw material to the ordering and manufacturing of end products. Throughout these processes, multiple sensors detect the type and color of the items, as well as environmental aspects such as temperature and how much inventory is in stock. A surveillance camera detects motion and sends alerts, including photos, via MQTT. This simulates the wide variety of data a smart factory would emit in real time for track and trace, monitoring and visualization, alerting, and as input for machine learning algorithms.

The factory's infrastructure

Out of the box, the Fischertechnik factory comes with a few dashboards connected to the Fischertechnik cloud, established via a WLAN router integrated in the factory. These dashboards include a:

Customer View: A webshop interface, where a product can be ordered to trigger the supply chain processing

Supplier View: A visualization and display of the ordering process of raw material

Production View: A visualization of the factory status, production process, and sensor values from the camera and NFC/RFID readers

To emphasize and explain how MongoDB can be leveraged in this picture, the MongoDB team developed additional apps, using JavaScript, ReactJS, and Realm, to integrate and streamline data flows and processes on top of the MongoDB data platform. This included a:

MongoDB Realm Order Portal: A ReactJS web application to order new products and track the progress of orders

Data Visualization: A visualization of the different data types collected in MongoDB, rendered via MongoDB Charts for insights

Alert Management App: A mobile app leveraging MongoDB Realm and Realm Sync for alert notification and management, offline and online

The machines of the factory are controlled by TXT controllers, Linux-based computers that use MQTT to communicate with each other and with the cloud-based applications. There are basically two types of data sent and received via MQTT: commands to trigger an action, and streams of event and time series sensor data. The main TXT controller runs an MQTT broker and replicates selected topics to a HiveMQ MQTT broker in the HiveMQ cloud. From there, a Redpanda Kafka container collects the data streams and inserts them into MongoDB. The data persisted in MongoDB is then visualized via MongoDB Charts for real-time insights.

Factory layout connected to data infrastructure

The MongoDB Order Portal uses the Realm Web SDK and the serverless GraphQL API. GraphQL is used to pull data from MongoDB Atlas, and the Web SDK is used to add new orders (insert new documents) into a MongoDB cluster. When a new order is inserted into the Atlas database, an Atlas trigger is executed, which sends an MQTT message directly to the HiveMQ MQTT broker; the HiveMQ broker then replicates the order to the factory for processing.

Sending data to the factory

Receiving data from the factory is just as simple. The factory provides a large amount of live data that can be streamed out of it. To receive the data, HiveMQ and Kafka are used. The factory has an MQTT broker, which is bridged to a cloud HiveMQ broker. From the HiveMQ broker, the data is moved into MongoDB Atlas using Kafka Connect with an MQTT source and a MongoDB sink connector.
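To make that last hop concrete, here is a minimal sketch of a Kafka Connect configuration for the MongoDB sink connector described above. The topic, database, collection, and connection details are placeholders of our own, not the actual settings used in the demo:

# Hypothetical MongoDB sink connector settings; names and URI are placeholders.
name=factory-mongodb-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=factory.sensors
connection.uri=mongodb+srv://<user>:<password>@<cluster>.mongodb.net
database=factory
collection=sensor_events
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

On the Atlas side, a collection holding this kind of sensor stream could be created as a time-series collection (a MongoDB 5.0 feature discussed later in this article); the field names here are again our own illustration:

// Create a time-series collection for sensor events (illustrative field names).
db.createCollection("sensor_events", {
  timeseries: { timeField: "ts", metaField: "sensor", granularity: "seconds" }
})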
Receiving data from the factory

MongoDB & IIoT

Digitalization in manufacturing means connecting IT and OT, mixing and meshing data from both domains, and providing access to people and algorithms for higher levels of automation, increased efficiency, and less waste. MongoDB’s application data platform is optimized for large varieties and volumes of data, with a powerful query language for better decision making. All easier said than done. However, Atlas helps solve these complex requirements with its ecosystem of functions, including:

Real-Time Analytics: As IIoT continues to boom with a surge of connected devices, limits are pushed each day by increased volumes of data. Atlas scales seamlessly, capable of ingesting enormous amounts of sensor and event data to support real-time analysis for catching critical events or changes as they happen.

Dynamic Scalability: MongoDB Atlas and Realm provide automated scalability, allowing you to start small and dynamically adapt your clusters to increasing or decreasing demand. Especially as sensor data gets colder over time, you can automatically offload cold data into object stores, such as S3, while maintaining the ability to query hot and cold data through a single API.

Time Series: MongoDB 5.0 supports time series data natively through optimized storage with clustered indexes and optimized time series query operators to analyze trends and identify anomalies quickly. The combination of time series data with other data structures, such as digital twin models, within the same data platform dramatically reduces complexity, development effort, and cost by avoiding additional technologies, ETL processes, and data duplication.

The MongoDB database can also be deployed next to the shop floor for data collection and analysis, making the shop floor independent of the cloud. Pre-aggregated or raw data can then be seamlessly replicated or streamed into the public cloud for global views across factories. Additionally, Realm, the serverless backend on top of MongoDB Atlas, provides easy application and system integration through REST (MongoDB Data API) and GraphQL APIs, as well as data synchronization with mobile devices for offline-first use cases such as workforce enablement.

Atlas, Realm, and IIoT

IIoT is an exciting realm (no pun intended) right now, with massive opportunity for growth and innovation. The next level of innovation requires a resilient, multi-functional data platform that reduces complexity, increases developer efficiency, and reduces data duplication and integration work while scaling elastically with demand. What the MongoDB team achieved by quickly syncing the smart model factory with Atlas and Realm and iterating on top of it only scratches the surface of the innovation we can support within manufacturing use cases. Learn more about MongoDB Atlas and Realm, and how major enterprises are using these solutions for their manufacturing and IIoT needs, here. This is the first of an IIoT series from MongoDB’s Industry Solutions team. Stay tuned for more to come!

January 19, 2022
Applied

Revolutionizing Data Storage and Analytics with MongoDB Atlas on Google Cloud and HCL

Every organization requires data they can trust, and access, regardless of its format, size, or location. The rapid pace of change in technology and the shift toward cloud computing are revolutionizing how companies handle, govern, and manage their data by freeing them from the heavy operational burden of on-premises deployments. Enterprises are looking for a centralized, cost-effective solution that allows them to scale their storage and analytics so they can ingest data and perform artificial intelligence (AI) and machine learning (ML) operations, ultimately expanding their marketing horizon. This blog post explores why companies should pair MongoDB Atlas with Google Cloud to begin their data revolution journey, and how HCL Technologies can support customers looking to migrate.

MongoDB Atlas as the distributed data platform

MongoDB Atlas is the leading database-as-a-service on the market for three main reasons:

Unparalleled developer experience: allows organizations to bring new features to market at high velocity

Horizontal scalability: supports hundreds of terabytes of data with sub-second queries

Flexibility: stores data to meet various regulatory, operational, and high availability requirements

The versatility offered by MongoDB’s document model makes it ideal for modern data-driven use cases that require support for structured, semi-structured, and unstructured content, all within a single platform. Its flexible schema allows changes to support new application features without the costly schema migrations typically required with relational databases. MongoDB Atlas extends the core database with services like Atlas Search and MongoDB Realm that are a necessity for modern applications. Atlas Search provides a powerful Apache Lucene-based full-text search engine that automatically indexes data in your MongoDB database, without the need for a separate dedicated search engine or error-prone replication processes. Realm provides edge-to-cloud sync and backend services to accelerate and simplify mobile and web development. Atlas’ distributed architecture supports horizontal scaling for data volume, query latency, and query throughput, offering the scalability benefits of distributed data storage alongside the rich functionality of a fully featured, general-purpose database. MongoDB Atlas is unique in its ability to provide developers’ most-wanted database as a managed service, and it is relied on by the world’s largest companies for their mission-critical production applications.

Innovation powered by collaboration with HCL Technologies

MongoDB’s versatility as a general-purpose database, in addition to its massive scalability, makes it a perfect foundation for analytics, visualization, and AI/ML applications on Google Cloud. As an MSP partner for Google Cloud, HCL Technologies helps enterprises accelerate and de-risk their digital agenda, powered by Google Cloud. We’ve successfully implemented applications leveraging MongoDB Atlas on Google Cloud, building upon MongoDB’s flexible JSON-like data model, rich querying and indexing, and elastic scalability in conjunction with Google Cloud’s class-leading cloud infrastructure, data analytics, and machine learning capabilities. HCL is working with some of the world’s largest enterprises to build secure, performant, and cost-effective solutions with MongoDB and Google.
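Picking up the Atlas Search capability mentioned above, here is a minimal query sketch. The collection, field names, and search terms are our own illustration, and the query assumes a default Atlas Search index exists on the collection:

// Full-text search via the $search aggregation stage (requires an Atlas Search index).
db.products.aggregate([
  { $search: { text: { query: "wireless headphones", path: "description" } } },
  { $limit: 10 },
  { $project: { name: 1, description: 1 } }
])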
Possessing technical expertise in Google Cloud, MongoDB, machine learning, and data science, our dedicated team developed a reference architecture that ensures high performance and scalability. This is simplified by MongoDB Atlas’ support for Google Cloud services, which allows it to operate essentially as a cloud-native solution. Highlighted features include:

Integration with Google Cloud Key Management Service

Use of Google Cloud’s native storage snapshots for fast backup and restore

The ability to create read-only MongoDB nodes in Google Cloud to reduce latency to Google Cloud-native services, regardless of where the primary node is located (even on other public cloud providers!)

Integrated billing with Google Cloud

The ability to span a single MongoDB cluster across Google Cloud regions worldwide, and more

As represented in Figure 1 below, MongoDB Atlas on Google Cloud can be used as a single database solution for transactional, operational, and analytical workloads across a variety of use cases.

Figure 1: MongoDB's core characteristics and features

The architecture in Figure 2 demonstrates the ease of reading and writing data to MongoDB from Google Cloud services. Dataflow, Cloud Data Fusion, and Dataproc can be leveraged to build data pipelines that migrate data from heterogeneous databases to MongoDB and feed data into interactive dashboards built with Looker. These data pipelines support both batch and real-time ingestion workloads and can be automated and orchestrated using Google Cloud-native services.

Figure 2: MongoDB Atlas' integration with core Google Cloud services

A data platform built using MongoDB Atlas and Google Cloud offers an integrated suite of services for storage, analysis, and visualization.

Address your business challenges with HCL: Industry use cases

Data-driven solutions built with MongoDB Atlas on Google Cloud have applications across industries such as financial services, media and entertainment, healthcare, oil and gas, energy, manufacturing, retail, and the public sector. Every industry can benefit from this highly integrated storage and analytical solution.

Use Cases and Benefits

Data lake modernization with low cost and high availability for media and entertainment customers: Maintaining a highly available, low-cost data lake is an obstacle for any online entertainment platform that builds mobile or web ticketing applications. Building on Google App Engine with MongoDB Atlas clusters in the backend allows for a high-availability, low-cost data platform that seamlessly feeds data to downstream analytics platforms in real time.

Unified data platform for retail customers: The retail business frequently requests an agile environment in order to encourage innovation among its engineers. With its agility in scaling and resource management, seamless multi-region clusters, and premium monitoring, MongoDB Atlas on Google Cloud is a fantastic choice for building a single data platform. This simplifies the management of different data platforms and allows developers to focus on new ideas.

High-speed, real-time data platform for manufacturing supply chain systems: With real-time visibility and distributed data services, supply chain data can become a competitive advantage. MongoDB Atlas on Google Cloud provides a solid foundation for creating distributed data services with a unified, easy-to-maintain architecture. The speed of MongoDB Atlas simplifies supply chain operations with real-time data analytics.
The way forward

Even in just the past decade, organizations have been forced to adapt to the extremely fast pace of innovation in the data analytics landscape: moving from batch to real time, on-premises to cloud, and gigabytes to petabytes, with advanced AI/ML models made more accessible thanks to providers like Google Cloud. With our track record of success in this domain, HCL Technologies is uniquely positioned to help organizations realize the joint benefits of building data analytics applications with best-of-breed solutions from Google Cloud and MongoDB. Visit us to learn more about the HCL Google Ecosystem Business Unit and how we can help you harness the power of MongoDB Atlas and Google Cloud Platform to change the way you store and analyze your data.

January 13, 2022
Applied

Retail Tech in 2022: Predictions for What's on the Horizon

If 2020 and 2021 were all about adjusting to the Covid-19 pandemic, 2022 will be about finding a way to be successful in this “new normal”. So what should retailers expect in the upcoming year, and where should you consider making new retail technology investments?

Omnichannel is still going strong

Who would have anticipated the Covid-19 pandemic would still be disrupting lives after two years? For the retail industry, this means more of the same: omnichannel shopping. Despite the hope many of us had for the end of the pandemic and the gradual increase of in-person shopping, retailers can expect to continue accommodating all kinds of shopping experiences: online shopping, brick-and-mortar shopping, buy online and pick up in store, reserve online and pick up in store. Even beyond the pandemic, the face of shopping is likely forever changed. This means retailers need to start considering the long-term tech investments required to meet transforming customer expectations. Adopting solutions that offer a single view of the consumer gives you the unique opportunity to personalize offerings, products, and loyalty programs to consumer demand. With a superior consumer experience, you can achieve repeat business and increased customer loyalty. While many retailers may have thought they could “get by” with their current solutions until the pandemic ends, it’s time to rethink that approach and start exploring longer-term solutions that improve omnichannel shopping experiences.

Leaner tech stacks over many specialized solutions

In 2022, you should explore solutions that allow your IT teams to do more with less. The typical retail tech stack looks something like the diagram below: legacy relational databases supplemented by other specialist NoSQL and relational databases, plus additional mobile data and analytics platforms. As a result, retailers looking to respond quickly to changing consumer preferences and improve the customer experience face an uphill battle against siloed data, slow data processing, and unnecessary complexity. Your development teams are so busy cobbling solutions together and maintaining different technologies at once that they fail to innovate to their full potential, so you’re never quite able to pull ahead of the competition. This is the data innovation recurring tax (or DIRT): the ongoing tax on innovation that spaghetti architectures like the example above cost your business. As technology grows more sophisticated and data grows more complex, companies are expected to react almost instantaneously to signals from their data. Legacy technologies, like relational databases, are rigid, inefficient, and hard to adapt, making it difficult to deliver true innovation to your customers and employees in a timely manner. It’s time to rethink your legacy systems and adopt solutions that streamline operations and seamlessly share data, ensuring you’re working with a single source of data truth. Many retailers recognize the need to upgrade legacy solutions and move away from multiple different database technologies, but you may not know where to start. Look for modern data platforms that simplify data collection from disparate sources and include automated conflict resolution for added data reliability.
Also, consider what you could do with fully managed application data platforms, like MongoDB Atlas. With someone else doing the admin work, your developers are free to focus on critical work or turn their talents to innovation.

Digital worker enablement will increase retention

For employees, 2022 looks set to continue last year’s trend of the “Great Resignation”. To combat worker fatigue and retain your workforce, you need to prioritize worker engagement. One way to better engage your employees is through mobile workforce enablement. While many companies consider how to engage their customers through digital experiences, you shouldn’t forget about your workers in the process. Global companies like Walmart are starting to invest in mobile apps to enable their workforce. A modern, always-on retail workforce enablement app could transform the way your employees do their jobs. Features like a real-time view of stock, cross-departmental collaboration, detailed product information, and instant communication with other stores can simplify your workers’ experiences and help them better serve your customers. Your workers need an always-on app that syncs with your single source of data truth regardless of connectivity (which may be an issue, as retail workers are constantly on the move). But building a mobile app with data sync capabilities can be a costly and time-intensive investment. MongoDB Realm Sync solves this with an intuitive, object-oriented data model that is simple to use, and an out-of-the-box data synchronization service. When your mobile data seamlessly integrates with back-end systems, you can deliver a modern, distributed application data platform to your workers.

Huge investment in the supply chain

From microchips to toilet paper, disruptions in the supply chain were a huge issue in 2020 and 2021, and the supply chain pain continues in 2022. While some supply chain issues remain beyond the control of retailers, there are steps that can be taken to mitigate the pain and prepare for future disruptions. Warehouse tech is getting smarter, and you need to upgrade your solutions to keep up. For starters, consider adopting the right application data platform to unify siloed data and gain a single view of operations. A single view of your data allows for better management of store-level demand forecasts, distribution center-to-store network optimizations, vendor ordering, truck load optimizations, and much more. With a modern application data platform, all this data feeds into one single-view application, giving retailers the insights to react to supply chain issues in real time. With disruption set to dominate 2022, as it did in 2020 and 2021, investing in proactive solution upgrades could help your business not only survive, but thrive. Want to learn more about gaining a competitive advantage in the retail industry? Get this free white paper on retail modernization.

January 13, 2022
Applied

Why Telcos Implement TM Forum Open APIs with MongoDB

How MongoDB speeds up development of new TM Forum services

In the evolving and increasingly complex telecommunications industry, providers are turning to open digital architectures to enable interoperability and manage new digital offerings. TM Forum (TMF), an alliance of more than 850 companies, accelerates digital innovation through its TMF Open APIs, which provide a standard interface for the exchange of different telco data models. The use of TMF Open APIs ranges from providers of off-the-shelf software to proprietary developments at the largest telecommunications providers. In working with many of the world’s largest communication service providers (CSPs) and the related software provider ecosystem, MongoDB has seen a significant number of organizations leveraging these emerging standards. By exposing common interfaces, CSPs are able to adopt a modular architecture made up of best-of-breed components (internally or externally developed) while minimizing the time, effort, and cost required to integrate them.

“MongoDB’s document model technology has given CSPs the ability to be more agile and more innovative at the same time, which aligns perfectly with the mission of TM Forum Open APIs,” said George Glass, TM Forum Chief Technology Officer. “We’re delighted to see MongoDB partnering with developers to deliver TM Forum-compliant microservices in days instead of weeks or months.”

TMF Open APIs empower CSPs to build new microservices in days, not weeks or months

The MongoDB document model allows developers to work with data in a natural way and store data as it is retrieved by the application. In the context of TMF Open APIs, this means that the TMF resources of an API can be persisted 1:1 in the database, without the need for additional mappings. This is highlighted in the example below, which demonstrates how a portion of the TMF666 (Account Management) resource model can be simply and intuitively implemented as a MongoDB document, as opposed to traditional relational structures (relational diagram omitted):

{
  "_id": "3df04c97-51c7-4cb4-817e-1bd86eb2e2b3",
  "name": "John Doe",
  "state": "verified",
  "accountType": "B2B",
  "description": "Special Customer",
  "accountBalance": [
    {
      "amount": {
        "unit": "euro",
        "value": NumberDecimal("89.98")
      },
      "balanceType": "main",
      "validFor": {
        "endDateTime": ISODate("2021-09-01"),
        "startDateTime": ISODate("2021-07-01")
      }
    }
  ],
  "accountRelationship": [...],
  "contact": [...],
  "creditLimit": {
    "unit": "euro",
    "value": 500
  },
  "lastModified": ISODate("2021-08-31T16:28:41.111Z")
}

When new versions of the specifications are released, or requirements demand that the data model be extended with custom-defined data, the new attributes can be implemented in the application without having to spend time and effort changing database tables and constraints. This flexibility allows CSPs to achieve the agility promise of TMF Open APIs in practice. Development teams can rapidly develop TMF-compliant microservices because there is no need to model TMF entities in a relational database.

Together with Tecnotree, a Finnish provider of full-stack digital BSS, MongoDB is helping build broader access for telecommunications development teams in charge of launching new microservices. Tecnotree recently earned the Platinum Badge for Open APIs from TM Forum and was among the first in the telecom industry to adopt, mature, and stabilize open-source principles for digital business support systems (BSS).
“As we continue to support telecom operators’ customer-first approach to drive growth in the enterprise market, the partnership with MongoDB is helping us push aside some of the biggest technological obstacles faced by operators globally,” said Sajan Joy Thomas, VP, Product Office, Tecnotree Oyj.

Query data flexibly using TMF specifications

With this flexibility comes the ability to query the data in order to support a variety of defined access patterns. This is demonstrated by the FinancialAccount resource in the TMF666 specification (account management API). The GET operation contains a parameter to specify which fields should be returned (a projection) and iterates through the result set via 'offset' and 'limit' parameters. As the specification evolves, the MongoDB query language and rich indexing capabilities will allow complex filter parameters to be easily supported. The GET operation referred to above can be implemented using MongoDB’s aggregation framework with the following query (returning, for example, only the accountType and state fields):

db.collection.aggregate([
  { $skip: 50 },
  { $limit: 10 },
  { $project: { "accountType": 1, "state": 1 } }
])

Adding additional filtering is as simple as including an additional $match stage as the first stage in the pipeline (with appropriate index support). These aggregation queries can easily be represented in an idiomatic way using any of the supported drivers. The following example shows how the above query can be written using the Java driver. No error-prone query string is needed, nor an additional object-relational mapping layer to translate the result from MongoDB objects into Java objects.

public List<FinancialAccount> getFinancialAccounts(String fields, Integer skip, Integer limit) {
    [...]
    var projectStage = createProjectStageFromFieldList(fields);
    var pipeline = new ArrayList<Bson>();
    pipeline.add(skip(skip));
    pipeline.add(limit(limit));
    if (projectStage != null) {
        pipeline.add(projectStage);
    }
    collection.aggregate(pipeline).iterator().forEachRemaining(accounts::add);
    return accounts;
}

GitHub: https://github.com/mongodb-industry-solutions/tmforum-openapi-example

Aside from some additional boilerplate code, this is the only code needed to get a basic working API. TMF provides the API as a Swagger specification, which can be used to generate the resource and API classes automatically. These resource classes can be passed directly to the MongoDB driver, which will handle the translation into MongoDB data types. The easiest way to access the Swagger file and get started is to download the specification from the TMF Open API resources page. In the linked GitHub project, a Maven plugin is used to generate the code from the Swagger file, but this can also be done manually with the Swagger Codegen CLI, depending on what fits better into the development workflow:

java -jar swagger-codegen-cli-3.0.30.jar generate \
  -i ./TMF666-Account-v4.0.0.swagger.json \
  -l java \
  -o ./client/java

In most applications, more is needed than a functioning external API. Business processes, such as customer creation, may require that multiple operations on resources be treated as an atomic transaction. MongoDB supports distributed transactions at global scale to fulfil this requirement at the database level. That said, with the power of the document model, transactions are often not needed in cases where a relational database would need them. Take a look at the FinancialAccount again.
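Updating the account’s balance and credit limit together, for example, touches just one document. A minimal sketch in the MongoDB shell, reusing the document from the earlier example (the new values are invented for illustration):

db.financialAccount.updateOne(
  { "_id": "3df04c97-51c7-4cb4-817e-1bd86eb2e2b3" },
  {
    // All of these changes land in one atomic write on a single document.
    $set: {
      "creditLimit.value": 1000,
      "accountBalance.0.amount.value": NumberDecimal("99.98"),
      "lastModified": new Date()
    }
  }
)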
In a relational schema, updating all of that related information in one atomic operation would require a transaction, whereas in MongoDB it’s just one update operation on one document. This leads to an overall reduction in I/O and better application performance. These advantages, combined with industry-leading scale and resilience capabilities, drive CSPs to implement TMF APIs with MongoDB as the primary data source; indeed, many of the example implementations provided by TM Forum use MongoDB for data persistence.

More resources for telecommunications IT professionals:

[Case study] How Verizon built an edge architecture and manages edge applications

[Video] Establishing a simpler, cloud-based database model to benefit internal operations

[Solution brief] MongoDB for TM Forum Open APIs

January 10, 2022
Applied

Data and the European Landscape: 3 Trends for 2022

The past two years have brought massive changes for IT leaders: large and complex cloud migrations; unprecedented numbers of people suddenly working, shopping, and learning from home; and a burst in demand for digital-first experiences. Like everyone else, we are hoping that 2022 isn’t so disruptive (fingers crossed!), but our customer conversations in Europe lead us to believe the new year will bring new business priorities. We’re already noticing changes in conversations around vendor lock-in, thanks to the Digital Markets Act; a new enthusiasm for combining operational and analytical data to drive new insights faster; and a more strategic embrace of sustainability. Here’s how we see these trends playing out in 2022.

Digital Markets Act draws new attention to cloud vendor lock-in in Europe

We’ve heard plenty about the European Commission’s Digital Markets Act, which, in the name of ensuring fair and open digital markets, would place new restrictions on companies deemed to be digital “gatekeepers” in the region. That discussion will be nothing compared to the vigorous debate we expect once the EU begins the very tricky political business of determining exactly which companies fall under the act. If the EU sets the bar for revenues, users, and market size high enough, it’s possible that the regulation will end up affecting only Facebook, Amazon, Google, Apple, and Microsoft. But a European group representing 2,500 CIOs and almost 700 organisations is now pushing to have the regulation encompass more software companies. Their main concern centers on “distorted competition” in cloud infrastructure services and a worry that companies are being locked into one cloud vendor. One trend that pushes back on cloud vendor lock-in, and that will likely grow in 2022, is the embrace of multi-cloud strategies. We expect to see more organisations in the region pursuing multi-cloud environments as a means to improve business continuity and agility whilst being able to access best-of-breed services from each cloud provider. As we have always said: “It’s fine to date your cloud provider, but don’t ever marry them.”

The convergence of operational and analytical data

The processing of operational and analytical data is almost always split across different data systems, each tuned to its use case and managed by separate teams. But because that data lives in separate places, it’s almost impossible for organisations to generate insights and automate actions in real time, against live data. We believe 2022 is the year we’ll see a critical mass of companies in the region make significant progress toward a convergence of their operational and analytical data. We’re already starting to see some of the principles of microservices in operational applications, such as domain ownership, applied to analytics as well. We’re hearing about this from many of our customers locally, who are looking at MongoDB as an application data platform that allows them to perform queries across both real-time and historical data, using a unified platform and a single query API. The applications they build become more intelligent and contextual for their users, while avoiding dependencies on centralized analytics teams that otherwise slow down how quickly new, data-driven experiences can be released.

Sustainability drives local strategic IT choice

Technology always has some environmental cost. Sometimes that’s obvious, such as the energy needs and emissions associated with Bitcoin mining.
More often, though, the environmental costs are well hidden. The European Green Deal commits the European Union to reducing emissions by 55% by 2030, with a focus on sustainable industry. With the U.N. Climate Change Conference (COP26) recently completed in Glasgow, and coming off the hottest European summer on record, climate issues have become top of mind. That means our customers are increasingly looking to make their technical operations more sustainable, including in their choice of cloud provider and data centers. According to research from IDC, more than 20% of CxOs say that sustainability is now important in selecting a strategic cloud service provider, and some 29% of CxOs are including sustainability in their RFPs for cloud services. Most interesting, 26% say they are willing to switch to providers with better sustainability credentials. Historically, it’s been difficult to make a switch like that. That’s part of the reason we built MongoDB Atlas: to give our customers the flexibility to run in any region, with any of the three largest cloud providers, to make it easy to switch between them, and even to run a single database cluster across them. Publicly available information about the footprint of individual regions and even single data centers will make it simpler for companies to make informed decisions. Already, at least one cloud platform has added indicators to regions with the lowest carbon footprint. So while we hope 2022 will not be as disruptive as the years gone by, it will still bring seminal changes to our industry. These changes will prompt organisations toward more agile, cohesive, and sustainable data platform strategies as they seek to gain competitive advantage and exceed customer expectations.

Source: IDC, European Customers Engage Services Providers at All Stages of Their Cloud Journey, IDC Survey Spotlight, Doc #EUR248484021, Dec 2021

December 21, 2021
Applied

Joyce, a Decentralized Approach to Foster Business Agility

Despite all of the tools and methodologies that have arisen in the last few years, many companies, particularly those that have been in the market for decades, struggle to leverage their operational data to build new digital products and services. According to research and surveys conducted by McKinsey over the last few years, the success rate of digital transformations is consistently low, with less than 30% succeeding at improving their company’s performance. There are a lot of reasons for this, but most of them can be summarized in a sentence: a digital transformation is primarily an organizational and cultural change, and only then a technological shift. The question is not whether digital transformation is a good thing, nor whether moving to the cloud is a good choice. Companies need digital transformation (badly, in some cases), and yes, the pros of moving to the cloud usually outweigh the cons. So, let’s dig deeper and analyze three of the main problems companies face on this journey.

Digital product development

Products by nature are customer-driven, but companies run their businesses on multiple back-end systems that are instead purpose-driven. Unless you run a very small business, different people with different objectives own those products and systems. Given this context, what happens when a company wants to launch a new digital product at speed? The back-end systems (CRMs, e-commerce, ERP, etc.) hold the data the product needs to bring to the customer. Some systems are SaaS, some are legacy, and perhaps others are custom applications created by the company that disrupted the market with innovative solutions back in the day: the perfect recipe for integration hell. The product manager needs to coordinate and negotiate multiple change requests with the systems’ owners whilst trying to convince them to add the product’s needs to their backlogs in time to meet the deadline. Things get even worse when the new product relies on the computational power of the source systems: if those systems cannot handle the additional traffic, both the product and the core services will be affected.

Third-party integration

“Everybody wants the change, (almost) nobody wants to change.” In this ever-growing digital world, partnering with third parties (whether they are clients or service providers) is crucial, but everyone who has tried knows how challenging it is: non-standard interfaces, CSV files over FTP with fancy update rules, security issues… the list of unwanted things can grow indefinitely.

SaaS everywhere

The software-as-a-service model is extremely popular, and getting the service you want without worrying about the underlying infrastructure gives freedom and speed of adoption. But what happens when a big company relies on multiple SaaS products to run its business? Sooner or later, it experiences loss of control and higher costs in keeping a consistent view of the big picture. It needs to deal with SaaS-internal representations of its own data, multiple views of the same domain concept, and unplanned expenses to export, interpret, and integrate data from different sources in different formats.

Putting it all together

All the issues above fall into a well-known category of information technology: they are integration problems, and over the years, a lot of vendors have promised a definitive solution. Now, you can consider low-code/no-code platforms with hundreds of ready-made connectors and modern graphical interfaces. Problem solved, right?
Well, not really. Low-code integration platforms simplify implementation, and they are really good at it, but in doing so they oversimplify the real challenge: creating and maintaining a consistent set of APIs shaped around business value over time, and preventing the interfaces from leaking internal complexities to the rest of the company. That challenge has to be addressed through architectural choices and proper skills, which are completely hidden behind the selling points of such platforms. There are two different ways to solve integration problems:

Centralized, using adapters. The logic is pushed to a central orchestration component, with integration managed through a set of adapters. This is the rather old-school SOA approach, the one that the majority of market integration platforms are built on.

Decentralized, pushing the logic to the edges, giving autonomous teams the freedom to define both the boundaries and the APIs that a domain must expose to deliver business value. This is a more modern approach that has arisen alongside the rise of microservices and, in the analytical world, the concept of the data mesh.

The former gives speed at the start and the illusion of reducing the number of choices and skills needed to manage the problems, but in the long run it inevitably accumulates technical debt. Due to the lack of the necessary degrees of freedom, you lose the ability to evolve the integration points over time, the same thing that caused the transition from SOA to microservices architectures. The latter needs the relevant skills, vision, and ability to execute, but gives immediate results and allows you to flexibly manage the evolution of the enterprise architecture over time.

Old problems, new solutions

At Sourcesense, over the last 20 years, we have partnered on hundreds of projects to bring agility, speed, and new open-source technology to our customers. Many times through the years, we faced the integration challenges above, and yes, we tried to solve them with the technology available at the time: we built some integration solutions on SOA (when it was the best of breed) and interacted with many of the integration platforms on the market. We have struggled with the issues and limitations of the integration landscape, and we have listened to our customers’ needs and seen where expectations have fallen short. The rise of agile methodologies, cloud computing, and new techniques, technologies, and architectural styles has given an unprecedented boost to software evolution and the ability to support business needs, so we embraced the new wave and now have growing experience in solving problems with these tools. Along the way, we noticed a recurring pattern in the integration problems we encountered, and the effectiveness of data hubs as enterprise architecture components for solving them, so we built one of our own: Joyce.

Data hubs

“Data hub” is a relatively new term and refers to software platforms that collect data from different sources with the main purpose of distribution and sharing. Since this definition is broad and vague, let’s add some other key elements that matter and help define the contours of our implementation. Collecting data from different sources can bring three major benefits:

1. Computational decoupling from the sources.
Pulling (or pushing) the data out of the originating systems means that client applications and services interact with the hub and not directly with the sources, preventing the sources from being slowed down by additional traffic.

2. Catalog and discoverability. If data is collected correctly, this leads to the creation of a catalog, allowing people inside the organization to search, discover, and use the data inside the hub.

3. Security. The main purpose of a hub is distribution and sharing, which leads immediately to a focus on access control and security hardening. A single access point simplifies the overall security around the data, because it significantly reduces the number of systems clients have to interact with to gather the data they need.

Joyce: how it works

The cornerstone concept of Joyce is the schema. It allows you to shape the ingested data and how this data will be made available to client services. Using the same declarative approach made popular by Kubernetes, the schemas describe the expected result, and the platform performs the actions to make it happen. Schemas are standard JSON Schema files stored and classified in a catalog. Their definitions fall into three categories:

Input: how to gather and shape the source data. We leverage the Kafka Connect framework to provide ready-made connectors for a wide variety of sources. The ingested data can be filtered, formatted, and enriched with transformation handlers (domain-specific extensions of JSON Schema).

Model: allows you to create new aggregates from the data stored in the platform. This feature gives you the freedom to model the data the way client services need it.

Export: bulk data export capability. An export can be any query run against the existing data, with an optional temporal filter.

Input and model data is made available to all client services with the proper authorization grants through auto-generated REST and GraphQL APIs. It is also possible to subscribe to a dedicated topic if an event-driven approach is more suitable for the use case.

MongoDB: the key to a flexible model and performance at scale

We rely heavily on MongoDB. Thanks to its flexibility, we can easily map any data structure the user defines to collect the data. Half of the schema definition is basically the definition of a MongoDB schema. (We also auto-generate one schema per collection to guarantee data integrity.) Joyce runs in a Kubernetes cluster, and all its services are inherently stateless to exploit the full potential of horizontal scaling. The architecture is based on the CQRS pattern, which means that writes and reads are completely decoupled and can scale independently to meet the unique needs of the production environment. MongoDB is also the backing database of the API layer, so we can keep the promise of low latency, high throughput, and continuous availability across all components of the stack. The platform is available as a fully managed PaaS on the three major cloud providers (AWS, Azure, GCP), but if needed, it can be installed on existing infrastructure, in the cloud or on premises.

Final considerations

There are many challenges leaders must face for a successful digital transformation. They need to guide their organizations along a process that involves changes on many levels. The exponential growth of technological solutions in the last few years adds more complexity and confusion.
The evolution of organizational models and methodologies points in the direction of shared responsibility, people empowerment, and autonomous teams with light and effective central governance. The same evolution also permeates novel approaches to enterprise architecture, like the data mesh. Unfortunately, there’s no silver bullet, just the right choices for a given context. Despite all the marketing and hype around this or that solution to all of your digital transformation needs, a successful long-term shift needs guidance, competence, and empowerment. We built Joyce to reduce the burden of repetitive tasks and boilerplate code, helping teams get results faster and catch the low-hanging fruit, without trying to replace the architectural thinking necessary to properly define the current state and the evolution of our customers’ enterprise architectures. If you’re struggling with the problems listed at the beginning of this article, you should give Joyce a try. Learn more about Joyce.

December 21, 2021
Applied

Simplifying Compliance with VComply & MongoDB

As businesses around the world face external pressure to focus more on privacy, security, and transparency, compliance management is needed now more than ever. With more than 200 regulatory updates to track, 900 regulatory agencies, and an average cost of $14 million per non-compliance incident, maintaining compliance is critical for every business, no matter its size. Tracking, maintaining, and proving compliance has traditionally been incredibly difficult, resource-intensive, and time-consuming. That's why one startup aims to simplify compliance by disrupting an antiquated industry. Enter VComply.

Founded in 2019, VComply is a governance, risk management, and compliance (GRC) platform that provides its customers with a secure and easy-to-use solution. VComply is highly configurable to meet the specific needs of any organization without additional coding or infrastructure changes. The platform collects, organizes, analyzes, and automatically reports on the GRC data entered into the system to provide a high-level view of an organization's compliance posture at any given time. Combined with the ability to surface detailed information on any control, VComply modernizes how people work and interact with GRC programs within their businesses.

In this week's #BuiltWithMongoDB, we take a look at VComply to learn how it helps organizations strengthen their risk and compliance management. We spoke with Harshvardhan Kariwala, CEO, and Ashish Jha, Vice President of Engineering at VComply, to discuss the company's journey and how they decided to build with MongoDB.

What inspired you to build the business?

Harshvardhan: VComply is actually my third startup. At one of my previous companies, I had become hyper-focused on building the business, and I eventually lost sight of compliance. Operational functions fell through the cracks, and I ended up outsourcing our compliance programs to a corporate firm in Singapore. Fast forward a bit: they forgot to do a required compliance filing, and we ended up paying the associated non-compliance fines. One reporting misstep, and we were fined. That's what got me worried; we got lucky that was all that happened. It only took one time to inspire action.

We then built an internal tool around the idea of creating a culture of reporting excellence and internal accountability. After adopting the tool, in 2018 we realized that we had built a very robust solution to real day-to-day compliance problems. We thought, "Why don't we spin this off into its own product?" By that time, I was ready to get back into product development, and this was the perfect opportunity. In early 2019 we set up VComply. We quickly got our first customer, the City of Boston, and never looked back.

So that's where VComply got its start. It was never meant to be sold as a product; it was an internal compliance tracking tool. That's how we entered the GRC space.

What exactly does VComply do? What are some of its most useful product features?

Harshvardhan: We help businesses stay compliant, mitigate risk, and adopt a culture of transparency. If there isn't internal alignment within a company, no tool is going to help. At its core, VComply is designed to be easy to use so that anyone in an organization can adopt a compliance-first mindset. By removing the traditional technological barriers, we found that businesses can realize the benefits quickly.
That said, VComply serves as the single source of truth for everything GRC within an organization: tracking compliance obligations, compliance monitoring, automating activities, alerts and follow-ups, compliance evidence collection, audit trails, and more. Another popular part of the tool is our enterprise risk management and policy management functionality. You can monitor and manage risk programs, quickly identify risks, and link compliance obligations to mitigate those risks.

What makes VComply stand out from its competitors?

Harshvardhan: Most other solutions on the market require a compliance expert, are hard to navigate, and demand a significant upfront time commitment to get up and running. We built VComply to be more practical and realistic about how people manage their compliance and risk programs today. VComply is easy to set up, simple for the end user, and flexible enough to map to the specific controls a business needs to comply with, without any additional coding.

How did you decide to build with MongoDB?

Ashish: Easy and intuitive search support, as well as indexing and automated performance suggestions, were the key drivers for us building with MongoDB. Also, training new developers is very straightforward.

What has your experience been like scaling with MongoDB?

Ashish: Scaling is pretty seamless with MongoDB. Setting up alerts and monitoring is very straightforward. We've had nothing but great experiences so far.

Do you have a favorite technical book or podcast that you would recommend to other tech entrepreneurs?

Harshvardhan: I would recommend The Great CEO Within: The Tactical Guide to Company Building by Matt Mochary. That's definitely a great read.

This is a bit of an open question, so feel free to interpret it how you'd like. What are you currently learning?

Harshvardhan: That's a tough one. You're always learning so many different things on any given day that it's difficult to give one answer. Today, I'm learning marketing strategies, like demand gen, as well as sales tactics to scale the business.

Ashish: Primarily, I'm learning how engineering can augment and support other organizations within the company.

Who are some tech leaders or entrepreneurs that you admire?

Ashish: I admire Jeff Bezos quite a bit, for his laser focus and the clarity of his reasoning.

Harshvardhan: Elon Musk, because of his ideas and execution. One thing that's great about him is how flawlessly he executes his ideas.

Interested in learning more about MongoDB for Startups? Learn more about us here .
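Ashish cites indexing and search support as key reasons for choosing MongoDB. As a generic, hypothetical illustration of those capabilities (not VComply's actual code; the database, collection, and field names are assumptions), here is a minimal PyMongo sketch:

```python
# Generic illustration of MongoDB indexing and text search; all names
# are hypothetical and chosen only to fit the GRC setting above.
from pymongo import MongoClient, ASCENDING, TEXT

client = MongoClient("mongodb://localhost:27017")
policies = client["grc"]["policies"]

# A compound index to keep common dashboard queries fast.
policies.create_index([("orgId", ASCENDING), ("dueDate", ASCENDING)])

# A text index enabling keyword search across policy content.
policies.create_index([("title", TEXT), ("body", TEXT)])

# Find an organization's policy documents that mention "audit".
results = policies.find({"orgId": "org-123", "$text": {"$search": "audit"}})
```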

November 17, 2021
Applied

For Banks, KYC Should Mean More than Just "Knowing Your Client"

Banks in the loan or mortgage business believe they know their clients well, yet they struggle to offer services that capitalize on customer data or to tailor the loan origination experience to the individual, despite the volume of information they already hold. That's one of the key takeaways from Mortgages: A Digital Process to Be Mastered , a new report from MongoDB and FinTechFutures. The report, which surveyed 104 retail banking, business banking, and corporate banking executives, highlights that customer pain is particularly acute when banks and loan originators, despite collecting reams of information about clients, are still unable to turn around loan requests in a timely manner or offer personalized experiences.

Click here to check out the Panel Discussion

Do You Really Know Your Customer?

According to the report, 61% of financial executives said they have industry-leading Know Your Client (KYC) processes. At the same time, however, 43% named poor digital experiences as a barrier to recruiting and retaining customers, with the inability to deliver personalized offers coming in second at 34%. Other commonly cited issues include the speed of innovation, the complexity of doing business, and the inability to serve customers in real time.

So, do banks really know their customers? And if they do, what are they doing with that information, if not using it to better serve those customers? What's holding the industry back? Outdated processes: agents and employees are forced to grapple with manual workflows, shuffling paper piles and building spreadsheets before they can get to work serving the customer. The market is crying out for better automated, data-driven decisions, and legacy systems can't keep up.

Besides the obvious waste of human resources, the lack of a holistic digital offering also hurts business. Customers increasingly cite easy-to-use, transparent mortgage processes, smooth onboarding, and digital workflows as factors behind their choice of lender.

Behind the Numbers

Banks have made some progress digitizing and automating what were once almost exclusively paper-based, manual processes. But the primary driver of this transformation has been compliance with local regulations, rather than an overarching strategy for really getting to know the client and achieving true customer delight.

It's a missed opportunity for a couple of reasons. First, banks create a comprehensive client profile during onboarding. They have enough data to perform risk assessments and personalize offers, but instead they default to "new client onboarding" processes that are more than 30 years old. The simple fact that the consumer is already a long-term client with detailed information on file is ignored. A famous example is asking for pay slips when the bank can already see the customer's monthly (or weekly) salary arriving in their account.

Second, most bank customers stay with their chosen financial institution for their entire lives, and as the client relationship matures, the insight banks can bring to bear becomes deeper and richer. Yet a slow approval process (40%) and slow response times (39%) were two of the top areas in need of improvement cited in the survey . These are not the hallmarks of industry-leading KYC processes or deep client relationships, but of siloed data and a misaligned digital strategy. What we're seeing is the difference between the practice of ensuring compliance with local regulations and a strategic imperative for truly understanding the client.
"While modernization investments have helped automate much of the paper-pushing related to compliance, transforming customer experiences and making the loan origination system (LOS) more transparent have yet to be achieved. Most executives in the survey plan to leverage real-time analytics, AI/ML, and workflow software to improve processes. These are all technologies that can take KYC processes beyond simple compliance use cases and lead to more value-added, personalized client relationships."

Boris Bialek, Global Head, Industry Solutions

Smart Money

Today's clients demand fully modern, mobile-first banking experiences. To meet those expectations, bank executives and IT leaders plan to invest in technologies that address some of their most glaring needs: getting away from manual processes like email and spreadsheets, better data analytics for decision making, and access to real-time information at every touchpoint in the customer journey.

The investments they're willing to make include real-time analytics and artificial intelligence and machine learning (AI/ML). Along with digital customer experiences (for example, chatbots and personalized recommendations), these are the three areas that bank executives and IT leaders say will drive greater market share and profitability in the loans business.

So even though many banks have started the journey toward modernization, they still have further to go before they can meet the expectations of their clients. It's not just about reducing paper-pushing or satisfying the regulatory requirements of the LOS business. It's about personalization and real-time experiences, the hallmarks of true KYC.

Mortgages Are a Digital Process to Be Mastered

If real-time data and AI/ML are the way forward for driving value and transforming customer experiences, they must be accompanied by modernization of the underlying data architecture . As bank executives and IT leaders in the survey acknowledge , the lack of a digitization strategy, speed to market, and costly legacy migration are their top three concerns when digitizing their mortgage processes. De-siloing data and introducing data mesh concepts enables the leap from modernizing legacy infrastructure to digital transformation and competitive advantage.

Banking innovators strive to be first to market, but legacy systems hold them back and stymie digitization strategies. Overcoming these and other challenges requires a modern data domain model that integrates transactional and process workloads and augments customer data with information from other legacy and external systems.

MongoDB Atlas is well suited for this purpose. We have deep experience building customer 360 models that can be mapped to omnichannel interactions. MongoDB also has proven capabilities integrating risk and treasury functions (for mortgages, this means funds transfer pricing and credit risk), and MongoDB Atlas is used by many banks and other financial service providers in the mortgage space, from building societies in the UK to special-purpose lenders in Australia. Lastly, MongoDB's ability to integrate mobile experiences, search capabilities, and real-time analytics (for example, scoring a consumer's rating while that consumer is on a web page) makes it a proven data platform for mortgage modernization and true digital transformation.
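To make the customer 360 idea concrete, here is a minimal, hypothetical sketch of how a document model can consolidate profile, account, and interaction data in a single MongoDB document. All names and values are illustrative assumptions, not a reference implementation:

```python
# Hypothetical customer-360 document: one document brings together data
# that would otherwise live in separate silos. Field names are assumptions.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
customers = client["bank"]["customers"]

customers.insert_one({
    "_id": "cust-001",
    "profile": {"name": "Jane Doe", "segment": "retail"},
    "accounts": [
        {"type": "checking", "balance": 4200.55},
        {"type": "mortgage", "principal": 310000.00, "rate": 2.9},
    ],
    # Recent omnichannel interactions, usable for real-time personalization.
    "interactions": [
        {"channel": "web", "event": "mortgage_calculator",
         "at": datetime(2021, 11, 1, tzinfo=timezone.utc)},
    ],
})

# A loan officer, or a real-time scoring service, can read the full
# picture in a single query instead of joining several siloed systems.
doc = customers.find_one({"_id": "cust-001"})
```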

November 10, 2021
Applied
