MongoDB Applied

Customer stories, use cases and experience

4 Critical Features for a Modern Payments System

The business systems of many traditional banks rely on solutions that are decades old. These systems, built on outdated, inflexible relational databases, prevent traditional banks from competing with industry disruptors and with rivals already adopting more modern approaches. Such outdated systems are ill-equipped to handle one of the core offerings that customers expect from banks today: instantaneous, cashless, digital payments.

The relational database management systems (RDBMSes) at the core of these applications require breaking data structures into a complex web of tables. Originally, this tabular approach was necessary to minimize memory and storage footprints. As hardware has become cheaper and more powerful, however, those advantages have become less relevant, and the complexity of the model now causes data management and programmatic access issues. In this article, we’ll look at how a document database can simplify complexity and provide the scalability, performance, and other features required in modern business applications.

Document model

To stay competitive, many financial institutions will need to update their foundational data architecture and introduce a data platform that enables a flexible, real-time, and enriched customer experience. Without this, new apps and other services won’t be able to deliver significant value to the business.

A document model eliminates the need for an intricate web of related tables. Adding new data to a document is relatively quick and easy, since it can be done without the lengthy reorganization that RDBMSes usually require.

What makes a document database different from a relational database?

- An intuitive data model simplifies and accelerates development work.
- A flexible schema allows modification of fields at any time, without disruptive migrations.
- An expressive query language and rich indexing enhance query flexibility.
- The universal JSON standard lets you structure data to meet application requirements.
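As a sketch of this difference, a payment can live in a single document; the field names and values below are hypothetical, for illustration only:

```python
# A single payment stored as one document, rather than rows scattered
# across payment, party, and event tables in a normalized schema.
payment = {
    "_id": "pay_001",
    "amount": {"value": 125.50, "currency": "EUR"},
    "payer": {"name": "Alice", "iban": "DE89370400440532013000"},
    "payee": {"name": "Bob's Bikes", "iban": "FR1420041010050500013M02606"},
    "status": "settled",
    "events": [
        {"type": "initiated", "ts": "2022-08-01T10:00:00Z"},
        {"type": "settled", "ts": "2022-08-01T10:00:02Z"},
    ],
}

# The whole business object is retrieved in one read; related data is
# navigated with plain nested access instead of joins.
print(payment["payer"]["name"])        # Alice
print(payment["events"][-1]["type"])   # settled
```

Retrieving the same object from a normalized relational schema would typically require joining several tables or calling a stored procedure.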
- A distributed approach improves resiliency and enables global scalability.

With a document database, there is no need for complicated multi-level joins to assemble business objects, such as a bill or even a complex financial derivative, which often require object-relational mapping with complex stored procedures. Such stored procedures, written in custom languages, not only increase the cognitive load on developers but are also fiendishly hard to test. Missing automated tests are a major impediment to adopting agile software development methods.

Required features

Let’s look at four critical features that modern applications require for a successful overhaul of payment systems, and at how MongoDB can help address those needs.

1. Scalability

Modern applications must operate at scales that were unthinkable just a few years ago, in terms of both transaction volume and the number of development and test environments needed to support rapid development. Evolving consumer trends have also put higher demands on payment systems: Not only has the number of transactions increased, but the responsive experiences that customers expect have increased the query load, and data volumes are growing super-linearly.

The fully transactional RDBMS model is ill suited to this level of performance and scale. Consequently, most organizations have created a plethora of caching layers, data warehouses, and aggregation and consolidation layers that add complexity, consume valuable developer time and cognitive load, and increase costs. To work efficiently, developers also need to be able to quickly create and tear down development and test environments, which is only possible by leveraging the cloud. Traditional RDBMSes, however, are ill suited to cloud deployment: They are very sensitive to network latency, because business objects spread across multiple tables can only be retrieved through multiple sequential queries.
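Back-of-the-envelope arithmetic illustrates the latency cost of those sequential queries; the round-trip time and table count below are assumed, illustrative figures, not a benchmark:

```python
# Rough cost of assembling one business object over the network.
round_trip_ms = 2.0          # assumed network round-trip time
tables_in_relational_model = 8   # assumed normalization depth

# Sequential queries: each result is needed to issue the next lookup,
# so latencies add up rather than overlap.
relational_ms = tables_in_relational_model * round_trip_ms

# Document model: the object is stored together and fetched in one query.
document_ms = 1 * round_trip_ms

print(relational_ms, document_ms)  # 16.0 2.0
```

The gap widens with network latency, which is why chatty multi-table access patterns suffer most in cloud deployments.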
MongoDB provides the scalability and performance that modern applications require. MongoDB’s developer data platform also ensures that the same data is available for other frequent consumption patterns, such as time series and full-text search, so there is no need for custom replication code between the operational and analytical datastores.

2. Resiliency

Many existing payment platforms were designed and architected when networking was expensive and slow. They depend on high-quality hardware with low redundancy for resilience. Not only is this approach very expensive, but hardware redundancy alone can never match the resiliency of a distributed system.

At the core of MongoDB’s developer data platform is MongoDB Atlas, the most advanced cloud database service on the market. MongoDB Atlas can run in any cloud, or even across multiple clouds, and offers 99.995% uptime. That amounts to less downtime than is typically needed just to apply necessary security updates to a monolithic legacy database system.

3. Locality and global coverage

Modern computing demands are at once ubiquitous and highly localized. Customers expect to be able to view their cash balances wherever they are, but client secrecy and data availability rules set strict guardrails on where data can be hosted and processed. The combination of geo-sharding, replication, and edge data addresses these problems, and MongoDB Atlas in combination with MongoDB for Mobile brings these powerful tools to the developer.

During the global pandemic, more consumers than ever began using their smartphones as payment terminals. To enable these rich functions, data must be held at the edge. Developing the data synchronization yourself is difficult, however, and not a differentiator for financial institutions. MongoDB for Mobile, combined with MongoDB’s geo-sharding capability on Atlas, offloads this complexity from the developer.

4. Diverse workloads and workload isolation

As more services and opportunities are developed, the demand to use the same data for multiple purposes grows. Legacy systems are well suited to functions such as double-entry accounting, but when the same information has to be served to a customer portal, the central credit engine, or an AI/ML algorithm, the limits of relational databases become obvious.

These limitations have led developers to follow what is often called a “best-of-breed” practice: Data is replicated from the transactional core to a secondary, read-only datastore based on technology better suited to the particular workload. A typical example is a transactional data store being copied nightly into a data lake to be available for AI/ML modelers. The additional hardware and licensing costs for this replication are not prohibitive, but the complexity of the replication and synchronization, and the complicated semantics introduced by batch dumps, slow down development and increase both development and maintenance costs. Often, three or more different technologies are needed to facilitate these usage patterns.

With its developer data platform, MongoDB has integrated this replication, eliminating the complexity for developers. When a document is updated in the transactional datastore, MongoDB automatically makes it available for full-text search and time series analytics.

The pace of change in the payments industry shows no signs of slowing. To stay competitive, it’s vital that you reassess your technology architecture. MongoDB Atlas is emerging as the technology of choice for many financial services firms that want to free their data, empower developers, and embrace disruption. Replacing legacy relational databases with a modern document database is a key step toward enhancing agility, controlling costs, better addressing consumer expectations, and achieving compliance with new regulations.
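As a quick sanity check on the resiliency numbers above, 99.995% uptime works out to roughly 26 minutes of allowable downtime per year:

```python
# Convert a 99.995% uptime commitment into downtime per year.
uptime = 0.99995
minutes_per_year = 365.25 * 24 * 60           # 525960.0
downtime_minutes = minutes_per_year * (1 - uptime)
print(round(downtime_minutes, 1))             # ~26.3 minutes per year
```

A single maintenance window for patching a monolithic legacy database can easily exceed that budget on its own.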
Learn more by downloading our white paper, “Modernize Your Payment Systems.”

August 8, 2022

Navigating the Future of Data Sovereignty With MongoDB

There are 2.5 quintillion bytes of data created every day, and more and more of that data is being stored in a public cloud. The rise of cloud data storage brings with it a focus on data sovereignty. Governments and industry regulatory bodies are cracking down on protecting user data. At any given time, organizations must know where their data is located, replicated, and stored, as well as how it is collected and processed, prioritizing personal data privacy all along the way.

The challenge of GDPR compliance

A PwC survey found that 92% of U.S. companies consider GDPR a top data protection priority, and rightly so: There is pressure from both governments and citizens to protect user data. A recent Vormetric survey found that 85% of American consumers said that if significant personal consequences resulted from their information being compromised in a breach, they would take their business elsewhere. Without a strong handle on data sovereignty, organizations risk millions of dollars in regulatory fines, loss of brand credibility, and distrust from customers.

Where to start with data sovereignty

Creating a proper structure for data sovereignty can be complex, and as big data gets bigger, so will the breadth and depth of regulations. The GDPR of today may not resemble the GDPR of tomorrow, and more laws continue to be rolled out at the federal, state, and industry levels. GDPR, while the most notable, is not the only data regulation policy that businesses must consider. California has rolled out the California Consumer Privacy Act, and numerous countries have similar laws in place to protect consumer data and regulate how data is managed, including Japan, India, Egypt, and Australia. As these regulations continue to be introduced, organizations will need to keep pace to avoid damage to their businesses.
Major considerations that impact data sovereignty include:

- Process: How is your company going to maintain compliance for data sovereignty efficiently?
- Infrastructure: Is a legacy infrastructure holding you back from being able to easily comply with data regulations?
- Scaling: Is your data architecture agile enough to meet regulations quickly as they grow in breadth and complexity?
- Cost: Are you wasting time and money on manual processes to adhere to governmental regulations, and risking the hefty fees attached to noncompliance?
- Penalties: Are your business leaders fully aware of the costs associated with noncompliance? GDPR violations can exact penalties of up to €20 million or 4% of annual global revenue, whichever is greater.

Learn more about strong controls for critical data privacy at our upcoming webinar on queryable encryption.

Managing data sovereignty with MongoDB Atlas

MongoDB enables you to easily comply with most data privacy regulations. MongoDB Atlas, our cloud database as a service, includes intuitive security features and privacy controls, including:

- Queryable encryption: Revolutionary to the industry and currently in preview with MongoDB 6.0, queryable encryption enables encryption of sensitive data on the client side, stored as fully randomized, encrypted data on the database server side. This feature delivers the utmost in security without sacrificing performance, ensuring that even the most critical and sensitive workloads are safe and performant in a public cloud.
- MongoDB Atlas global clusters: It is no longer sustainable or advantageous to build separate application stacks for each geographic area and jurisdiction. Doing so requires more infrastructure, more maintenance, more management, and, in turn, more complexity and more resources exhausted.
Atlas global clusters allow organizations with distributed applications to geographically partition a fully managed deployment in a few clicks and to control the distribution and placement of their data with sophisticated policies that can be easily generated and changed. This means your organization can more easily achieve compliance with regulations containing data residency requirements while also reducing overhead.

- Virtual private clouds (VPCs): Each MongoDB Atlas project is provisioned into its own VPC, isolating your data and underlying systems from other MongoDB Atlas users. This allows businesses to meet data sovereignty requirements while staying highly available within each region. Each shard of data has multiple nodes that automatically and transparently fail over for zero downtime, all within the same jurisdiction. Meeting data residency requirements is another big technical challenge made simple with MongoDB Atlas. Further, businesses can connect Atlas VPCs to customer infrastructure via private networking (including private endpoints and VPC peering) for increased security.
- IP whitelists: IP whitelists allow you to specify the ranges of IP addresses from which access will be granted, delivering granular control over data.
- Client-side field-level encryption (CSFLE): This feature dramatically reduces the risk of unauthorized access to or disclosure of sensitive data. Fields are encrypted before they leave your application, protecting them everywhere: in motion over the network, in database memory, at rest in storage and backups, and in system logs.

Dig deeper into data sovereignty

To learn more about strong controls for critical data privacy, join MongoDB’s webinar on August 24, 2022.
Our experts will focus on queryable encryption, the industry’s first encrypted search scheme, and on how, with MongoDB Atlas, your data is protected with preconfigured security features for authentication, authorization, encryption, and more. Register for our queryable encryption webinar.
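The client-side pattern described above, where fields are encrypted before they ever leave the application, can be sketched with a toy example. The cipher below is a simplified stdlib-only stream cipher for illustration; it is not MongoDB’s CSFLE or queryable encryption, which use vetted, authenticated encryption managed by the drivers and a key vault:

```python
import base64
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key and a per-field nonce.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: str) -> str:
    nonce = os.urandom(12)
    data = plaintext.encode()
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    return base64.b64encode(nonce + ct).decode()

def decrypt_field(key: bytes, token: str) -> str:
    raw = base64.b64decode(token)
    nonce, ct = raw[:12], raw[12:]
    pt = bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
    return pt.decode()

key = os.urandom(32)  # in practice, keys come from a managed key vault
doc = {"name": "Alice", "ssn": encrypt_field(key, "123-45-6789")}
# The server only ever sees the ciphertext stored in doc["ssn"].
print(decrypt_field(key, doc["ssn"]))  # 123-45-6789
```

The point of the pattern is that the database, its backups, and its logs only ever hold ciphertext; the plaintext exists only inside the application that holds the key.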

August 3, 2022

Tools for Implementing Zero Trust Security With MongoDB

The practice of protecting IT environments from unauthorized access used to center on perimeter security: securing the perimeter but allowing unrestricted access inside it. As users became increasingly mobile and IT assets became increasingly dispersed, however, the notion of a network perimeter became obsolete. That strategy has now been replaced by the concept of zero trust.

In a zero trust environment, the perimeter is assumed to have been breached. There are no trusted users, and no user or device gains trust simply because of its physical or network location. Every user, device, and connection must be continually verified and audited.

MongoDB offers several tools and features for integrating our products into a zero trust environment, including:

- Security by default
- Multiple forms of authentication
- TLS and SSL encryption
- X.509 security certificates
- Role-based access control (RBAC)
- Database authentication logs
- Encryption for data at rest, in flight, and in use

For government customers, MongoDB Atlas for Government is FedRAMP-ready.

Security by default

MongoDB Atlas clusters do not allow any connectivity to the internet when they’re first spun up. Each dedicated MongoDB Atlas cluster is deployed in a unique virtual private cloud (VPC) configured to prohibit inbound access. (Free and shared clusters do not support VPCs.) The only way to access these clusters is through the MongoDB Atlas interface. Users can configure IP access lists to allow certain addresses to attempt to authenticate to the database. Without being included on such a list, application servers are unable to access the database. Even the person who sets up a cluster needs to add their IP address to the access list.

To find out more about the security measures that protect our cloud-based database, MongoDB Atlas, and the rules governing employee access, read our whitepaper, MongoDB: Capabilities for Use in a Zero Trust Environment.
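The IP access-list gating described above can be sketched in a few lines; the CIDR ranges below are hypothetical examples:

```python
import ipaddress

# Hypothetical access list: only these networks may attempt to authenticate.
ACCESS_LIST = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.7/32"),
]

def is_allowed(client_ip: str) -> bool:
    # A client may proceed to authentication only if its address falls
    # inside one of the configured networks.
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ACCESS_LIST)

print(is_allowed("203.0.113.42"))  # True
print(is_allowed("192.0.2.1"))     # False
```

Note that passing the access list only grants the right to attempt authentication; credentials are still required.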
Authentication

Customers have several options for allowing users to authenticate themselves to a database, including a username and password, LDAP proxy authentication, and Kerberos authentication. All forms of MongoDB support Transport Layer Security (TLS) and SCRAM authentication, which are turned on by default and cannot be disabled. Traffic from clients to MongoDB Atlas is authenticated and encrypted in transit, and traffic between a customer’s internally managed MongoDB nodes is also authenticated and encrypted in transit using TLS.

For passwordless authentication, MongoDB offers two options to support the use of X.509 certificates. The first option, called “easy,” auto-generates the certificates needed to authenticate database users. The “advanced” option is for organizations that already use X.509 certificates and have a certificate management infrastructure in place; it can be combined with LDAPS for authorization.

Access infrastructure can only be reached via bastion hosts and by users for whom senior management has approved backend access. These hosts require multifactor authentication and are configured to require SSH keys, not passwords.

Logging and auditing

MongoDB supports a wide variety of auditing strategies, making it easier to monitor your zero trust environment and ensure that it remains in force and encompasses your database. Administrators can configure MongoDB to log all actions, or they can apply filters to capture only specific events, users, or roles. Role-based auditing lets you log and report activities by specific role, such as userAdmin or dbAdmin, coupled with any roles inherited by each user, rather than having to extract activity for each individual administrator. This approach makes it easier for organizations to enforce end-to-end operational control and maintain the insight necessary for compliance and reporting.
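A sketch of the role-based filtering idea over JSON audit events follows. The event shape loosely follows MongoDB’s JSON audit format (atype, users, roles fields) but is simplified for illustration:

```python
import json

# Two simplified audit events, one line of JSON each.
audit_log = """
{"atype": "authenticate", "users": [{"user": "svc_app", "db": "admin"}], "roles": [{"role": "readWrite", "db": "payments"}]}
{"atype": "dropCollection", "users": [{"user": "jane", "db": "admin"}], "roles": [{"role": "dbAdmin", "db": "payments"}]}
""".strip().splitlines()

def events_for_role(lines, role):
    # Keep only events performed under the given role.
    events = (json.loads(line) for line in lines)
    return [e for e in events if any(r["role"] == role for r in e.get("roles", []))]

print([e["atype"] for e in events_for_role(audit_log, "dbAdmin")])  # ['dropCollection']
```

Filtering by role rather than by individual user is what keeps the audit trail manageable as administrators come and go.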
The audit log can be written to multiple destinations in a variety of formats: to the console and syslog in JSON, and to a file in JSON or BSON. It can then be loaded into MongoDB and analyzed to identify relevant events.

Encryption

MongoDB also lets you encrypt data in flight, at rest, and even, with field-level encryption and queryable encryption, in use. For data in motion, all versions of MongoDB support TLS and SSL encryption. For data at rest, MongoDB supports AES-256 encryption, and it can also be configured for FIPS compliance.

To encrypt data while it is in use, MongoDB offers client-side field-level encryption, which can be implemented to safeguard data even from database administrators and vendors who would otherwise have access to it. Securing data with client-side field-level encryption allows you to move to managed services in the cloud with greater confidence: The database only works with encrypted fields, and organizations control their own encryption keys rather than having the database provider manage them. This additional layer of security enforces an even more fine-grained separation of duties between those who use the database and those who administer and manage it.

MongoDB Atlas exclusively offers queryable encryption, which allows customers to run rich, expressive queries on fully randomized encrypted data with efficiency, improving both the development process and the user experience. Organizations are able to protect their business by confidently storing sensitive data and meeting compliance requirements.

Zero trust and MongoDB

MongoDB is optimally suited for use within a zero trust environment. It is secure by default and has developed industry-leading capabilities in key areas such as access, authorization, and encryption. Used together, these features help protect the database from outside attackers and from internal users who could otherwise gain an unauthorized level of access.
For more detailed information about security features in MongoDB, read our whitepaper, MongoDB: Capabilities for Use in a Zero Trust Environment .

August 2, 2022

Connected Data: How IoT Will Save Healthcare and Why MongoDB Matters

Over the next decade, healthcare systems around the world will face a two-fold challenge: delivering higher quality care while managing rising costs, and doing so for increasingly large populations of patients. For decades, healthcare systems have operated predominantly on traditional fee-for-service models, in which providers are reimbursed based on the services rendered. Value-based healthcare, in contrast, attempts to lower the cost of care by keeping patients healthier longer through more effective and efficient use of healthcare systems.

This article, Part 2 of our series on connected healthcare data, looks at how IoT, with support from MongoDB, can help meet future healthcare challenges. Read Part 1 of this series on connected healthcare data.

Increased demand

It’s expected that by 2050, 22% of the world’s population will be over 60 years old. This adds pressure to the goals of optimizing both patient outcomes and healthcare spend, because there are more people within healthcare systems than ever before, and as these patients live longer, they experience more chronic conditions and therefore require more care. Constraints on the ability to graduate enough doctors and nurses to meet this surge in demand suggest that innovation will be needed to provide adequate supply.

Additionally, many healthcare services are delivered in an exam or hospital room, where patient vitals and observations are captured, a chart is reviewed, and medications and treatments are ordered. According to a recent study from the Annals of Internal Medicine, providers spend more than 16 minutes per encounter on these tasks alone. Observation and data collection are critical to identifying and subsequently adjusting treatment pathways; however, the process is heavily reliant on in-person visits.

How IoT will save healthcare

Global adoption of the Internet of Things (IoT) is soaring across numerous industries.
In fact, healthcare is forecast to be the second-largest industry by IoT value by 2030. IoT offers the ability to remotely monitor patients via wearables and connected devices. It provides the means to collect data beyond the patient exam or hospital room and can help providers deliver care outside of traditional, in-person settings. With this power to collect more information, more often, with fewer patient encounters, IoT plays a role in solving the two-fold challenge of delivering better quality care for increasingly large populations of patients.

A patient wearing a smartwatch, for example, may be able to stream heart rate and oxygen saturation levels during real-world activities to an electronic healthcare record, where the data can be aggregated and summarized for a physician to review, or even periodically interrogated by a machine-learning algorithm. IoT devices can help collect more data, more often, enabling providers to deliver more meaningful, timely, and impactful healthcare recommendations and treatments. Through this added value, IoT can further the benefits of telemedicine and promote the idea of “care anywhere,” in which healthcare is not directly tied to or dependent upon in-person encounters.

Challenges of healthcare data on the move

What challenges face developers when it comes to capturing and leveraging data from healthcare IoT devices? Four significant capabilities top the list, which we will look at in turn:

- Scalable and efficient storage
- Global coverage and data synchronization
- Interoperability
- Security and privacy

Scalable and efficient storage

IoT devices can produce massive volumes of continuous data. In fact, market intelligence provider International Data Corporation (IDC) predicts that IoT devices alone will produce 74.9 ZB of data by 2025, from a staggering 55.9 billion devices.
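As a small illustration of the smartwatch scenario described earlier, here is a sketch of summarizing a stream of heart-rate samples for a physician to review; the patient ID and readings are made up:

```python
from statistics import mean

# Simulated heart-rate samples streamed from a smartwatch (bpm).
samples = [62, 64, 90, 118, 121, 97, 70, 65]

# Condense the raw stream into a compact summary document.
summary = {
    "patient_id": "p-001",
    "metric": "heart_rate",
    "min": min(samples),
    "max": max(samples),
    "avg": round(mean(samples), 1),
}
print(summary)
```

In practice this kind of roll-up happens continuously over millions of devices, which is exactly why storage efficiency and scalability matter.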
A cloud-based developer data platform will be critical to support these kinds of massive data volumes, which may also exhibit unpredictable peaks in workload. Additionally, as in many IoT use cases, often only the most recent data is used for analysis. In this scenario, the ability to automatically archive raw and historical data to more cost-effective storage, while still being able to query it when needed, is ideal. MongoDB’s Atlas Online Archive lets developers do just that, with minimal setup and configuration required, as shown in Figure 1.

Figure 1. MongoDB automates data tiering while keeping it queryable with Atlas Online Archive.

Not all databases are ready to deal with the massive, continuous data generated by IoT devices. Sensor data is typically collected at high frequency, which may mean high concurrency of writes, unpredictable workload peaks, and the need for dynamic scalability. Additionally, IoT data is almost by definition time-series data: It typically comes with a timestamp that allows following the evolution of a parameter through time, at regular or irregular intervals. Storing time-series data efficiently at scale can be difficult; in fact, specialized time-series databases exist to tackle such workloads. And storing the data is only one side of the challenge. The other involves running analytics as the data is collected, such as discovering heart anomalies and alerting the patient in real time.

Using a specialized time-series database solves these challenges but also introduces new ones:

- Developers need to learn the nuances of working with a niche platform, slowing development cycles.
- ETL pipelines must be built and maintained to move and merge data across different platforms.
- An additional database platform must be integrated, secured, and maintained, increasing operational overhead.
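The bucketing idea behind efficient time-series storage can be illustrated in plain Python. This is a simplified sketch of the concept, grouping many readings into one bucket per hour, not a description of how MongoDB’s time series collections are implemented internally:

```python
from collections import defaultdict
from datetime import datetime

# Raw sensor readings: (timestamp, heart rate in bpm).
readings = [
    ("2022-07-25T10:00:05", 64),
    ("2022-07-25T10:20:40", 71),
    ("2022-07-25T11:02:13", 88),
]

# Group measurements into hourly buckets: one stored document per bucket
# instead of one document per reading, cutting per-document overhead.
buckets = defaultdict(list)
for ts, bpm in readings:
    hour = datetime.fromisoformat(ts).strftime("%Y-%m-%dT%H:00")
    buckets[hour].append(bpm)

print(dict(buckets))  # {'2022-07-25T10:00': [64, 71], '2022-07-25T11:00': [88]}
```

Bucketing also makes range queries cheap, since a whole hour of measurements is fetched in a single read.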
MongoDB’s new time series collection feature allows you to automatically optimize your schema and deployment for high storage efficiency and low-latency queries, without the need for an additional, niche database. Additionally, MongoDB integrates time-series data with operational data and analytics capabilities in one unified environment with built-in scalability, delivering the performance your IoT applications need while simplifying your architecture.

Global coverage and data synchronization

In many IoT scenarios, users are effectively on the move: They go to work, they go shopping, and they get on planes to see the beautiful new shiny star on top of Barcelona’s Sagrada Família. With all of this mobility, they might lose connectivity for a few minutes or even hours. Tracking their health effectively in real time is not just a nice-to-have feature; it may be mandatory. Using MongoDB’s Atlas Device Sync, developers can easily deploy IoT applications that seamlessly handle drops in connectivity without missing critical write operations for the most important data workloads.

Interoperability

Most IoT devices use proprietary protocols and operating systems, which seriously limit interoperability. The IoT industry advocates the use of standard communication protocols such as MQTT, but, as of this writing, there is no single industry standard. Custom solutions exist that serve a single type of sensor or healthcare provider, but these solutions tend to suffer from interoperability challenges when interlinking data across different healthcare networks. As discussed in our first post, sharing healthcare data across the different participants of the healthcare ecosystem requires standards such as JSON-based FHIR, which is key to mitigating healthcare fragmentation.

Learn how we used MongoDB and MQTT to “listen” and “talk” remotely to an IoT-powered facility. Downloadable code available.
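Because FHIR resources are JSON, they map directly onto documents with no relational decomposition step. A minimal, abbreviated FHIR Observation for a heart-rate reading might look like the following; the patient reference and values are illustrative:

```python
# A minimal FHIR R4 Observation resource for a heart-rate reading,
# expressed as a plain JSON-style document (abbreviated for illustration).
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{"system": "http://loinc.org", "code": "8867-4",
                    "display": "Heart rate"}]
    },
    "subject": {"reference": "Patient/p-001"},
    "effectiveDateTime": "2022-07-25T10:00:05Z",
    "valueQuantity": {"value": 64, "unit": "beats/minute",
                      "system": "http://unitsofmeasure.org", "code": "/min"},
}

# The resource can be stored and queried as-is in a document database.
print(observation["valueQuantity"]["value"])  # 64
```

This one-to-one mapping between the wire format and the stored format is what removes an entire translation layer from interoperability work.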
Security and privacy

Given its sensitive and personal nature (and relatively easy monetization through theft), health data is especially appealing to bad actors. The number of security incidents impacting healthcare systems is sobering: According to a report by CrowdStrike, 82% of health systems experienced some form of IoT cyberattack in 2020. With IoT proliferation on the rise, the need for the highest level of security, at both the application level and the database level, becomes non-negotiable. Unsurprisingly, McKinsey cites interoperability, security, and privacy as major headwinds for IoT adoption, especially in healthcare.

How MongoDB supports IoT challenges

Here’s how MongoDB helps developers bring IoT applications to market faster:

Scalability and efficient storage

- High availability and scalability are built in via replication and native sharding.
- Online Archive automatically archives aged data to fully managed cloud object storage, so you can optimize cost and performance without sacrificing data accessibility.
- Time series collections automatically optimize your schema for high storage efficiency, low-latency queries, and real-time analytics.

Global coverage and data synchronization

- MongoDB Atlas is a global, multi-cloud platform that lets your apps run anywhere in the world.
- Atlas Device Sync solves conflict resolution and keeps your data up to date across devices, users, and your backend, regardless of connectivity.

Interoperability

- The document model provides a flexible schema and maps exactly to the objects that developers work with in their code.
- Different industry communication standards, such as FHIR, are being built over JSON, which is a natural fit for MongoDB’s document model.

Security and privacy

- With MongoDB Client-Side Field Level Encryption, data is encrypted in motion, in memory, and at rest.
- Queryable Encryption allows running expressive queries on fully randomized encrypted data.
- MongoDB provides the strongest levels of data privacy and security for regulated workloads.

MongoDB Atlas takes care of the backend, removing friction from the development process and simplifying your technology stack, so you can focus on building differentiating features for your applications. Atlas is a developer data platform that supports a broad array of use cases, from operational through transactional to analytical workloads. Atlas also offers the following:

- The ability to serve more of the data lifecycle: enabling development teams to seamlessly analyze, transform, and move data while reducing reliance on batch processes or ETL jobs
- A modern data model: aligning with the way developers think and code
- An integrated experience: delivering an elegant developer experience

Figure 2. Atlas is a developer data platform built on three pillars: the document model, a unified interface for different data use cases, and a multi-cloud, enterprise-ready foundation.

MongoDB for IoT-powered healthcare apps

IoT, and specifically wearables, will play a major role in solving the two-fold challenge of delivering better quality care for increasingly large populations of patients. The soaring adoption of wearables is accelerating the need for a developer data platform that helps software delivery teams build and manage health applications with:

- Scalable and efficient storage
- Global coverage and data synchronization
- Interoperability
- Security and privacy

MongoDB Atlas is a developer data platform designed to manage the heavy lifting for you, providing an elegant developer experience and unifying a broad range of data workloads with world-class privacy and security features.

Read Part 1 of this series on connected healthcare data, and learn more about MongoDB Atlas and the healthcare industry.

July 25, 2022

Mobile Edge Computing, Part 1: Delivering Data Faster with Verizon 5G Edge and MongoDB

As you’ve probably heard, 5G is changing everything, and it’s unlocking new opportunities for innovators in one sector after another. By pairing the power of 5G networks with intelligent software, customers are beginning to embrace the next generation of industry: powering the IoT boom, enhancing smart factory operations, and more. But how can companies that already leverage data for daily operations start using data for innovation? In this article series, we’ll explore how the speed, throughput, reliability, and responsiveness of the Verizon network, paired with the sophistication of the next-generation MongoDB developer data platform, are poised to transform industries including manufacturing, agriculture, and automotive.

Mobile edge computing: The basics

Companies everywhere are facing a new cloud computing paradigm that combines the best of hyperscaler compute and storage with the topological proximity of 5G networks. Mobile edge computing, or MEC, introduces a new mode of cloud deployment whereby enterprises can run applications (through virtual machines, containers, or Kubernetes clusters) within the 5G network itself, across both public and private networks.

Before we dive in, let’s define a few key terms:

- Mobile edge computing: the ability to deploy compute and storage closer to the end user
- Public mobile edge computing: compute and storage deployed within the carrier’s data centers
- Private mobile edge computing: on-premises provisioned compute and storage

Verizon 5G Edge, Verizon’s mobile edge compute portfolio, takes these concepts from theoretical to practical. By creating a unified compute mesh across both public and private networks, Verizon 5G Edge produces a seamless exchange of data and stateful workloads: a simultaneous deployment of both public and private MEC best characterized as hybrid MEC. In this article, we’ll focus primarily on public MEC deployment.
Although MEC vastly increases the flexibility of data usage by both practitioners and end users, the technology is not without its challenges, including:

Deployment: Given a dynamic fleet of devices, in an environment with 20-plus edge zones across both public and private MEC, to which edge(s) should the application be deployed?
Orchestration: For Day 2 operations and beyond, what set of environmental changes — be it on the cloud, the network, or the device(s) — should trigger a change to my edge environment?
Edge discovery: Throughout the application lifecycle, for a given connected device, which edge is the optimal endpoint for connection?

Fortunately for developers, Verizon has developed a suite of network APIs tailored to answer these questions. From edge discovery and network performance to workload orchestration and network management, Verizon has drastically simplified the level of effort required to build resilient, highly available applications at the network edge, without the undifferentiated heavy lifting previously required.

Edge discovery API workflow

Using the Verizon edge discovery API, customers can let Verizon manage the complexity of maintaining the service registry as well as identifying the optimal endpoint for a given mobile device. In other words, with the edge discovery API workflow in place of self-implemented latency tests, a single request-response is all that is needed to identify the optimal endpoint, as shown in Figure 1.

Figure 1. A single request-response is used to identify the optimal endpoint.

Although this API addresses challenges of service discovery, routing, and some advanced deployment scenarios, other challenges exist outside the scope of the underlying network APIs. In the case of stateful workloads, for example, how might you manage the underlying data generated from your device fleet? Should all of the data live at the edge, or should it be replicated to the cloud? What about replication to the other edge endpoints?
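To make the single request-response workflow in Figure 1 concrete, here is a minimal sketch in Python. The field names, profile ID, and response shape are illustrative assumptions, not the actual Verizon Edge Discovery Service contract:

```python
# Hypothetical sketch of the edge discovery request-response in Figure 1.
# Field names and the response shape are illustrative assumptions, not the
# actual Verizon Edge Discovery Service contract.

def build_discovery_request(device_ip: str, service_profile_id: str) -> dict:
    """Assemble the single request that asks for the optimal edge endpoint."""
    return {
        "UEIdentityType": "IPAddress",   # identify the device by its IP
        "UEIdentity": device_ip,
        "serviceProfileId": service_profile_id,
    }

def pick_endpoint(response: dict) -> str:
    """Read the optimal endpoint out of a discovery response."""
    # Assumed response shape: endpoints sorted best-first by the service.
    return response["serviceEndpoints"][0]["URI"]

request = build_discovery_request("10.0.0.42", "profile-low-latency")
reply = {  # stand-in for the service's answer
    "serviceEndpoints": [{"URI": "https://edge-east.example.net/api"}]
}
print(pick_endpoint(reply))
```

The point of the workflow is that the client never runs its own latency tests; one request out, one ranked answer back.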
Using the suite of MongoDB services coupled with Verizon 5G Edge and its network APIs, we will describe popular reference architectures for data across the hybrid edge.

Delivering data with MongoDB

Through Verizon 5G Edge, developers can now deploy the parts of their application that require low latency at the edge of 4G and 5G networks using the same APIs, tools, and functionality they use today, while seamlessly connecting back to the rest of their application and the full range of cloud services running in a cloud region. For many of these use cases, however, a persistent storage layer is required that extends beyond the native storage and database capabilities of the hyperscalers at the edge. Given the number of different edge locations where an application can be deployed and consumers can connect, ensuring that appropriate data is available at the edge is critical. It is also important to note that where consumers are mobile (e.g., vehicles), the optimal edge location can vary. At the same time, keeping a complete copy of the entire dataset at every edge location is neither desirable nor practical, due to the potentially large volumes of data being managed and the multi-edge data synchronization challenges that would be introduced.

The Atlas solution

The solution requires an instantaneous and comprehensive overview of the dataset stored in the cloud, while synchronizing only the required data to dedicated edge data stores on demand. For many cases, such as digital twins, this synchronization needs to be bi-directional and may include conflict resolution logic. For others, a simpler unidirectional data sync suffices. These requirements call for a next-gen data platform, equipped to simplify data management while also delivering data in an instant. MongoDB Atlas is the ideal solution for the central, cloud-based datastore.
Atlas provides organizations with a fully managed, elastically scalable application data platform upon which to build modern applications. MongoDB Atlas can be deployed simultaneously across any of the three major cloud providers (Amazon Web Services, Microsoft Azure, and Google Cloud Platform) and is a natural choice to act as the central data hub in an edge- or multi-edge-based architecture, because it enables diverse data to be ingested, persisted, and served in ways that support a growing variety of use cases. Central to MongoDB Atlas is the MongoDB database, which combines a flexible document-based model with advanced querying and indexing capabilities. Atlas is, however, more than just the MongoDB database; it includes many other components to power advanced applications with diverse data requirements, such as native search capabilities, real-time analytics, BI integration, and more. Read the next post in this blog series to explore the real-world applications and innovations being powered by mobile edge computing.
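The unidirectional, on-demand sync described above amounts to a routing rule: each edge receives only the slice of the central dataset it needs. A minimal sketch, where the document shape and the region-to-edge mapping are hypothetical:

```python
# Hypothetical sketch of unidirectional cloud-to-edge sync: only the
# documents an edge location actually needs are pushed to it.
# Field names and the region-to-edge mapping are illustrative assumptions.

EDGE_REGIONS = {
    "edge-east": {"us-east"},
    "edge-west": {"us-west", "us-central"},
}

def documents_for_edge(edge_id: str, documents: list) -> list:
    """Select the subset of the central dataset to replicate to one edge."""
    regions = EDGE_REGIONS.get(edge_id, set())
    return [doc for doc in documents if doc.get("region") in regions]

central = [
    {"_id": 1, "region": "us-east", "payload": "sensor batch A"},
    {"_id": 2, "region": "us-west", "payload": "sensor batch B"},
]
print(documents_for_edge("edge-east", central))
```

A bi-directional variant would additionally merge edge-side writes back into the central store, which is where conflict resolution logic comes in.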

July 21, 2022

Mobile Edge Computing, Part 2: Computing in the Real World

It would be easy to conceptualize mobile edge computing (MEC) as a telecommunications-specific technology, but, in fact, edge computing has far-reaching implications for real-world use cases across many different industries. Any organization that requires a solution to common data usage challenges, such as low-latency data processing, cloud-to-network traffic management, Internet of Things (IoT) application development, data sovereignty, and more, can benefit from edge-based architectures. In our previous article, we discussed what mobile edge computing is, how it helps developers increase data usage flexibility, and how Verizon 5G Edge and MongoDB work in conjunction to enable data computing at the edge, as shown in Figure 1.

Figure 1. Verizon and MongoDB work in conjunction to deliver data to consumers and producers faster than ever with mobile edge computing.

In this article, we’ll look at real-world examples of how mobile edge computing is transforming the manufacturing, agriculture, and automotive industries.

Smart manufacturing

Modern industrial manufacturing processes are making greater use of connected devices to optimize production while controlling costs. Connected IoT devices exist throughout the process, from sensors on manufacturing equipment to mobile devices used by employees on the factory floor to connected vehicles transporting goods — all generating large amounts of data. For companies to realize the benefits of all this data, it is critical that the data be processed and analyzed in real time to enable rapid action. Moving this data from the devices to the cloud for processing introduces unnecessary latency and data transmission that can be avoided by processing at the edge. As seen in Figure 2, for example, sensors, devices, and other data sources in the smart factory use the Verizon 5G Edge Discovery Service to determine the optimal edge location.
After that, data is sent to the edge, where it is processed before being persisted and synchronized with MongoDB Atlas — all in an instant.

Figure 2. Data sources in smart factories use the Verizon 5G Edge Discovery Service to determine the optimal edge location.

Process optimization

Through real-time processing of telemetry data, it’s possible to make automated, near-instantaneous changes to the configuration of industrial machinery in response to data relayed from a production line. Potential benefits of such a process include improved product quality, increased yield, optimized use of raw materials, and the ability to track standard key performance indicators (KPIs), such as overall equipment effectiveness (OEE).

Preventative maintenance

Similar to process optimization, real-time processing of telemetry data can enable the identification of impending machinery malfunctions before they occur and result in production downtime. More critically, if a situation has the potential either to damage equipment or to pose a danger to those working in the vicinity, the ability to perform an automatic shutdown as soon as the condition is detected is vital.

Agriculture

One of the most powerful uses of data analytics at scale can be seen in the agriculture sector. For decades, researchers have grappled with challenges such as optimal plant breeding and seed design, which to date have been largely manual processes. Through purpose-built drones and ground robotics, new ways to conduct in-field inspection using computer vision have been used to collect information on height, biomass, and early vigor, and to detect anomalies. However, these robots are often purpose-built with large data systems on-device, requiring manual labor to upload the data to the cloud for post-processing. Using the edge, this entire workflow can be optimized.
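The OEE KPI mentioned under process optimization is simply the product of three ratios, each computable from the kind of telemetry an edge node receives. A minimal sketch, with all shift figures hypothetical:

```python
# Overall equipment effectiveness (OEE) = availability x performance x quality.
# The telemetry figures below are hypothetical, for illustration only.

def oee(run_time_min: float, planned_time_min: float,
        actual_count: int, target_count: int,
        good_count: int) -> float:
    availability = run_time_min / planned_time_min   # uptime vs. schedule
    performance = actual_count / target_count        # output vs. ideal rate
    quality = good_count / actual_count              # good units vs. all units
    return availability * performance * quality

# A shift: 420 planned minutes, 378 running; 950 of a 1,000-unit target
# produced, 931 of them good.
score = oee(378, 420, 950, 1000, 931)
print(f"OEE: {score:.1%}")  # OEE: 83.8%
```

Tracking this figure per machine at the edge lets a line react to a dip (say, quality falling) within seconds rather than after a cloud round trip.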
Starting with the ground robotics fleet, the device can be retrofitted with a 5G modem to disintermediate much of the persistent data collection. Instead, the device can collect data locally, extract relevant metadata, and immediately push data to the edge for real-time analytics and anomaly detection. In this way, field operators can collect insights about the entirety of their operations — across a given crop field or nationwide — without waiting for the completion of a given task.

Automotive

Modern vehicles are more connected than ever before, with almost all models produced today containing embedded SIM cards that enable ever more connected experiences. Additionally, parallel advances are being made to enable roadside infrastructure connectivity. Together, these advances will power increased data sharing not just between vehicles (V2V) but also between vehicles and the surrounding environment (V2X). In the shorter term, edge-based data processing has the potential to yield many benefits, both to road users and to vehicle manufacturers.

Data quality and bandwidth optimization

Modern vehicles can transmit large amounts of data, not only telemetry relating to the status of the vehicle but also the observed status of the roads. If a vehicle detects that it is in a traffic jam, for example, it might relay this information so that updates can be made available to other vehicles in the area to alert drivers or replan programmed routes, as shown in Figure 3.

Figure 3. Mobile edge computing enables data generated from multiple sources within a vehicle to be shared instantly.

Although this is a useful feature, many vehicles may be reporting the same information. By default, all of this information will be relayed to the cloud for processing, which can result in large amounts of redundant data. Instead, through edge-based processing:

Data is shared more quickly between vehicles in a given area using only local resources.
Costs relating to cloud-based data transfer are better controlled.
Network bandwidth usage is optimized.

While improving control of network usage is clearly beneficial, arguably a more compelling use of edge-based processing in the automotive industry relates to aggregating data received from many vehicles to improve the quality of data sent to the cloud-based data store. In the example of a traffic jam, all of the vehicles transmitting information about the road conditions do so based on their understanding gained through GPS as well as internal sensors. Some vehicles will send more complete or accurate data than others, but by aggregating the many different data feeds at the edge, this process produces a more accurate, complete representation of the situation.

The future

Read Part 1 of this blog series. Download our latest book on computing at the edge.
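The edge-side aggregation of vehicle reports described above can be sketched as a simple reduce step: many overlapping reports about the same road segment become one higher-quality record before anything goes upstream. Field names and the congestion threshold are hypothetical:

```python
# Hypothetical sketch of edge-side aggregation: overlapping per-vehicle
# reports about one road segment are fused into a single record before
# being relayed to the cloud. Field names are illustrative.

def aggregate_reports(reports: list) -> dict:
    """Fuse per-vehicle reports for one road segment into a single record."""
    speeds = [r["speed_kph"] for r in reports]
    avg_speed = sum(speeds) / len(speeds)
    return {
        "segment": reports[0]["segment"],
        "avg_speed_kph": avg_speed,
        "congested": avg_speed < 20,      # assumed congestion threshold
        "samples": len(reports),          # more samples -> higher confidence
    }

reports = [
    {"segment": "A1-km42", "speed_kph": 12},
    {"segment": "A1-km42", "speed_kph": 8},
    {"segment": "A1-km42", "speed_kph": 10},
]
print(aggregate_reports(reports))
# One fused record goes upstream instead of three near-duplicates.
```

The sample count doubles as a crude confidence signal: the cloud store can weight a segment report fused from dozens of vehicles above one from a single car.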

July 21, 2022

The Power of Qualified Research That Companies Can Trust

Research is invaluable. Companies need research on prospective and current customers to analyze trends, evaluate purchasing decisions, and produce the best products that they can. Customers need research on what to buy, where to buy it, and who to buy it from. Phonic is a research software startup with a mission to break down the barriers between quantitative and qualitative research, allowing businesses to collect genuine insights at a scale that they can trust. Phonic CTO Mitch Catoen says that the pain point the company is addressing revolves around scaling. “Qualitative research is really good, and yields really good information, but it doesn’t scale quite like quantitative research does,” Catoen explains. The key, Catoen says, is that Phonic takes an analytics-first approach. “When we think about emotional intelligence, multimodal sentiment, and tag extraction, these are things that were built for our research platform, and the research platform sits on top of them. The customers of Phonic know they can trust our analytics more than any other platform.” When it came time to decide how Phonic was going to build its technology, one of the most critical pieces the company wanted was a NoSQL database. Catoen wanted Phonic to pick a NoSQL, document-based database because Phonic was changing its schema frequently and did not have a rigid data model from day one. For these reasons, Catoen says that working with MongoDB Atlas was “a pretty obvious choice.” The decision to go with MongoDB meant that Catoen and other company leaders could spend their valuable time thinking about the business and how to provide actual business value for their customers. “With MongoDB, we effectively leverage the entire core feature set,” Catoen says. “We run a lot of aggregation pipelines, which are super, super useful when dealing with large amounts of data.
We can scale up and down with our cluster to support more, and that’s been fantastic.” Phonic’s tech stack is deliberately simple in order to keep developer velocity high, Catoen says, the same motivation the team had when picking its NoSQL database. Phonic runs a React frontend and a Python backend, communicates with its MongoDB cluster, and uses RabbitMQ for event streaming. Google Cloud has been a critical part of Phonic’s success as well, Catoen says. Phonic uses Google Cloud’s cloud functions, its storage for distributed files, and App Engine for its auto-scaling, especially when the company gets hit with a jump in traffic overnight. As for plans for the future, Catoen says the company is looking for a Series A round of funding and will launch an asynchronous research product. “Building up this conversation intelligence suite is going to be very important to the Phonic ecosystem going forward,” Catoen says. “We’re very excited about that.” Learn more about the MongoDB for Startups program.
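Aggregation pipelines like the ones Catoen describes are expressed as an ordered list of stages. A minimal sketch, shown as a plain Python structure rather than run against a live cluster; the collection, survey ID, and field names are made up:

```python
# A hypothetical aggregation pipeline counting survey responses by
# extracted sentiment tag. The stage operators ($match, $group, $sort) are
# standard MongoDB; the collection and field names are illustrative.

pipeline = [
    {"$match": {"survey_id": "sv-2022-07"}},      # narrow to one survey
    {"$group": {
        "_id": "$sentiment",                      # bucket by sentiment tag
        "responses": {"$sum": 1},
        "avg_confidence": {"$avg": "$confidence"},
    }},
    {"$sort": {"responses": -1}},                 # most common tag first
]

# Against a live cluster (e.g., with pymongo) this would run as:
#   results = db.responses.aggregate(pipeline)
print(len(pipeline), "stages")
```

Because each stage feeds the next, pipelines like this push the heavy lifting (filtering, grouping, averaging) into the database rather than application code, which is what makes them useful for large datasets.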

July 13, 2022

Rockets, Rock ’n’ Roll, and Relational Databases — a Look Back at the Year of RDBMS

When you reach a certain age, you’d rather not be reminded of how old you are on your birthday. At 52 years old this summer, the relational database management system (RDBMS) has reached that point. But also at 52, it’s close to that other stage in life, the one in which you no longer care what others say about you. We’re talking things like: “It’s overly rigid and doesn’t adapt to change very well.” “Queries aren’t fast enough to support my application’s needs.” “They’re prone to contention.”

Happy 52nd birthday, RDBMS

Let’s put things in perspective. The relational database was invented as close to World War I as it was to 2022. In fact, most developers using relational databases today weren’t even born when Edgar F. Codd, an English computer scientist working for IBM, published his paper “A Relational Model of Data for Large Shared Data Banks” in June 1970. At a time when computer calculations cost hundreds of dollars, Codd’s radical model offered a highly efficient method for expressing queries and extracting information. Once Codd’s relational model was implemented, it allowed unprecedented flexibility to work with data sets in new ways. His innovation laid a foundation for database theory that would dominate the next 40 years, culminating in today’s multi-billion-dollar database market. As a service to the developer community — and to commemorate 52 years as the workhorse database — we present other events in 1970, the year the relational database was born.

Turning over a new leaf

Relational databases were a huge leap forward when they were first conceived. Although many use cases are still a good fit for relational databases, modern apps consist of smaller, modular microservices, each with unique query patterns, data modeling requirements, and scale requirements. Being able to model data according to the exact query patterns of an app is a huge benefit, which is where MongoDB Atlas comes in.
MongoDB Atlas stores data in documents using a binary form of JavaScript Object Notation (JSON). Documents provide an intuitive and natural way to model data that is closely aligned with the objects developers work with in code. Rather than spreading a record across multiple columns and tables, each record is stored in a single, hierarchical document. This model accelerates developer productivity, simplifies data access, and, in many cases, eliminates the need for expensive join operations and complex abstraction layers. MongoDB offers online courses for users with RDBMS and SQL knowledge to learn how to map relational databases to MongoDB. You can also set up a cluster and try MongoDB Atlas free.
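The single-document idea can be illustrated with a sketch: where a relational schema might split an order across orders, order_items, and addresses tables, the document form keeps it in one record. The fields below are hypothetical:

```python
# A hypothetical order, modeled as one hierarchical document instead of
# rows spread across orders, order_items, and addresses tables.

order = {
    "_id": "ord-1001",
    "customer": {"name": "A. Rivera", "email": "a.rivera@example.com"},
    "shipping_address": {"street": "12 Main St", "city": "Springfield"},
    "items": [                      # embedded array: no join table needed
        {"sku": "KB-200", "qty": 1, "price": 59.99},
        {"sku": "MS-044", "qty": 2, "price": 19.99},
    ],
}

# The whole record arrives in one read; no joins to reassemble it.
total = sum(item["qty"] * item["price"] for item in order["items"])
print(f"Order {order['_id']} total: ${total:.2f}")
```

Reading the order back is a single lookup by `_id`, which is the source of the productivity and performance claims above: the shape stored is the shape the code uses.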

July 12, 2022

Migrating Terabytes of IoT Data from Azure Cosmos DB to MongoDB Atlas

In 2020, a large European energy company began an ambitious plan to replace its traditional metering devices — all 7.6 million of them — with smart meters. That would allow the energy company to monitor gas use remotely and allow customers’ bills to more accurately reflect their energy consumption. At the same time, the company began installing smart components along its production network to monitor operations in real time, manage alarms, use predictive maintenance tools, and find leaks using advanced technologies. The energy company knew this shift would result in a massive amount of data coming into its systems, and it thought it was ready. The company understood the complexities of managing and leveraging data from the Internet of Things (IoT), such as the high velocity at which data must be ingested and the need for time-based data aggregations. It rolled out an IoT platform with big data and analytics tools to help it make progress toward its objectives of high-quality, efficient, and safe service. This article looks at how the company migrated its system to MongoDB Atlas in order to handle the massive influx of data.

Managing data

The energy company was managing 3TB of data on Microsoft’s Azure Cosmos DB, with the remainder housed and managed in a relational database. However, it started facing challenges with Cosmos DB, including a lack of scalability, increasing costs, and poor performance. The costs to maintain the pre-production and production environments were also becoming unsustainable. And the situation wasn’t going to get better: By 2023, the energy company planned to increase the number of IoT devices and sensors by a factor of five, so it knew that Cosmos DB was not a viable solution for the long term.

Migrating to MongoDB Atlas

The energy company decided to migrate to MongoDB Atlas for several reasons.
Atlas’ online archive, combined with the ability to create time-series sharded collections, makes Atlas an ideal fit for IoT data, as does the flexibility of the document data model. Additionally, a Cosmos DB-compatible API would minimize the impact on application code and make it easier to migrate applications. The customer chose PeerIslands to be its technical partner and help it make the migration. PeerIslands, a MongoDB partner, is an enterprise-class digital transformation company with an expert, multilingual team with significant experience working across multiple technologies and cloud platforms. PeerIslands has developed solutions for both homogeneous and heterogeneous workload migrations. Among these solutions is a Cosmos-to-MongoDB tool that helps perform one-time migrations and change data capture while minimizing downtime. The tool is fully GUI-based, and tasks such as infrastructure provisioning, dump and restore, change stream listeners, and processors have all been automated. For change capture, the tool uses the native MongoDB change stream APIs.

Migration challenges

In working with the energy company to perform the migration, the PeerIslands team faced two particular challenges:

The large volume of data. Initial snapshotting of the data would take about one day.
The application had significant write loads. On average, it was writing about 12,000 messages per second. However, the load was unevenly distributed, with spikes when devices would “wake up” and report their status.

These two factors quickly generated close to 20 million change events in Cosmos DB that had to be synced to MongoDB. Meanwhile, new data was constantly being written into the Cosmos DB source.

Cosmos2Atlas tool

PeerIslands’ Cosmos2Atlas tool uses mongodump and mongorestore for one-time data migration and the MongoDB Kafka Connector for real-time data synchronization.
By using Apache Kafka, the Cosmos2Atlas tool was able to handle the large amount of change stream data and successfully manage the migration. To address the complexity of the migration, PeerIslands also enhanced the Cosmos2Atlas tool with additional capabilities:

Parallelize the Kafka change stream processing using partitions. The Kafka partitioning strategy was kept in sync with the target Atlas sharding strategy.
Use ReplaceOneBusinessKeyStrategy as the write model for the Kafka MongoDB sink connector to write into sharded Atlas collections.

By using its in-house Cosmos2Atlas tooling, PeerIslands was able to complete the migration with near-zero downtime.

Improved performance

With the migration complete, the customer has already begun to realize the benefits of MongoDB Atlas for its massive amounts of IoT data. The user interface has become extremely responsive, even for more expensive queries. Because of the improved performance of the database, the customer is now able to pursue improvements and efficiencies in other areas. With better performance, the company expects consumption of the data to rise and its schema design to evolve. The company is looking to leverage the time-series benefits of MongoDB both to simplify its schema design and to deliver richer IoT functionality. It is also better equipped to rapidly respond to and fulfill business needs, because the database is no longer a limitation. Importantly, costs have decreased for the production environment, and even more dramatic cost reductions have been seen for the pre-production environment. Learn more about the Cosmos2Atlas tool and MongoDB’s time series capabilities. Interested in your own data migration? Contact us.
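A sink connector configured with ReplaceOneBusinessKeyStrategy, as in the second enhancement above, might look roughly like the following. The connection URI, topic, database, and business-key fields are placeholders, and this is a sketch of the general configuration shape rather than PeerIslands’ actual setup:

```python
# A hypothetical MongoDB Kafka sink connector configuration using
# ReplaceOneBusinessKeyStrategy, expressed as the dict a deployment script
# might POST to the Kafka Connect REST API. All values are placeholders.

sink_config = {
    "name": "atlas-sink",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
        "connection.uri": "mongodb+srv://user:pass@cluster.example.net",
        "topics": "meter-readings",
        "database": "iot",
        "collection": "readings",
        # Upsert on a business key instead of _id, so writes land
        # correctly in a sharded Atlas collection.
        "writemodel.strategy":
            "com.mongodb.kafka.connect.sink.writemodel.strategy."
            "ReplaceOneBusinessKeyStrategy",
        "document.id.strategy":
            "com.mongodb.kafka.connect.sink.processor.id.strategy."
            "PartialValueStrategy",
        "document.id.strategy.partial.value.projection.type": "AllowList",
        "document.id.strategy.partial.value.projection.list": "meterId,ts",
    },
}

# A deployment script would typically POST this dict as JSON to the
# Kafka Connect REST API, e.g. http://connect-host:8083/connectors
print(sink_config["name"])
```

Upserting on a business key (here, the assumed meterId plus timestamp) rather than on `_id` is what lets replayed or duplicated change events land idempotently in the target collection.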

July 11, 2022

MACH Aligned for Retail: API-First

Retailers must constantly evolve to meet growing customer expectations and remain competitive. Both their internal- and external-facing applications must be developed using principles that promote agility and innovation, moving away from siloed architectures. As discussed in the first article of this series, the MACH Alliance promotes the development of modern applications through open tech ecosystems. MACH is an acronym for Microservices, API-first, Cloud-native SaaS, and Headless. MongoDB is a proud member of the Alliance, providing retailers with the tools to build highly flexible and scalable applications. This is the second in a series of blog posts focused on MACH and how retail organizations can leverage this framework to gain a competitive advantage. In this article, we’ll discuss concepts relating to the second letter of MACH: API-first. Read the first post in this series, "MACH Aligned for Retail: Microservices."

What is an API-first approach and why is it important?

An application programming interface (API) is a set of routines, protocols, and tools that allow applications, or services within a microservices architecture, to talk to each other. APIs can be seen as messengers that deliver requests and responses. Applications built around APIs are said to be API-first. With this approach, the design and development of APIs come before the software implementation. Typically, an interface is created first, which is used to host and develop the API; the development team then leverages that interface to build the rest of the application. This methodology gives developers access to specific functionalities of external applications, or of other microservices within the same application, depending on their needs. It promotes reusability because functionalities are interoperable with mobile and other client applications.
In addition, applications developed with an API layer in mind can adapt to new requirements more easily, because additional services and automation can be integrated into production as new requirements arise, keeping the application competitive for longer.

An API-first approach to developing applications

The role of API-first in retail

APIs play a crucial role in deeply interconnected systems that need to interface with other internal applications, third-party partners, and customers — all key areas when it comes to developing powerful retail applications. Think about how an e-commerce platform connects to the different systems making up the purchase process, such as inventory management, checkout, payment processing, shipping, and loyalty programs. The use of APIs is deeply interlinked with the concept of microservices. Software and data need to be decoupled to enable retailers to meet ever-increasing requirements, including omnichannel and cross-platform integration, seamless experiences across physical and online stores, and the ability to leverage real-time capabilities that enable differentiating features, such as live inventory updates and real-time analytics. APIs can be seen as a bridge for loosely coupled microservices to communicate with each other. Besides enabling a microservices architecture, an API-first approach offers the following additional benefits:

Avoid duplication of effort and accelerate time to market. Developers can work on multiple frontends at the same time, confident that functionalities can be integrated by embedding the same APIs once they are ready. Think of multiple development teams working on an e-commerce web application, a mobile portal, and an internal inventory management system all at the same time. An API enabling the placement of a new order can be seamlessly leveraged by the web and mobile applications and fed into the inventory management system to aid warehouse workers.
Bug-fixing and feature enhancements can happen simultaneously, avoiding duplication of effort and allowing new capabilities to be released to market more quickly.

Reduce risks and operating costs. An API-first approach enables system stability and interoperability from the beginning, because API efficiency is placed at the center of the development lifecycle rather than becoming an afterthought once the application or functionality has been developed. This approach reduces risk for retailers and saves the money and effort of troubleshooting unstable systems.

Enable new opportunities and scale faster. A flexible approach revolving around APIs provides more opportunities when it comes to integrating and refactoring the way different client applications and microservices communicate with each other, allowing retailers to improve and scale their IT offering in a fraction of the time. This approach also changes the way retailers can interact with external partners and do business with them, since those partners can be given the tools to easily integrate with the retailer’s offering.

Achieve language flexibility. Effective retailers need the capability to adapt their digital offering to different regions and languages. The plug-in capabilities of API-first allow developers to offer language-agnostic solutions that different microservices can integrate with, leveraging region-specific frontends.

Steps to an API-first application

What is the alternative?

The four MACH Alliance principles combined (Microservices, API-first, Cloud-native SaaS, Headless) act as a disrupting force compared to the way applications were built until recently. Adapting to a new technology paradigm requires effort and a different developer mindset. But what was there before? From an API-first perspective, it can be said that the opposite is code-first.
With this approach, application development starts in the integrated development environment (IDE), where code is written and the software takes shape. Development teams know that they will need to build an interface to interact with each function of the code, but it is seldom a priority; developing core functionalities takes precedence over the interface where those functionalities will be hosted and accessed. By the time the interface is developed, the code has already been defined. This means the API is developed around existing code rather than vice versa, which poses limitations. For example, developers might not be able to return data the way they want because of the underlying data schema.

The code-first approach

Bottlenecks can also occur, as other teams requiring the API will need to wait until the code is finalized before they can embed it in their own applications. Any delays in the software development lifecycle will hold them up and delay progress. Although a code-first approach might have worked in the past, it is no longer suitable for dealing with highly interconnected applications. Learn more about how MongoDB and MACH are changing the game for e-commerce.

How MongoDB helps achieve an API-first approach

Simply lifting and shifting monolithic applications to a microservices, API-first architecture will provide only minimal benefits if they are still supported by a relational data layer. This is where most of the bottlenecks occur. Changes to application functionalities will require constant refactoring of the database schemas, object-relational mapping (ORM), and refinement at the microservice level. Moving to a modern MACH architecture requires a modern data platform that removes data silos.
The MongoDB developer data platform provides a flexible data model, along with automation and scalability features, to adapt to even the most challenging retail use cases and to multiple platforms (e.g., on-premises, cloud, mobile, and web applications). MongoDB Atlas, MongoDB’s fully managed cloud database, also provides capabilities to manage the data layer end to end via APIs, such as the MongoDB Atlas Data API. This is a REST-like, resilient API for accessing all Atlas data that enables CRUD operations and aggregations with instantly generated endpoints. It is a perfect answer to an API-first approach, since developers can access their data using the same principles leveraged to connect to other applications and services.

The MongoDB Atlas Data API workflow

MongoDB’s Atlas Data API provides several other benefits, allowing developers to:

Build faster with developer-friendly data access. Developers work with a familiar, REST-like query and response format; no client-side drivers are necessary.
Scale confidently with a resilient, fully managed API that reduces the operational complexity needed to start reading and writing your data.
Integrate your MongoDB Atlas data seamlessly into any part of your stack — from microservices to analytics workloads.

This article has provided only a sample of what can be leveraged via MongoDB’s APIs. The MongoDB Query API provides a comprehensive set of features to seamlessly work with data in a native, familiar way. It supports multiple index types, geospatial data, materialized views, full-text search, and much more. In the next part of this MongoDB and MACH Alliance series, we will discuss how a cloud-native SaaS architecture can enable full application flexibility and scalability. Read the first post in this series, "MACH Aligned for Retail: Microservices."
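Because the Data API is REST-like, a request to it is just JSON over HTTPS. A minimal sketch of a findOne call follows; the app ID, cluster name, API key, and document fields are placeholders, and the request shape shown is a hedged illustration of the API's conventions rather than a complete client:

```python
# A sketch of calling the Atlas Data API's findOne action over plain HTTPS.
# The app ID, cluster name, API key, and document fields are placeholders.

import json

BASE_URL = "https://data.mongodb-api.com/app/<app-id>/endpoint/data/v1"

def build_find_one(database: str, collection: str, filter_: dict) -> dict:
    """Assemble the JSON body for a findOne action."""
    return {
        "dataSource": "Cluster0",   # placeholder Atlas cluster name
        "database": database,
        "collection": collection,
        "filter": filter_,
    }

body = build_find_one("ecommerce", "orders", {"orderId": "ord-1001"})

# With real credentials, the request would be sent as, e.g.:
#   urllib.request.Request(
#       f"{BASE_URL}/action/findOne",
#       data=json.dumps(body).encode(),
#       headers={"Content-Type": "application/json", "api-key": "<key>"},
#   )
print(json.dumps(body))
```

Note that nothing here is MongoDB-specific on the client side: any service that can issue an HTTPS POST can read and write Atlas data, which is exactly the API-first property the article describes.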

June 24, 2022

MongoDB and Clarity Business Solutions: Enabling Modernization for Public Sector Clients

Cloud-based transformation is now a must-have for federal agencies. In partnership with Clarity Business Solutions, MongoDB is making that transformation easier for government agencies, particularly those that work in closed, air-gapped environments and require security clearances for their support staff. In this article, we'll look at specific ways MongoDB and Clarity Business Solutions are working together to support public sector clients.

Cloud challenges

IT teams within government agencies want many of the same cloud benefits their private sector colleagues enjoy: better performance, the ability to outsource infrastructure management, and a path to building more resilient applications. Government leaders at the national level are also pushing for more cloud adoption. A May 2021 executive order from President Joe Biden called on all federal agencies to "accelerate movement to secure cloud services." Speaking about the mission to support U.S. troops, Danielle Metz, the U.S. Department of Defense's Deputy CIO for Information Enterprise, said, "It all comes down to harnessing the power of cloud compute and then being able to natively build applications continuously and often in that space."

For government agencies, security is an overriding concern. However, some of the precautions that enable the highest levels of security also make it harder to keep applications up to date, modernize them, and move them to the cloud. Government agencies often work in closed networks without access to the internet, and the need for security clearances makes it difficult for agencies to take advantage of support from software companies, consultants, and other members of the technology ecosystem. Even personnel with appropriate security clearances aren't always allowed on-site to assist their government clients.
These extra layers of security can also make it difficult for MongoDB to support public sector clients that require top secret clearances. Now, we're pleased to announce that our ability to serve these clients has been enhanced through our partnership with Clarity Business Solutions, a software and systems engineering company focused on data analytics, processing, and data flow. Clarity specializes in working with the federal government and understands the constraints under which government agencies operate, as well as the requisition procedures and security protocols unique to the federal government. The company has experience working in closed, air-gapped environments, and all but two of Clarity's employees hold security clearances.

Joint solutions

In the first phase of our partnership, MongoDB and Clarity are jointly offering three solutions to better support our public sector clients:

Application modernization
Trusted Tier Support
Rapid Start

Let's look at each of these solutions in turn.

Application modernization

Together, Clarity and MongoDB offer public sector clients a proven, iterative approach to application modernization. Clarity's security clearances allow its engineers to sit side by side with public sector clients when necessary, and Clarity has deep experience with the environments common to government agencies. Clarity and MongoDB apply a strategic process for analyzing legacy applications and modernizing and migrating them iteratively, rather than trying to update an entire system in a "big bang" approach. This iterative approach allows clients to modernize without downtime; legacy and modernized systems can run in parallel for a period of time, enabling troubleshooting and increasing confidence. Clarity and MongoDB combine the power of the MongoDB application data platform with Clarity's extensive client domain knowledge.
This partnership allows teams to focus on application feature development and quickly get the data platform operational. Public sector clients often operate systems with significant accumulated technical debt. Clarity and MongoDB are partnering to help clients increase efficiency, improve performance and scalability, and optimize maintenance as each monolithic application is modernized, rather than waiting years for an entire system to be replaced.

Trusted Tier Support

MongoDB and Clarity Business Solutions are offering a concierge support service specifically for the public sector. Trusted Tier Support engages U.S.-only technical staff, with appropriate clearances, to provide phone, online, or even on-site support for MongoDB government customers. It provides continuity between call-in support and support offered by individuals with on-site clearance. Clarity Trusted Tier Support engineers are tightly integrated with the MongoDB support team and can draw on the expertise of the broader MongoDB engineering organization while ensuring that all necessary details remain confidential. Service-level agreements are twice as fast as MongoDB's published response times for commercial support.

Rapid Start

This new service helps public sector clients get operational with MongoDB as quickly and efficiently as possible. This intense, short engagement ensures the following:

Network layouts are optimized and secured, using appropriate firewall rules and TLS to encrypt all data in transit.

Data storage is set up to meet applications' needs.

Backup and recovery are properly enabled.

Agencies have the proper guidance to achieve environment security requirements. For example, data at rest is encrypted according to client requirements through disk encryption and/or MongoDB's encrypted storage engine, and data in use can be protected with client-side field-level encryption.
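As a small sketch of the in-transit piece of that checklist, a MongoDB connection string can require TLS explicitly through its URI options. The host, port, and database names below are placeholders, and no connection is actually made; this only shows how such a URI might be assembled.

```python
from urllib.parse import urlencode

# Placeholder host and database for illustration; no connection is made.
host = "govcluster.example.internal"

options = {
    "tls": "true",           # encrypt all client/server traffic in transit
    "retryWrites": "true",   # retry transient write failures automatically
    "w": "majority",         # require durable, majority-acknowledged writes
}

uri = f"mongodb://{host}:27017/agencydb?{urlencode(options)}"
print(uri)
```

Encryption at rest (disk or storage-engine level) and client-side field-level encryption are configured separately, on the server and in the driver respectively, so a hardened deployment layers all three.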
Additionally, Clarity engineers can consult on schema design, leveraging key MongoDB features, and provide staff training on best practices for using MongoDB. We believe these three new offerings will significantly ease the way for our government clients, enabling them to make the best use of MongoDB and cloud technologies and to better serve their end customer: all of us.

Learn more about Clarity Business Solutions.

For more information, check out our Solutions Briefs:

Trusted Tier Support for Secure Environments
RapidStart: The Power of MongoDB at the Speed of Relevance
Rapid, Predictable Modernization for Public Sector Clients

June 23, 2022

How Telcos Are Transforming to Digital Services Providers

The telecommunications industry is in the midst of a digital revolution, shifting from a traditional service delivery model to one that is increasingly customer-centric and extends beyond traditional connectivity to include diverse digital services. Telcos undergoing this modernization journey put digital services first, offering apps, streaming services, retail platforms, peer-to-peer payment platforms, and more. As telcos adopt the complex 5G, IoT, and AI technologies powering personalized, real-time user experiences, pressure is increasing on aging networks and business support system (BSS) infrastructure.

MongoDB customers like TIM and Telefónica are using the MongoDB Atlas developer data platform to deliver a robust, platform-focused experience that complements existing technologies. Through an integrated modernization approach, telcos are improving both customer and developer experiences and building innovative new applications.

In a recent roundtable discussion, Boris Bialek, MongoDB global head of industry and solutions, sat down with telco IT leaders Paolo Bazzica, head of digital solutions at Italy's TIM, and Carlos Carazo, global CTO of Spain's Telefónica Tech IoT and Big Data division. This article provides an overview of the discussion and insights into how platform thinking is invigorating telco IT teams.

From communications services providers to digital services providers

The shifting value chain in telecommunications. Source: Kearney

The shift from traditional communications services to a comprehensive digital services suite requires global telecommunications companies to rethink their monetization strategies. Even before the pandemic, an evolution was well underway for telecommunications providers: from 2010 to 2020, overall revenue from connectivity services grew by only 2%, according to research compiled by Kearney.
During the same period, digital services revenue experienced a five-fold increase. Although telecommunications providers successfully sparked a revolution that grew into a $6.3 trillion digital economy, only those capitalizing on digital services reaped the benefits. In 2020, digital services like e-commerce and online advertising surged, capturing nearly 80% of the industry's growth.

Leveraging platform thinking

As network operators evolve into digital services providers, the idea of platform thinking is rippling across the industry. Network connectivity was tested by the hardships of the March 2020 COVID-19 lockdown in Italy, but TIM's digital platform project Fly Together, initiated in 2018, helped bridge the divide.

"People went from their normal lives to a full lockdown in one day. People realized that telco was a key point, because you need to stay at home, but you still need to communicate to work and go to school," said Bazzica in the virtual roundtable discussion hosted by MongoDB. "Our digital platform was the way to refill or top up your account, and access ebooks and so on, so I think it's more than just an evolution for the business; it's a different positioning."

Today, customer trust is a key differentiator and an essential focus for TIM. People rely on TIM's services to keep the country going, and TIM continues to modernize its customers' digital experiences through the Fly Together platform. "From my perspective, this is definitely a trend, and I think it's the evolutionary stalwart of the digital life of the people to be relevant and continue to be their trusted partner," Bazzica said.

A similar dynamic led to the creation of Telefónica Tech, a division of Spain's Telefónica SA, two years ago, according to Carazo. The new business is split into two units: one dedicated to cloud and cybersecurity solutions, and the other to IoT and big data digital services, the services customers need to pursue their own digital transformations.
"We are strongly convinced that connectivity is the basis for any new digital economy, so we are really proud to offer connectivity for these customers," Carazo said.

At the center of Telefónica Tech's transformation is its Kite Platform, run on MongoDB: a managed connectivity platform serving close to 30 million IoT devices all over the world. The platform goes beyond IoT connectivity, providing multidimensional benefits across all IoT environments, from devices to the products connecting to the cloud. It is the foundational component of Telefónica Tech's portfolio, delivering new business use cases across industries.

Modernizing applications and evolving to microservices and APIs

How can a telco simplify this complex journey to modernization? For TIM, the change was driven by a desire to modernize 700 different applications before effectively going into the digital business. TIM launched Fly Together to build a digital layer that provides the scalability and low latency needed to transform customers' digital service experiences. Previously, a customer request could query up to 14 systems, depending on which apps were open. Without a digital experience layer, you can't express an SLA or determine how long it takes to open an app, according to Bazzica.

The first task of Fly Together was to build the layer that decouples the backend systems from the model that runs TIM's digital channels. Through its work with MongoDB over the past four years, TIM launched a resilient platform that doesn't require exotic hardware to run efficiently. Because the platform was developed in a cloud-native environment, it comprises containerized microservices and RESTful APIs, setting a new standard for the company's application development.

"We are able to modernize, but gradually. We still have our mainframe running," Bazzica said. "The real experience is seeing the company learning and experimenting.
That's another value with this type of technology; we can try a lot of different things with minimum effort and make big discoveries."

Four digital services trends to watch

IoT is driving many exciting use cases for Telefónica Tech's new business division. Within the B2B sector, there is healthy growth across four key industry use cases, according to Telefónica's Carazo.

Connected industry and IoT: Telefónica starts by providing private network solutions. These technologies are expected to evolve into more complex use cases, such as robotics and predictive maintenance in small and medium factories, within the next five years.

Smart metering: Massive growth is expected in smart metering, which uses electronic devices to measure energy consumption. This trend could spur demand for millions of connected devices.

Connected cars: This sector is expected to grow significantly over the next five to 10 years as operators deploy new digital services like infotainment, security, and safety applications.

Smart cities: Cities around the world are seeking services for digital citizens looking to live in more sustainable and flexible communities.

These use cases are critical to building modern cities, societies, and industries. Platform thinking and an integrated approach to modernization will help telcos create modern applications, extending their businesses beyond conventional services to novel digital services.

Watch our webinar to learn more about TIM and Telefónica's transformation to digital services providers.

June 22, 2022

Ready to Get Started with MongoDB Atlas?

Start Free