Unifying Identity to Drive Customer Experience at a Leading Telco
As telecommunications companies around the world diversify product portfolios, adhere to new regulations, and execute mergers and acquisitions to excel in a mature industry, customers’ expectations for flawless service, speed, and availability are only growing. Together, these challenges put pressure on companies’ applications and tech stacks. Increasingly, telecommunications leaders are turning to open digital architectures to modernize the legacy enterprise architectures that can’t keep up with today’s customer demands. Throughout the industry, TM Forum’s Open APIs are becoming an integral part of digital transformations. These open APIs make it easier for telecommunications companies to enable seamless connectivity, interoperability, and portability across a complex ecosystem of services, in a consistent way across the industry. MongoDB’s customer, one of the largest telecommunications companies in the world, is part of this trend. Read on to learn how this telecommunications giant collaborated with MongoDB Professional Services to modernize and implement TM Forum Open APIs, unlocking data to provide a great customer experience.

The challenge: Delivering a simple, consistent customer experience in the telecommunications industry

With founding roots dating back more than 150 years, MongoDB’s customer has a long history that has produced a large number of subsidiaries covering a variety of services for end consumers, corporate clients, and governments. The company’s surging customer base began to outgrow its data and systems architecture. It became difficult to identify customers who held multiple products across the rapidly expanding portfolio of products and services, especially since many customers arrived through business acquisitions. As the customer base grew, providing a positive customer experience became harder, and marketing and cross-sell opportunities were missed.
To improve customer interactions, the telecommunications enterprise envisioned a hub that unified customer identity across all services, products, and partners, with MongoDB at its heart. At a high level, this would be a data layer that accesses customer information in accordance with TM Forum specifications, creating a consistent single view of the customer. The company also aimed to decouple access to customer information from the underlying legacy systems, empowering internal teams to drive their own transformation projects. The end goals:

- Deliver good customer experiences for accessing, purchasing, and managing accounts across the company’s existing services portfolio.
- Make it easier for future services and acquisitions to be seamlessly integrated.

The solution: A profile hub using TM Forum Open APIs

To build this hub, the telecommunications company turned to MongoDB Professional Services, which provided a Jumpstart team in partnership with gravity9. Think of this combination as a complete application development team in a box, ready to bring the solution to life. This single view of identity, called Profile Hub, would put the company’s customer profile at the core of its data concepts, flipping the previous legacy data model on its head. Going forward, everything would start with the customer profile and move into products and services from there, instead of the other way around. Profile Hub is an implementation of several Open APIs established by TM Forum, a global industry association for service providers and their suppliers in the telecommunications industry. The APIs we implemented form the basis for representing a customer, their role, and a set of permissions on that role. The API microservice applications were built using Java Spring Boot and are powered by MongoDB Atlas running in AWS.
MongoDB change streams and Kafka were used to create an event-driven architecture, and a behavior-driven testing approach was used to run more than 1,000 automated tests that form a “living specification” for the developed code.

Figure 1: Functional structure of each TM Forum API microservice application

Each project contains:

- Implementation of REST APIs (CRUD)
- Event notification upon each successful API operation
- Integration with Amazon SNS and Kafka

Each TM Forum API application is a separate microservice that implements the appropriate TM Forum specification, conforming to the REST API Design Guidelines (TMF630). Each one exposes the following operations:

- Retrieval of a single object
- Listing a collection of objects (supports limit, offset, sort, projection, and filtering by properties)
- Partial update of an existing object
- Creation of a new object
- Deletion of an existing object

There are two ways an object may be updated via the application:

- JSON Patch: performs an update as a series of operations that transform a source document. Documents can be filtered and searched using JSON Pointer or JSON Path.
- JSON Merge Patch: represents the change as a lightweight version of the source document.

Each TM Forum API application exposes REST interfaces to exchange data with the client. After receiving the payload, the application stores it in a MongoDB collection. After each successful API operation, MongoDB fires a change stream event; the application listens for these events and, after receiving one, sends an event to the consumer. Our microservice application supports fan-out messaging scenarios using AWS services: Amazon Simple Notification Service (SNS) and Amazon Simple Queue Service (SQS). In this scenario, messages are pushed to multiple subscribers, which eliminates the need to periodically check or poll for updates and enables parallel asynchronous processing of the message by the subscribers.
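Of the two partial-update flavours described above, JSON Merge Patch (RFC 7386) is simple enough to sketch in a few lines. The following Java snippet is an illustrative implementation of the merge-patch semantics over plain maps; it is not the Profile Hub code itself, and the class name and sample fields are invented.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of JSON Merge Patch (RFC 7386) semantics over plain Maps.
public class MergePatch {

    @SuppressWarnings("unchecked")
    public static Object apply(Object target, Object patch) {
        if (!(patch instanceof Map)) {
            // A non-object patch replaces the target value outright.
            return patch;
        }
        Map<String, Object> result = (target instanceof Map)
                ? new HashMap<>((Map<String, Object>) target)
                : new HashMap<>();
        for (Map.Entry<String, Object> e : ((Map<String, Object>) patch).entrySet()) {
            if (e.getValue() == null) {
                // A null in the patch deletes that field from the target.
                result.remove(e.getKey());
            } else {
                // Otherwise, recursively merge the field.
                result.put(e.getKey(), apply(result.get(e.getKey()), e.getValue()));
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> customer = new HashMap<>();
        customer.put("givenName", "Ada");
        customer.put("status", "prospect");

        Map<String, Object> patch = new HashMap<>();
        patch.put("status", "active");   // update one field
        patch.put("givenName", null);    // delete another

        System.out.println(apply(customer, patch)); // {status=active}
    }
}
```

The appeal of this style for a customer-profile API is that the client sends only the fields it wants changed, in the same shape as the stored document.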
Application configuration parameters determine where a given message should be routed. Each event sent to an external system is also stored in an audit collection; if an event does not reach its destination, we can replay the event sequence to restore the desired state. Another library provides operations logging functionality. It can trace each request and response sent to the application, push it to a Kafka topic, and then, through the MongoDB connector, store it in a MongoDB collection. This operations logging application can easily be integrated with every TM Forum API microservice application.

For security, we encrypt data on the client side before it is sent over the network, using MongoDB’s built-in Client-Side Field Level Encryption. Paired with this, we use a couple of AWS services: first, AWS Key Management Service (KMS), which gives centralized control over the cryptographic keys used to protect the data; second, AWS Secrets Manager, a secure and convenient storage system for API keys, passwords, certificates, and other sensitive data. Stripped-down change streams also help us limit the information we send, rather than sending the whole payload as the change stream body.

Every TM Forum API is different, as each has a different domain model to work on. To test these unique applications, we use data-driven testing with the Spock framework. This lets us test the same behavior multiple times with different parameters and assertions, reaching upwards of 1,200 unit tests per application with only a few test cases implemented.

The results: A modern, customer-centric architecture

The Profile Hub APIs form the core of the company’s new standards-based customer data architecture. This supports a better customer experience strategy by allowing upstream applications to easily leverage customer data. The enterprise will be able to reuse these components to accelerate new use case implementations on MongoDB Atlas.
In addition, as the company modernizes its architecture, it will realize cost savings by moving off legacy infrastructure with rising maintenance and license costs. Developers also benefit, spending less time on maintenance and more time building and launching new applications and products. By working with MongoDB Professional Services and gravity9, the company was able to develop this solution in under 12 weeks, shaving several months off its original plans. New, fully compliant TM Forum APIs can now be delivered in a single sprint, allowing the telco to respond quickly to new business requirements. Looking forward, our client’s modern, customer-centric architecture will make it easier for them to navigate customer journeys and unlock revenue opportunities as they provide their customers with better, more connected products and experiences. Ready to use MongoDB’s blueprint for accelerating TM Forum Open API implementation? Reach out to our Professional Services team to get started!
Modernizing Banking Technology Architecture with MongoDB and gravity9
The banking industry has historically relied on legacy, on-premises systems to store critical financial data in a secure and resilient technology fabric. Today, as transactions happen in real time and customer demands soar, legacy systems fail to support accelerated modernization and restrict innovation. This shift has driven banking enterprises to consider transitioning to newer technologies that can plug better capabilities into business-critical systems while preparing them for tomorrow’s challenges. By leveraging MongoDB, gravity9 enables customers to migrate from legacy systems to more sophisticated technologies in a planned, incremental way. In its effort to modernize technologies that are slowly becoming obsolete, gravity9 recognizes and retains the prominence of legacy systems while migrating with minimal disruption to the status quo. This article describes how gravity9 performed a data offload from a core retail banking platform to MongoDB using a digital decoupling approach, as shown in Figure 1.

Figure 1: Technologies used – Kafka Streams, MongoDB Kafka connectors, Apache Avro, Spring WebFlux, Spring Data Reactive MongoDB repository.

The challenge

gravity9’s client encountered many performance issues with their core banking platform, Temenos T24, backed by Oracle. Their attempt to offload frequently accessed data from the primary system to the Oracle database failed to deliver the expected performance and scalability. The platform was being used for both analytical and transactional operations and could not serve both as envisioned; in some instances, this approach resulted in customer transaction rejections. Additionally, the client faced the following challenges:

- The offload platform proved difficult to manage and extend, with most of the business-process logic written in stored procedures.
- The data model within the core banking system was not inherently relational, which amplified the platform’s complexity: each record held only two columns, RECID for the ID or unique key of the record, and XMLRECORD to store the data.

The complexities of the old data source compelled the client to adopt a migration strategy and gradually move to MongoDB, which offered better scalability and a flexible document data model, while also migrating the corresponding business logic code. The client expected an improvement in overall performance, maintainability, and extensibility, and the potential to deliver better customer experiences.

The approach

gravity9 leveraged a digital decoupling approach to create message-driven microservices on top of the client’s data. The approach was oriented toward a continuous migration process executed in parts rather than in one fell swoop. Using this methodology, gravity9 could help the client move from the old offload system to MongoDB by building fully scalable microservices for each business domain. The biggest advantage of this approach was that the client’s legacy systems could continue as usual behind the scenes. Using a change data capture (CDC) pattern on the underlying Oracle database, data from the core banking system was gathered, and snapshots modified in the core legacy system were published in real time. An event was generated every time a modification was made in the legacy/core system to keep track of changed data. gravity9 built an application to listen for these changes, transform the raw XML format into a structured message, and refine it as necessary before broadcasting it as a message to other consumers, who could process it or store it in the new MongoDB datastore. Specific dictionaries were developed for this purpose, with clear directions about field markers and object structure for the transformed objects.
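To make the transformation step concrete, here is a rough Java sketch of turning a RECID/XMLRECORD pair into a structured document ready for MongoDB, using only the JDK’s built-in XML parser. The record layout and field names are invented for illustration; the real dictionaries described above drive a far richer mapping.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

// Illustrative sketch: flattening an XMLRECORD payload into a map keyed by
// RECID, the shape a MongoDB document would take after the offload.
public class XmlRecordTransformer {

    public static Map<String, Object> transform(String recId, String xmlRecord) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xmlRecord.getBytes(StandardCharsets.UTF_8)));
            Map<String, Object> result = new HashMap<>();
            result.put("_id", recId); // RECID becomes the document key
            NodeList children = doc.getDocumentElement().getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                Node n = children.item(i);
                if (n instanceof Element el) {
                    // Each child element becomes a top-level document field.
                    result.put(el.getTagName(), el.getTextContent());
                }
            }
            return result;
        } catch (Exception e) {
            throw new IllegalArgumentException("unparseable XMLRECORD", e);
        }
    }

    public static void main(String[] args) {
        String xml = "<record><accountNo>1001</accountNo><currency>GBP</currency></record>";
        System.out.println(transform("REC-1", xml));
    }
}
```

In the actual pipeline, a message like this would be published as an event (via Kafka) rather than written directly, so any interested consumer can store or process it independently.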
The outcome

Although the execution was intended to offload only part of the data stored on the client’s old system, the approach helped build capabilities to support future migrations of larger volumes. The use of MongoDB Atlas for the implementation delivered efficiency and security in data offloading and met the desired level of performance. Consequently, it resolved scalability issues that would otherwise have occurred with on-premises systems. While the migrated use cases run on the MongoDB-based offload, the implementation left the remaining use cases untouched, allowing them to continue working on the older systems without disruption.

Seeking to modernize your banking technology environment? Speak to us today to learn how gravity9 and MongoDB can help you make the most of new technologies and secure your performance-critical financial data.
Application Modernization with gravity9 and MongoDB Atlas: How Digital Decoupling Supports the Customer Offering
The goal of most organizations is pretty clear: to improve customer offerings and become more operationally efficient, streamlined, and profitable. But is it possible for organizations to excel in an agile fashion when they are reliant upon legacy systems? It’s the age-old dilemma between risk and innovation: how can you mitigate the former while accelerating the latter? Nearly all organizations operate with some type of legacy system that is often central to the operation of the business or the customer offering, i.e., one that would be highly costly and disruptive to move away from. To solve this predicament, digital decoupling enables organizations to detach incrementally from legacy systems while acknowledging the critical role they often play. In this blog, we’ll further explore the value of digital decoupling and introduce how gravity9 with MongoDB Atlas delivers the smoothest transition possible.

Why Not Simply Upgrade?

Digital decoupling is not a “big-bang” upgrade where one system is fully replaced by another overnight; rather, it allows for the continued existence of your legacy system as part of your digital architecture while simultaneously unlocking innovation. But why not simply upgrade? Isn’t “out with the old and in with the new” the faster route? Not always. When a big-bang upgrade focuses on replacing a legacy system that is central to a customer offering or business operations, it becomes a much more complex, risky, and time-intensive undertaking. Often, many months or years pass before any value is delivered to customers. And while your organization focuses time and effort on a long-term, large-scale system replacement, your customers’ needs will be changing and your competitors will keep innovating. Once your new system is finally ready to be deployed to the marketplace, there’s a good chance it may already be obsolete.
Digital decoupling offers a faster, less risky, and more flexible alternative. The legacy system is maintained as the core of your business, but strategic portions are exposed through modern microservices to allow the rapid creation of new digital products and offerings. The organization can use new, modern technologies to maintain the functionality of the legacy system while building a more advanced digital architecture around it. By maintaining the existing legacy system, the organization significantly reduces disruption and risk while unlocking the ability to innovate new products on a rapid timescale.

How it Works

Applying a digital decoupling approach makes it possible to quickly innovate new digital products and services on top of your data by way of microservices and an event-driven architecture.

Microservice architecture after digital decoupling

By utilizing event-driven architecture, individual systems and capabilities can be built as fully scalable microservices, each with its own database. Solutions can be built around each microservice and combined to provide limitless additional capabilities and services for customers in a rapid and agile fashion. Digital decoupling creates a customer experience delivered via a modern, feature-rich UI or website that is intuitive, user friendly, and continuously evolving, while the legacy system still operates behind the scenes. After years of working with large organizations, the solutions architects at gravity9 have a deep understanding of event-driven architecture as a solution for digital decoupling.

“Our adherence to domain-driven design is in our DNA; it is how we build solutions and is core to the way we work. We build event-driven microservices on top of monolithic legacy architecture.” – Noel Ady, gravity9 Founding Partner
By utilizing domain-driven design, system actions are communicated or triggered by way of an event, with standardized messages sent between the legacy application and the new architecture via a bus. An adaptor is created to sit in front of the legacy system and speak to your “new IT” in the language of events. This adaptor watches the data in your legacy system and raises events when changes occur, then optionally writes back changes raised by other systems, allowing your legacy system to participate in the event-driven architecture. The use of APIs ensures the traffic is two-way and non-intrusive to the legacy application, so it can continue to operate as expected.

One of the key technology concerns for legacy-system adaptors is the concept of a “delta store.” Events in an event-driven architecture should contain the context for the event, often including the previous value, to help receiving systems respond properly. In more modern systems it’s possible to get this data from webhooks or similar mechanisms, but these won’t exist in older legacy systems, so a different approach via a delta store is needed. A delta store contains the history of changes to a value (the “deltas”), allowing the adaptor to properly construct the event context and ensuring that events are only raised for true changes in values.

Why MongoDB?

MongoDB’s flexible data schema makes it an excellent implementation technology for a delta store, allowing a dynamic mechanism that can flex to new data and event types on demand. gravity9 partners with MongoDB Atlas, MongoDB’s multi-cloud, secure, and flexible database service, as an integral technology enabler of digital decoupling, increasing the flexibility of the resulting architecture. Importantly, Atlas also enhances reliability for mission-critical production databases with continuous backups and point-in-time recovery.
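The delta-store behaviour described above can be sketched with an in-memory map, just as a thought experiment; in the real architecture the deltas would be persisted in a MongoDB collection, and the class and event names below are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Minimal sketch of a delta store: keep the last known value per key so the
// adaptor can (a) suppress events when nothing really changed and (b) attach
// the previous value as event context for downstream consumers.
public class DeltaStore {

    public record ChangeEvent(String key, Object previousValue, Object newValue) {}

    private final Map<String, Object> lastKnown = new HashMap<>();

    // Returns an event only when the observed value is a true change.
    public Optional<ChangeEvent> observe(String key, Object newValue) {
        Object previous = lastKnown.get(key);
        if (newValue == null ? previous == null : newValue.equals(previous)) {
            return Optional.empty(); // no real change: raise nothing
        }
        lastKnown.put(key, newValue);
        return Optional.of(new ChangeEvent(key, previous, newValue));
    }

    public static void main(String[] args) {
        DeltaStore store = new DeltaStore();
        // First sighting: an event is raised (previous value is null).
        System.out.println(store.observe("acct-1.balance", 100));
        // Same value again: suppressed, no event.
        System.out.println(store.observe("acct-1.balance", 100));
        // True change: event carries the previous value as context.
        System.out.println(store.observe("acct-1.balance", 250));
    }
}
```

Note how the second observation of the same value yields no event, while the third carries the previous value as context: the two properties the delta store exists to provide.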
Atlas is secure for sensitive data and automates key processes like infrastructure provisioning, setup, and deployment, so teams can access the database resources they need, when they need them. Best of all, MongoDB’s features and benefits free up developer time so developers can focus their talent on more innovative tasks.

What Should I do Next?

In digital decoupling, the logical model is just as important as the physical one when it comes to modelling your events. Utilizing best-practice domain-driven design alongside a proven approach is the key to success. Together, gravity9 and MongoDB have replicated this success time and time again, enabling organizations to lay the foundations for a newer, more modern architecture without the disruption of removing their legacy systems. Interested in learning more about MongoDB’s Modernization Program? Contact us today!