Joerg Schmuecker


Transforming Industries with MongoDB and AI: Financial Services

This is the fourth in a six-part series focusing on critical AI use cases across several industries. The series covers the manufacturing and motion, financial services, retail, telecommunications and media, insurance, and healthcare industries.

In the dynamic world of financial services, the partnership between artificial intelligence (AI) and banking services is reshaping traditional practices, offering innovative solutions across critical functions.

Relationship management support with chatbots

One key service that relationship managers provide to their private banking customers is aggregating and condensing information. Because banks typically operate on fragmented infrastructure with information spread across different departments, solutions, and applications, this can require detailed knowledge of that infrastructure and of how to source information such as:

- When are the next coupon dates for bonds in the portfolio?
- What has been the cost of transactions for a given portfolio?
- What would be a summary of our latest research?
- Please generate a summary of my conversation with the client.

Until now, these activities have been highly manual and exploratory. For example, a relationship manager (RM) looking for the next coupon dates would likely have to go into each of the client's individual positions and manually look up the coupon dates. If this is a frequent enough activity, the RM could raise a request for change with the product manager of the portfolio management software to add a standardized report. But even if such a report existed, the RM might struggle to find it quickly. Overall, the process is time-consuming. Generative AI systems can facilitate such tasks.
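As a sketch of the retrieval step such a generative AI assistant might use, the helper below builds a MongoDB Atlas `$vectorSearch` aggregation pipeline over a hypothetical collection of client documents. The index name, field names, and collection are illustrative assumptions, not details from the article; with a live cluster the pipeline would be passed to `collection.aggregate(...)` and the returned chunks fed into the LLM prompt.

```python
def build_rag_retrieval_pipeline(query_vector, num_candidates=100, limit=5):
    """Build an Atlas Vector Search pipeline that retrieves the document
    chunks most similar to the embedded user question.

    Index name, path, and projected fields are illustrative assumptions.
    """
    return [
        {
            "$vectorSearch": {
                "index": "client_docs_vector_index",  # hypothetical index name
                "path": "embedding",                  # field holding the chunk embedding
                "queryVector": query_vector,          # embedding of the RM's question
                "numCandidates": num_candidates,      # ANN candidate pool size
                "limit": limit,                       # chunks returned for the prompt
            }
        },
        # Keep only what the LLM prompt needs, plus the similarity score.
        {"$project": {"_id": 0, "text": 1, "source": 1,
                      "score": {"$meta": "vectorSearchScore"}}},
    ]


# Example: a (toy) query embedding; real embeddings have hundreds of dimensions.
pipeline = build_rag_retrieval_pipeline([0.1, 0.2, 0.3])
```

Because the function only constructs the aggregation document, it can be unit-tested without a database connection.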
Even without specifically trained models, retrieval-augmented generation (RAG) can be used to have the AI generate the correct answers, provide the inquirer with a detailed explanation of how to get to the data, and, in some cases, directly execute the query against the system and report back the results. As with a human, it is critical that the algorithm has access not only to the primary business data (e.g., the customer's portfolio data) but also to user manuals and static data. Detailed customer data, in machine-readable format and as text documents, is used to personalize the output for the individual customer. In an interactive process, the RM can instruct the AI to add more information about specific topics, tweak the text, or make any other necessary changes. Ultimately, the RM acts as quality control for the AI's output, mitigating hallucinations and information gaps. As outlined above, the AI will not only need highly heterogeneous data, from highly structured portfolio information to text documents and system manuals, to provide a flexible natural language interface for the RMs; it will also need timely information about a customer's transactions, positions, and investment objectives. By providing transactional database capabilities as well as vector search, MongoDB's developer data platform makes it easy to build RAG-based applications.

Risk management and regulatory compliance

Risk and fraud prevention

Banks are tasked with safeguarding customer assets, detecting fraud, verifying customer identities, supporting sanctions regimes, and preventing various illegal activities such as money laundering (AML). The challenge is magnified by the sheer volume and complexity of regulations, making the integration of new rules into bank infrastructure costly, time-consuming, and often inadequate. For instance, when the EU's Fifth Anti-Money Laundering Directive was implemented, it broadened regulations to cover virtual currencies and prepaid cards.
Banks had to swiftly update their onboarding processes and software, train staff, and possibly update their customer interfaces to comply with these new requirements. AI offers a transformative approach to fraud detection and risk management by automating the interpretation of regulations, supporting data cleansing, and enhancing the efficacy of surveillance systems. Unlike static, rules-based frameworks that may miss or misidentify fraud due to narrow scope or limited data, AI can adaptively learn and analyze vast datasets to identify suspicious activities more accurately. Machine learning, in particular, has shown promise in trade surveillance, offering a more dynamic and comprehensive approach to fraud prevention.

Regulatory compliance and code change assistance

The regulatory landscape for banks has grown increasingly complex, demanding significant resources for the implementation of numerous regulations. Traditionally, adapting to new regulations has required the manual translation of legal text into code, provisioning of data, and thorough quality control, a process that is both costly and time-consuming and that often leads to incomplete or insufficient compliance. For instance, to comply with the Basel III international banking regulations, developers must undertake extensive coding changes to accommodate requirements laid out in thousands of pages of documentation. AI has the capacity to revolutionize compliance by automating the translation of regulatory texts into actionable data requirements and validating compliance through intelligent analysis. This approach is not without its challenges, as AI-based systems may produce non-deterministic outcomes and unexpected errors. However, the ability to rapidly adapt to new regulations and provide detailed records of compliance processes can significantly enhance regulatory adherence.
Financial document search and summarization

Financial institutions, encompassing both retail banks and capital market firms, handle a broad spectrum of documents critical to their operations. Retail banks focus on contracts, policies, credit memos, underwriting documents, and regulatory filings, which are pivotal for daily banking services. Capital market firms, on the other hand, delve into company filings, transcripts, reports, and intricate data sets to grasp global market dynamics and risk assessments. These documents often arrive in unstructured formats, presenting challenges in efficiently locating and synthesizing the necessary information. While retail banks aim to streamline customer and internal operations, capital market firms prioritize the rapid and effective analysis of diverse data to inform their investment strategies. Both allocate considerable time to searching for and condensing information from documents internally, resulting in reduced direct engagement with their clients.

Generative AI can streamline the process of finding and integrating information from documents by using natural language processing (NLP) and machine learning to understand and summarize content. This reduces the need for manual searches, allowing bank staff to access relevant information more quickly. MongoDB can store vast amounts of both live and historical data, regardless of its format, which is typically needed for AI applications. It offers Vector Search capabilities essential for retrieval-augmented generation (RAG). MongoDB supports transactions, ensuring data accuracy and consistency for AI model retraining with live data. It facilitates data access for both deterministic algorithms and AI-driven rules through a single interface. MongoDB also boasts a strong partnership ecosystem, including companies like Radiant AI and Mistral AI, to speed solution development.
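Before documents can be searched or summarized this way, they are typically split into overlapping chunks, embedded, and stored alongside their metadata. A minimal, dependency-free sketch of the chunking step follows; the chunk size and overlap are arbitrary illustrative choices, not recommendations from the article.

```python
def chunk_document(text, chunk_size=500, overlap=50):
    """Split a document into overlapping character chunks.

    The overlap preserves context that would otherwise be cut at chunk
    boundaries; sizes here are illustrative, not tuned values.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            # Each chunk keeps its offset so a summary can cite its source span.
            chunks.append({"text": chunk, "offset": start})
        if start + chunk_size >= len(text):
            break
    return chunks


chunks = chunk_document("x" * 1200, chunk_size=500, overlap=50)
# A 1200-character document yields chunks at offsets 0, 450, and 900.
```

In a RAG ingestion job, each chunk dictionary would additionally carry an embedding vector before being written to the document store.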
ESG analysis

Environmental, social, and governance (ESG) considerations can have a profound impact on organizations. For example, regulatory changes, especially in Europe, have compelled financial institutions to integrate ESG into investment and lending decisions. Regulations such as the EU Sustainable Finance Disclosure Regulation (SFDR) and the EU Taxonomy Regulation require financial institutions to consider environmental sustainability in their operations and investment products. Investors' demand for sustainable options has surged, leading to an increase in ESG-focused funds. These regulatory and commercial requirements, in turn, drive banks to improve their green lending practices. This shift is strategic for financial institutions, attracting clients, managing risks, and creating long-term value.

However, financial institutions face many challenges in improving their ESG analysis. The key challenges include defining and aligning standards and processes, and managing the flood of rapidly changing and varied data to be included for ESG analysis purposes. AI can help address these challenges in a manner that is not only automatic but also adaptive, via techniques like machine learning. Financial institutions and ESG solution providers have already leveraged AI to extract insights from corporate reports, social media, and environmental data, improving the accuracy and depth of ESG analysis. As the market demands a more sustainable and equitable society, predictive AI combined with generative AI can also help reduce bias in lending, creating fairer and more inclusive financing while improving predictive power. AI can thus facilitate the development of sophisticated sustainability models and strategies, marking a leap forward in integrating ESG into broader financial and corporate practices.
Credit scoring

The convergence of alternative data, artificial intelligence, and generative AI is reshaping the foundations of credit scoring, marking a pivotal moment in the financial industry. The challenges of traditional models are being overcome by adopting alternative credit scoring methods, offering a more inclusive and nuanced assessment. Generative AI, while introducing the potential challenge of hallucination, represents the forefront of innovation, not only revolutionizing technological capabilities but fundamentally redefining how credit is evaluated, fostering a new era of financial inclusivity, efficiency, and fairness. The use of artificial intelligence, in particular generative AI, as an alternative approach to credit scoring has emerged as a transformative force that addresses the challenges of traditional credit scoring methods for several reasons:

- Alternative data analysis: AI models can process a myriad of information, including alternative data such as utility payments and rental history, to create a more comprehensive assessment of an individual's creditworthiness.
- Adaptability: As economic conditions change and consumer behaviors evolve, AI-powered models can quickly adjust.
- Fraud detection: AI algorithms can detect fraudulent behavior by identifying anomalies and suspicious patterns in credit applications and transaction data.
- Predictive analytics: AI algorithms, particularly machine learning techniques, can be used to build predictive models that identify patterns and correlations in historical credit data.
- Behavioral analysis: AI algorithms can analyze behavioral data sets to understand financial habits and risk propensity.

By harnessing the power of artificial intelligence, lenders can make more informed lending decisions, expand access to credit, and better serve consumers (especially those with limited credit history).
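To make alternative-data scoring concrete, here is a toy logistic scoring function. The feature names and weights are invented for illustration only; a real model would learn its parameters from labeled repayment history rather than use hand-picked values.

```python
import math

# Illustrative weights over alternative-data features; a production model
# would learn these from historical repayment outcomes, not hard-code them.
WEIGHTS = {
    "on_time_utility_ratio": 2.5,  # share of utility bills paid on time (0..1)
    "rental_payment_ratio": 2.0,   # share of rent payments made on time (0..1)
    "income_stability": 1.5,       # 0..1 measure of income regularity
}
BIAS = -3.0  # baseline: with no positive signals, the score is low


def creditworthiness(features):
    """Return a probability-like score in (0, 1) via the logistic function.

    Missing features default to 0, so thin-file applicants are not
    rejected outright but start from the low baseline.
    """
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


good = creditworthiness({"on_time_utility_ratio": 1.0,
                         "rental_payment_ratio": 1.0,
                         "income_stability": 0.9})
thin = creditworthiness({})  # applicant with no data at all
```

Even this toy shows the mechanism the article describes: signals outside traditional bureau data (utilities, rent) move the score, which is how such models extend credit assessment to thin-file consumers.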
However, to mitigate potential biases and ensure consumer trust, it is crucial to ensure transparency, fairness, and regulatory compliance when deploying artificial intelligence in credit scoring.

AI in payments

A lack of developer capacity is one of the biggest challenges for banks when delivering payment product innovation. Banks believe the product enhancements they could not deliver in the past two years due to resource constraints would have supported a 5.3% growth in payments revenues. With this in mind, and given the revolutionary transformation that comes with the integration of AI, it is imperative to consider how to free up developer resources to make the most of these opportunities. There are several areas in which banks can apply AI to unlock new revenue streams and efficiency gains. The image below provides a high-level view of eight of the principal themes and areas. This is not an exhaustive view, but it does demonstrate the depth and breadth of current opportunities. In each area, there are already banks that have begun to bring AI-based services or enhancements to market, or are otherwise experimenting with the technology.

Learn more about AI use cases for top industries in our new ebook, How Leading Industries are Transforming with AI and MongoDB Atlas.

April 4, 2024

Every Operational Data Layer (ODL) Can Benefit From Search

In today's digital landscape, organizations frequently encounter the daunting challenge of managing complex data architectures. Multiple systems, diverse technologies, and a variety of programming languages become entwined, making smooth operations a significant struggle. A frequent example of this issue is seen in major banks still relying on a banking system built in the 1970s, continuing to run on a mainframe with minimal updates. The consequence is a complex architecture, as seen in Figure 1, where data is scattered across various systems, creating inefficiencies and hindering seamless operations. Offloading the data from one or more monolithic systems is a well-proven approach to increase agility and deliver new, innovative services to external and internal customers. In this blog, we will discuss how search can make operational data layers (ODLs), an architectural pattern that centrally integrates and organizes siloed enterprise data and makes it available to consuming applications, an even more powerful tool.

Figure 1: Complex data architecture

Operational Data Store (ODS) as a solution

To tackle the complexities of their existing data architecture, organizations have turned to operational data stores (ODS). An ODS serves as a secondary data store, holding data replicated from primary transactional systems, as seen in Figure 2. Organizations can feed their ODS with change data capture technologies.

Figure 2: Conceptual model of an Operational Data Layer

The evolutionary path of adoption

Implementing an ODS requires a thoughtful approach that aligns with the organization's digital transformation journey. Typically, the adoption path consists of several stages, as seen in Figure 3. Initially, organizations focus on extracting data from one system into their operational data store, allowing them to operate on a more unified dataset. Gradually, they can retire legacy systems and eliminate the need for intermediate data streams.
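The change-data-capture feed mentioned above can be thought of as a stream of insert, update, and delete events applied to the ODS. The in-memory sketch below illustrates such an apply loop; the event shape loosely follows MongoDB change stream documents, but the field values and the dictionary-backed store are illustrative assumptions.

```python
def apply_change_event(ods, event):
    """Apply one CDC event to an in-memory ODS keyed by document id.

    Event shape loosely follows MongoDB change stream documents:
    operationType, documentKey, and fullDocument / updateDescription.
    """
    key = event["documentKey"]["_id"]
    op = event["operationType"]
    if op == "insert":
        ods[key] = event["fullDocument"]
    elif op == "update":
        # Apply only the changed fields, as a change stream reports them.
        ods[key].update(event["updateDescription"]["updatedFields"])
    elif op == "delete":
        ods.pop(key, None)
    return ods


ods = {}
apply_change_event(ods, {"operationType": "insert",
                         "documentKey": {"_id": 1},
                         "fullDocument": {"_id": 1, "balance": 100}})
apply_change_event(ods, {"operationType": "update",
                         "documentKey": {"_id": 1},
                         "updateDescription": {"updatedFields": {"balance": 80}}})
```

In production, the ODS would of course be a database rather than a dictionary, and the events would arrive from a CDC tool or a MongoDB change stream rather than hand-built literals.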
The key benefit of this incremental approach is that it delivers value to the business at every step (e.g., offloading mainframe operations) by eliminating the need for a complete overhaul and minimizing disruption.

Figure 3: Evolution of a basic ODS into a system of record

Areas of application

An ODS can support the business in three different ways:

Data access layer: A data access layer allows organizations to free their data from the limitations imposed by data silos and technological variations. Organizations consolidate data from different sources that often use different data storage technologies and paradigms, creating a unified view that simplifies data access and analysis. This pattern is mainly used to enable modern APIs, speed up development of new customer services, and improve responsiveness and resiliency.

Operational data layer (ODL): The ODL is an internally focused layer that aids in complex processing workflows. It serves as a hub for orchestrating and managing data across various stages of processing. The ODL empowers organizations to enrich and improve data iteratively, resulting in more powerful and accurate insights. It provides a holistic view of data and process information, an improved customer experience, and reduced operational costs.

Developer ODL: Building a developer-focused ODL can provide significant advantages during the development cycle. By making data readily available to developers, organizations can accelerate the development process and gain a comprehensive understanding of their data structures. This, in turn, helps in identifying and addressing issues early on, leading to improved data models and better system performance. In a nutshell, this pattern helps reduce developer training time, streamlines development, and speeds up testing and test automation.

The power of search in ODS

So how can every ODL benefit from search capabilities, and how can MongoDB Atlas Search help? Atlas Search plays a crucial role in maximizing the value of an ODS.
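As a sketch of what an Atlas Search query can look like in practice, the helper below builds a `$search` aggregation pipeline with fuzzy text matching over a hypothetical transactions collection. The index name and field paths are illustrative assumptions; with a live cluster the pipeline would be passed to `collection.aggregate(...)`.

```python
def build_transaction_search(query, index="transactions_search", limit=20):
    """Build a $search pipeline that fuzzily matches free text against
    transaction reference and counterparty fields.

    Index name and field paths are illustrative assumptions.
    """
    return [
        {
            "$search": {
                "index": index,
                "text": {
                    "query": query,
                    "path": ["reference", "counterparty_name"],
                    "fuzzy": {"maxEdits": 1},  # tolerate a one-character typo
                },
            }
        },
        {"$limit": limit},
        # Return the fields an operations user would scan, plus relevance.
        {"$project": {"reference": 1, "counterparty_name": 1, "amount": 1,
                      "score": {"$meta": "searchScore"}}},
    ]


pipeline = build_transaction_search("ACME Gmbh")
```

The fuzzy option is what turns an exact-identifier hunt into the forgiving, human-style lookup the next section describes: a slightly misremembered counterparty name still returns candidate matches.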
When we have questions or are searching for an answer, our natural interaction with information is primarily through search. Humans are used to issuing imprecise queries; search systems excel at interpreting them and extracting relevant information from vast datasets. By incorporating search capabilities with Atlas Search into an ODS, organizations can empower their users to explore, analyze, and gain valuable insights from their data. Consider the example of a banking organization with a complex web of interconnected systems. Searching for specific transactions or identifying patterns becomes a daunting task, especially when dealing with numeric identifiers across multiple systems. Traditionally, this involved manual effort and navigating through numerous systems. With a search-enabled ODS, however, users can quickly query the relevant data and retrieve candidate matches. This greatly streamlines the process, saves time, and enhances efficiency.

Practical examples: Leveraging ODS and Atlas Search

Let's explore a few practical examples that demonstrate the power of an ODS and Atlas Search functionality.

Operational data layer for payments processing: A financial institution implemented an ODS-based operational layer for processing payments. By aggregating data from multiple sources and leveraging search capabilities, they achieved faster and more accurate payment processing. This enabled them to investigate issues, ensure consistency, and deliver a superior customer experience.

Customer 360 view: Another organization leveraged an ODS to create a comprehensive view of their customers, empowering relationship managers and bank tellers with a holistic understanding. With search functionality, they could quickly locate customer information across various systems, saving time and improving customer service.

Post-trade trading platform: A global broker operating across 25 different exchanges utilized an ODS to power their post-trade trading platform.
By leveraging search capabilities, they simplified the retrieval of data from various systems, leading to efficient and reliable trading operations.

Conclusion

In the dynamic world of data management, operational data stores (ODS) have emerged as a crucial component for organizations seeking to streamline their data architectures. By adopting an incremental approach and leveraging search functionality such as Atlas Search, organizations can enhance data accessibility, improve operational efficiency, and drive valuable insights. The power of search within an ODS lies in its ability to simplify data retrieval, accelerate development cycles, and enable users to interact with data in a more intuitive and efficient manner. By embracing these practices, organizations can unlock the true potential of their data, paving the way for a more productive and data-driven future.

For more information on Atlas Search, check out the following resources:

- Watch this MongoDB.local talk, which expands on this blog: Every ODS Needs Search: A Practical Primer Based on Client Experiences
- Discover MongoDB's search functionalities
- Learn how Helvetia accelerates cloud-native modernization by 90% with MongoDB Atlas and MongoDB Atlas Search

November 1, 2023

The 5 Steps Banks Need to Modernize the Mainframe

Enriched, convenient, and personalized are the watchwords for any company building a modern digital customer experience. It is no different for traditional retail banks, especially as they try to fend off challenger banks and design their own online and in-branch banking experiences to attract new customers and retain existing ones. To beat the competition and create experiences that surpass those offered by neobanks, however, established retail banks must master their data estate. Specifically, they must free themselves from the rigid data architectures associated with legacy mainframes and monolithic enterprise banking applications. Only then can established banks put their developers to work building high-quality customer-facing applications, rather than managing thousands of SQL tables, struggling to rework schemas, or maintaining faltering legacy systems. The first step in this process is modernizing the mainframe.

Advanced modernization in 5 phases

The best modernization option is a phased model that uses an operational data layer (ODL). An ODL acts as a bridge between a bank's current and new systems. Using an ODL enables an iterative approach, allowing banks to see progress toward modernization at every step of the way while still protecting existing assets and business-critical operations. Banks can see rapid improvements in a relatively short period of time while retaining legacy components for as long as they are needed to keep the business running. MongoDB's five-phase modernization approach allows banks to modernize progressively while balancing performance and risk.
If banks are eager to modernize and their customers demand modern banking experiences, why does it take them so long to move away from the legacy systems that limit their ability to innovate? And why do so many modernization efforts fail? Access the report The 5 Phases of Banking Modernization to start charting your path.

Mainframe modernization techniques

With an ODL, legacy infrastructure can be disconnected piece by piece and retired as more functionality is added. In this scenario, database operations are far more efficient because objects are stored together rather than in disjointed locations. Reads execute in parallel across the nodes of a replica set. Writes are unaffected. To bring similar benefits to writes, banks can choose to implement an ODL with sharding and regional shards, moving writes closer to the actual user. Workloads can then be moved gradually from the legacy systems to the ODL, with the ultimate goal of decommissioning the legacy system. What is interesting about this approach to modernization is that it starts by answering a concrete use case: What data management problems is the bank facing, and what functionality are customers asking for? If the top priority is giving customers access to historical transaction data, banks can tackle that problem immediately by creating a repository (or domain) to offload customer data from the mainframe. If the priority is cost reduction, then an ODL can act as an intermediate layer, allowing applications to access the data they need without running expensive queries against mainframe data.

The advantages of an ODL

MongoDB is ideal for connecting mainframes and traditional databases to more modern architectures, such as a data mesh, via an ODL.
An ODL has a number of advantages. Combined, these advantages make data far easier to access and use, and make applications easier and faster to develop. An ODL allows an organization to process and augment data that resides in separate silos, and then use that data to power a derived product, such as a website or an ATM. With an ODL, data is physically copied to a new location. A bank's legacy systems remain in place, but new applications can access data through the ODL instead of interacting directly with the legacy systems. An ODL can pull data from one or more source systems and feed one or more consuming applications, unifying data from multiple systems into a single real-time platform. An ODL offloads workloads from the mainframe. A useful by-product is avoiding consumer service interruptions caused by maintenance windows on legacy systems, such as Oracle Exadata. An ODL can be used to serve reads only, to accept writes that are then written back to the source systems, or to evolve into a system of record that eventually replaces the legacy systems and simplifies the enterprise architecture. Because of its ability to work with legacy systems, or to replace them gradually, and its support for an evolutionary approach to legacy modernization, many banks consider an ODL a critical step on the path to fully modernizing their enterprise architecture. In terms of architectural configuration, some banks may want an ODL for each of their data domains, while others may find that certain domains can share an ODL. The ODS/ODL model can be applied in a variety of ways without violating a bank's internal standards.
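As a sketch of the read-only serving pattern, here is the kind of query shape a consuming application might send to the ODL to show a customer's latest activity. The collection layout and field names are illustrative assumptions, not part of any specific bank's schema.

```python
def recent_transactions_query(account_id, limit=10):
    """Build the filter/sort/limit a front end might send to the ODL to
    display the customer's most recent transactions.

    Field names (account_id, booking_date) are illustrative assumptions.
    """
    return {
        "filter": {"account_id": account_id},
        "sort": [("booking_date", -1)],  # newest first
        "limit": limit,
    }


q = recent_transactions_query("ACC-1042", limit=5)
# With PyMongo this would run as:
#   collection.find(q["filter"], sort=q["sort"], limit=q["limit"])
```

Because the query never touches the mainframe, it can be served with low latency from the ODL's replicated copy while the source system remains the system of record.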
For example, imagine an ATM connected to a MongoDB-based ODL. With the ODL in place, mainframe data is replicated in real time and made available to the consumer, who can check their most recent transactions and account balance at the ATM. The customer's balance information, however, still resides in the source system. Using the ODL to replicate and display mainframe information spares customers the annoying delays of waiting for information to load from a mainframe. At the same time, regulatory and risk management reports can continue to run against the mainframe as an end-of-day batch process. With an ODL in place, data can flow from the mainframe into a newer architecture, giving the ATM broader capabilities that extend customers' banking experiences, such as the ability to pay bills, change addresses, or even open additional accounts.

Nightly batch, bulk, or real-time updates: MongoDB is flexible enough to connect to any data source, whether classic DB2 for zOS, Oracle, SQL Server, Hadoop-based legacy systems, or even Excel spreadsheets. MongoDB has the right connectivity to ingest any data at any time, from anywhere.

Enrichment, data domains, and data marketplaces: With its document data model, MongoDB can organize data into data domains rather than relying on convoluted table schemas and ETL processes. Domains emerge naturally based on the requirements of the application and the user community.

Security, schemas, and validation: MongoDB has multiple layers of security, including password protection on top of in-flight and at-rest encryption, plus granular field-level encryption.
All with external key management.

Take the next step in mainframe modernization

Because many core banking functions are transactional and can be handled with daily batch processing, mainframes remain the backbone of our financial system. Mainframe modernization can seem daunting, but it doesn't have to be. Banks can choose to follow a simple, predictable path that allows them to modernize iteratively. They can reap the benefits of modernization in one area of the organization even if other groups are further along in their modernization journey. It is possible to do so while still complying with increasingly complex data privacy regulations and, most importantly, while minimizing risk. Banks and other financial institutions that have successfully modernized have seen cost reductions, faster performance, simpler compliance practices, and faster development cycles. New, flexible architectures have accelerated the creation of value-added services for consumers and corporate clients. If you are ready to learn more about how you can accelerate your digital transformation while minimizing risk, access the paper "The 5 Phases of Banking Modernization" now.

March 23, 2023

Forrester Study: How IT Decision Makers Are Using Next-Generation Data Platforms

Data is critical to every financial institution; it is recognized as a core asset to drive customer growth and innovation. As the need to leverage data efficiently increases, however, the legacy technology that still underpins many organizations is not built to support today's requirements. Not only is this infrastructure costly and complex, it doesn't support the diversity of workloads and functions that modern applications require. To overcome these challenges, organizations are increasingly adopting an integrated data platform that offers a seamless developer experience, runs anywhere, and scales to meet growing business needs.

To better understand how such data platforms are being used, MongoDB commissioned Forrester Consulting to survey global IT decision makers at financial services and fintech organizations. In this article, we'll share findings from the survey to help answer questions such as: What impact are legacy technologies having on financial services? What are the requirements for a data platform? And, for those already adopting next-generation data platforms, what benefits are they experiencing?

According to the survey, the majority of decision makers are aware of issues related to legacy technologies:

- 57% of respondents said that their legacy technology was too expensive and doesn't fulfill the requirements of modern applications.
- 50% said legacy technology cannot support the volume, variety, and velocity of transactional data.
- 47% noted that their systems landscape struggled to handle the rate of change required to stay up to date with customer expectations.

Download the full study: What's Driving Next-Generation Data Platform Adoption in Financial Services

What is a next-generation data platform?
Within the context of this study, a next-generation data platform is defined as supporting flexible and versatile data models, offering multiple access patterns (e.g., document, relational, graph), and catering to the speed, scale, performance, integration, and security needs of small or large organizations for new development or modernization efforts. All of these features are included in a single platform that delivers real-time, consistent, and trusted data to support a business.

Adoption of next-generation data platforms in the financial services and fintech space is already high, with nearly 90% of respondents saying they are already adopting. The benefits are already understood as well: 74% of respondents acknowledged not only that there are technology benefits but also that a next-generation data platform frees up teams to focus on innovation, and 76% said it enables faster software builds and iterating at scale.

The key to innovation: What's driving the adoption of next-gen data platforms?

Security and risk management are key use cases

Given the huge amount of confidential client and customer data that the financial services industry deals with on a daily basis, and the strict regulations that govern it, security must be the highest priority. The perceived value of this data also makes financial services organizations a primary target for data breaches. Many organizations are still working to realize the full potential of adopting next-generation data platforms; however, it's understood that such platforms are the only way to manage cost, maximize security, and continue to innovate. Fraud protection (51%), risk management (46%), and anti-money laundering (46%) are high priorities for any new data platform, according to respondents. These findings directly correlate with 40% of respondents saying that their current database is unable to meet security requirements.
Multi-cloud is driving investment Regardless of their size and business mix, most financial institutions have come to understand the benefits of cloud and multi-cloud services. Multi-cloud — the practice of leveraging cloud services from more than one provider — is no longer just a nice-to-have option. Regulators, however, are increasingly focused on cloud concentration risk as so much of the technology underpinning global financial services relies on so few large cloud services providers. Regulators have so far offered financial institutions warnings and guidance rather than enacting new regulations, although they increasingly expect the industry to have contingency plans in place. An outage or cyberattack at a large public cloud provider, they worry, could derail the global financial system. Decision makers are finding that multiple clouds provide them with lower costs, higher performance, and greater flexibility. This is why, according to the survey, the top driver of investment for decision makers adopting next-generation data platforms is multi/hybrid cloud capabilities (49%), followed by scalability (44%). Improving real-time analytics capabilities The ability to perform real-time analytics is key for financial institutions, as they need to provide more personalized customer experiences, react more quickly to market trends, and detect and prevent potential threats. With legacy systems, few of these organizations can respond to changes in data minute by minute or second by second. Among survey respondents, real-time analytics was the top feature of interest (54%) with regard to next-generation data platforms. With improved analytics capabilities, businesses can analyze any data in place and deliver insights in real time. Legacy infrastructure is holding organizations back To remain competitive and build experiences that retain customers, financial institutions need to master their data estate. 
Specifically, they need to free themselves from the rigid data architectures associated with legacy mainframes and monolithic enterprise banking applications. Only then can developers build high-quality customer-facing applications rather than maintain legacy systems. High costs and data complexity are the top challenges driving organizations to modernize legacy workloads and unlock business agility. According to 57% of IT decision makers questioned, legacy technology is too expensive and does not fulfill the requirements of modern applications. This correlates with 79% of respondents seeking a data platform that will address multiple workloads — ranging from transactional to analytical — as data continues to expand. What is the impact? Financial organizations use next-generation data platforms to replace legacy technologies that fragment and duplicate data and cause internal silos. This change also addresses key needs like reducing costs, lowering complexity, improving customer onboarding, and meeting security requirements. Once in place, a next-generation data platform provides several advantages, including minimizing data inconsistencies (43%), expanding geographical coverage (42%), freeing up resources (40%), and reducing time-to-market for new ideas (37%). Other advantages include eliminating the impact of database downtime for upgrades, migrations, and schema changes. And, additional benefits can be seen within the customer and employee experience, as both customers and employees engage with and access information more easily. Based on these benefits, financial services organizations are looking to increase investment in next-generation data platforms by an average of one million dollars or more in the next one to three years. The volume and variety of data that financial services companies must deal with will only increase in the coming years. As such, figuring out how to leverage, protect, and innovate around that data will put organizations in good stead moving forward. 
A next-generation data platform can be the key to making this happen. About the study MongoDB commissioned Forrester Consulting to conduct a study questioning global IT decision makers at financial services and fintech organizations to evaluate the impact they are experiencing when adopting next-generation data platforms. The study evaluates the benefits, challenges, and barriers of adoption that decision makers are experiencing, as well as the outcomes after adoption. To create this study, Forrester Consulting supplemented this research with custom survey questions asked of database/data platform strategy decision makers in finserv (73%) or fintech (27%) from North America (22%), Europe (39%), and APAC (39%). The organizations questioned had 1,000+ employees. The custom survey was fielded and completed in September 2022. Download the full study — What’s Driving Next-Generation Data Platform Adoption in Financial Services — to learn about organizations’ needs and plans for using next-generation data platforms.

December 13, 2022

Hybrid Cloud: Flexible Architecture for the Future of Financial Services

Financial services companies are reimagining how they apply technology to meet the growing service demands of a digital-first world. As they recognize the operational and competitive advantages of the public cloud, many companies are migrating their computing needs to it as quickly as possible. For an industry with tight regulations, a vast amount of private data, and complex legacy infrastructure, however, moving every workload to the cloud isn’t feasible just yet. Instead, some companies are moving to hybrid cloud, an architecture that enables them to use the public cloud wherever possible, while keeping those applications and data with tricky legal or reputational exposure on in-house systems. In this article, we’ll examine advantages of a hybrid cloud approach and outline steps to consider when preparing for such a shift. Overview Hybrid cloud integrates public cloud and on-premises infrastructure into a single functioning unit. Through the public cloud, institutions gain valuable versatility, agility, and scale to run applications more efficiently and to turbo-charge experimentation. They can use existing infrastructure to handle sensitive workloads — including those storing Personally Identifiable Information (PII) — within a familiar, time-tested environment. Deciding where to host applications is usually a function of a workload’s data secrecy and sovereignty requirements and an institution’s assessment of risks and opportunities related to them. Developing the technical flexibility to move between public and private infrastructures makes it easier to match those requirements to the environment best suited to fulfill them. Advantages to hybrid cloud A hybrid cloud approach offers many advantages. For example, institutions can use public cloud infrastructure for tasks with dynamic resource requirements, such as payments processing over holidays or risk calculations for end-of-month reporting. 
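The placement decision described above, where a workload's data secrecy and elasticity needs determine whether it runs in the public cloud or in-house, can be sketched as a simple routing rule. This is a minimal illustration, not a real policy engine; the workload attributes and zone names are assumptions made for the example.

```python
# Sketch: a placement rule for hybrid cloud workloads, deciding between
# public cloud and in-house infrastructure based on a workload's data
# sensitivity and elasticity needs. Attribute names are illustrative.

def placement(workload: dict) -> str:
    """Return where a workload should run in a hybrid setup."""
    if workload.get("contains_pii") or workload.get("sovereignty_bound"):
        return "on_premises"   # keep sensitive or sovereignty-bound data in-house
    if workload.get("bursty"):
        return "public_cloud"  # elastic capacity for peak demand
    return "public_cloud"      # default: the cheaper elastic option

# e.g. holiday payments processing bursts; a customer master holds PII
holiday_payments = {"name": "payments-peak", "bursty": True}
customer_master = {"name": "customer-master", "contains_pii": True}

print(placement(holiday_payments))  # public_cloud
print(placement(customer_master))   # on_premises
```

In practice such rules are far richer (regional residency, latency, cost), but even this toy version shows why an inventory of data and applications is a prerequisite for a hybrid strategy: the rule is only as good as the attributes recorded for each workload.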
This setup can reduce the delays, data center overhead, and sunk costs associated with adding in-house servers, some of which may ultimately be used situationally or not at all. Companies also save on capital expenses and improve responsiveness to internal and external demands. A hybrid cloud setup can also help organizations address compliance, resilience, and performance needs. Those operating in multiple countries can use in-house and public cloud resources across different regions to satisfy disparate requirements around data sovereignty and residency. This geographic and infrastructure diversity can also enhance a company’s failover and disaster recovery profile. By co-locating applications in public cloud regions near customers, institutions can also improve service performance — an important factor as the industry moves toward mobile-first solutions. As institutions pursue more efficient ways to work, the insight gained through planning and executing a hybrid cloud strategy can help inform and transform an organization’s operations. Institutions can begin the shift to the continuous cadence of DevOps, DevSecOps, and MLOps teams by incorporating public cloud tools and methods. This approach includes using process automation and orchestration tools to streamline delivery and maintenance, and management applications to free up in-house IT resources from undifferentiated work. The following section describes other ways a hybrid approach can encourage changes to institutional conventions. Rethinking budgeting Although fixed infrastructure costs and investments can limit an organization’s flexibility, most companies still budget for fixed costs and may find the usage-based billing of public cloud services unnerving. Making the shift from the transparency and stability of capital expenses for on-premises infrastructure to the unpredictability of operating expenses in public-cloud procurement requires an organizational adjustment. 
Vendors do offer cost-management tools to help budget for and accommodate these changes. Hybrid cloud can help ease this transition as the organization moves into an infrastructure-as-a-service model. Expanding a security mindset The financial services sector is a prime target for cyberattacks. Data loss and leakage are also significant concerns. Organizations, therefore, often struggle to transfer any control over security and system integrity to a third party, and disparate regulations increase those hurdles. Sometimes, though, organizations overestimate the effectiveness of their in-house security teams and underestimate the security capabilities of the largest cloud providers who, like banks, are charged with deflecting the most sophisticated attacks all day, every day. Cloud services providers and other third-party vendors invest heavily in security research and resources. They regularly certify to the highest compliance standards. They’re also constantly developing new solutions to help institutions bridge gaps in their homegrown security measures and team capabilities. The result is that the security capabilities of the public cloud providers are often more advanced than those of in-house teams. Simplifying infrastructure complexity The largest financial institutions with the greatest global coverage face the biggest challenges in building a hybrid architecture. What’s more, sunk investments in on-premises infrastructure can make it cost- and ROI-prohibitive to shift workloads to the public cloud. An architecture that can support a hybrid of public cloud, on-premises cloud, and bare-metal deployments offers a flexible solution to address this complexity. Preparing for the shift As with any big shift in technology, the move to hybrid presents a set of challenges that are as much cultural and operational as they are technical. Preparation for a hybrid cloud project, therefore, must include organizational readiness assessments across functions. 
It must take into consideration not just the technical, business, and monetary impact but also the legacy mindset and organizational rituals that can jeopardize the best-laid technology strategies. In pursuing a hybrid cloud strategy, institutions can begin to modernize outdated operating principles as they transform their approach to technology. Given the high uptake within the industry, the steps to adopt a public cloud-only model are well-documented. In a hybrid cloud approach, however, lack of expertise in integrating public and private cloud technologies is a frequent challenge. This, coupled with staff who may be reluctant to adopt unfamiliar technology, can create resistance among technology teams. Early successes that receive high management attention can create excitement; for wider adoption, strong central platform support from the infrastructure team, as well as training and transparency, can help staff get on board. Other effective ways for an organization to prepare for a hybrid cloud future include setting clear business and technical goals, creating inventories of data and applications, and evaluating how customers might react to changes in responsiveness and security brought about by the switch to hybrid. Assessing in-house skills and managing the transformation anxiety of existing technical staff are also crucial to team preparedness. The following steps can help financial institutions prepare for a hybrid approach. Know your company and your customer Set goals: Companies that articulate clear goals for their hybrid strategy are more likely to achieve them. These goals might include gains in operating efficiency, more flexible development, cost savings, speed of innovation, IT resiliency, or regulatory flexibility. Evaluate your customer profiles: Retail and institutional customers require different services and protections from their financial institutions. 
An understanding of these needs and concerns should inform any analysis of the potential for a hybrid cloud implementation. The storage of PII, for example, demands special consideration. Profile your assets: Financial institutions house data and applications that perform business functions. Understanding these in a regulatory context, from a commercial perspective, and through a technical lens will influence decisions about how best to optimize them in a hybrid cloud environment. Blend private and public cloud: To decrease effort, organizations should reduce differences between the two deployment methods. This aim is crucial for a successful adoption. Initiatives that require teams to manually request assets on the private cloud usually fail. Build your team Engage stakeholders: An effective hybrid cloud strategy engages functions across the enterprise. It incorporates business, legal, IT, and security priorities into a comprehensive plan. Engaging compliance officers and security professionals early on is critical, as compliance and system safeguards must be woven into the DNA of any hybrid cloud plan from the outset. Assess skills and educate the team: At the start of a journey to hybrid cloud, organizations often lack the expertise and mindset to confidently shift to a new model. Simply understanding the myriad services offered by public cloud providers can be daunting. Combining public and private clouds in a hybrid setup requires another level of upskilling. Evaluating in-house teams to determine education and training needs is essential for the new paradigm. Foster transparency: In any effective cloud strategy, transparency across the organization is crucial to gaining buy-in, just as education is crucial to building skills. A cloud adoption team can ensure that the training and cultural needs of the organization are met alongside financial, customer, and business imperatives. Engaging your cloud provider(s) in this process can help. 
Map your migration Start small: Small proofs of concept build confidence and allow teams to expand incrementally on the back of those successes. Manage risk: Start with a low-risk approach. For example, organizations may choose to move workloads that are highly dynamic and less sensitive first. These might include some customer-facing apps that contain little PII. Institutions often start with retail applications, while still running their institutional-focused applications within their on-premises data centers. This approach may change over time as they become more comfortable with public cloud security and managing a hybrid environment. Moving to hybrid Financial services institutions are already adapting to greater demands for innovation and efficiency by designing responsive IT environments and taking advantage of the public cloud — and often, hybrid cloud is a crucial part of that pathway. A hybrid cloud strategy is a great solution to help organizations meet their technical and business objectives more cost-efficiently and effectively than with either a public or private cloud alone. A hybrid cloud approach offers the flexibility that institutions need to meet rapidly changing customer demands as well as competition from a new wave of challengers. As best practices become clear and more implementation lessons emerge, the industry will further embrace hybrid cloud as an important step in an evolution to a fully managed multi-cloud solution. Finding the right partners, of course, is crucial. Experienced teams and the best technical solutions greatly increase the odds of executing a successful hybrid strategy. The team at MongoDB offers solutions and advice to help financial institutions progress toward a more functional, flexible, and future-forward enterprise technology platform. 
To learn more about how MongoDB can help you on your cloud adoption journey, check out the following resources: Finance, Multi-Cloud, and The Elimination of Cloud Concentration Risk How Financial Services Achieve A Strategic Advantage With Data-Driven Disruption The Road to Smart Banking The 5-Step Guide to Mainframe Modernization for Banks

November 3, 2022

The 5-Step Guide to Mainframe Modernization for Banks

Enriched, convenient, and personalized are the watchwords for any business building a modern, digital customer experience. It’s no different for traditional retail banks, especially as they try to fend off challenger banks and design their own online banking and in-branch experiences to win new business and retain existing customers. But in order to beat the competition and build experiences that best those offered by neobanks, established retail banks need to master their data estate. Specifically, they need to free themselves from the rigid data architectures associated with legacy mainframes and monolithic enterprise banking applications. Only then can established banks have their developers get to work building high-quality customer-facing applications rather than managing thousands of SQL tables, scrambling to rework schema, or maintaining creaky legacy systems. The first step on this journey is modernizing the mainframe. Enriched modernization in 5 phases The best way to modernize is through a phased model that uses an operational data layer (ODL). An ODL acts as a bridge between a bank’s existing systems and its new ones. Using an ODL allows for an iterative approach, allowing banks to see progress toward modernization at each step along the way while still protecting existing assets and business-critical operations. Banks can see rapid improvements in a relatively short amount of time while preserving the legacy components for as long as they’re needed to keep the business running. MongoDB’s five-phase approach to modernization enables banks to modernize iteratively while balancing performance and risk. If banks are eager to modernize and their customers are demanding modern banking experiences, what’s taking banks so long to move away from the legacy systems that are restricting their ability to innovate? And why do so many legacy modernization efforts fall short? Download The 5 Phases of Banking Modernization to start plotting your path forward. 
Mainframe modernization techniques With an ODL, the legacy infrastructure can be switched off piece by piece and retired as more functionality is added. In this scenario, database operations become much more efficient because objects get stored together rather than in disjointed locations. Reads are executed in parallel via the nodes in a replica set. Writes are largely unaffected. To bring similar benefits to writes, banks may choose to implement an ODL with sharding and regional shards, bringing writes closer to the actual user. Workloads can then be gradually moved from legacy systems to the ODL, with the ultimate goal of decommissioning the legacy system. The beauty of this approach to modernization is that it starts with the use case: What problems does the bank face in its data management, and what functionality are customers requesting? If the first priority is giving customers access to historical transaction data, then banks can tackle that problem immediately by building a repository (or domain) to offload customer data from the mainframe. If the priority is cost reduction, then an ODL can act as an interim layer, allowing applications to access the data they need without running expensive queries against mainframe data. The advantages of an ODL MongoDB is ideal for connecting legacy mainframes and databases to newer architectures, such as a data mesh, by way of an ODL. An ODL has a number of advantages. Combined, these advantages make data massively easier to access and use — and applications easier and faster to build. An ODL allows an organization to process and augment data that resides in separate silos, and then use that data to power a downstream product, such as a website or an ATM. With an ODL, data is physically copied to a new location. A bank’s legacy systems remain in place, but new applications can access data through the ODL rather than interacting directly with legacy systems. 
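The "objects stored together" point above can be illustrated by folding relational-style rows into a single document, the shape an ODL built on a document model would store. This is a pure-Python sketch; the field names and sample records are invented for the example.

```python
# Sketch: folding relational-style rows (accounts, transactions) into one
# customer document, the way a document-model ODL stores related data
# together instead of across disjointed tables. Data is illustrative.

accounts = [
    {"account_id": "A1", "customer_id": "C9", "type": "checking"},
    {"account_id": "A2", "customer_id": "C9", "type": "savings"},
]
transactions = [
    {"account_id": "A1", "amount": -42.5},
    {"account_id": "A2", "amount": 1000.0},
]

def to_document(customer_id: str) -> dict:
    """Embed a customer's accounts, each with its transactions, in one document."""
    doc = {"customer_id": customer_id, "accounts": []}
    for acc in accounts:
        if acc["customer_id"] != customer_id:
            continue
        txns = [t for t in transactions if t["account_id"] == acc["account_id"]]
        doc["accounts"].append({**acc, "transactions": txns})
    return doc

print(to_document("C9"))
```

A single read of such a document returns everything an application needs about the customer, which is why reads against an ODL avoid the joins and repeated lookups that make equivalent mainframe queries expensive.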
An ODL can draw data from one or many source systems and power one or many consuming applications, unifying data from multiple systems into a single real-time platform. An ODL relieves the mainframe of workloads. One useful by-product is avoiding consumer service interruptions brought about by maintenance windows on legacy systems, like Oracle Exadata. An ODL can be used to serve only reads, accept writes that are then written back to source systems, or evolve into a system of record that eventually replaces legacy systems and simplifies the enterprise architecture. Because of its ability to work with legacy systems, or to gradually replace them, and its ability to support an evolutionary approach to legacy modernization, many banks find that an ODL is a critical step on the path to full modernization of their enterprise architecture. In terms of architectural setup, some banks may want one ODL for each of their data domains, but others may find certain domains can share an ODL. The ODS/ODL template can be applied in a variety of ways — without breaking the bank’s internal standards. For example, imagine an ATM terminal connected to a MongoDB-based ODL. With the ODL in place, data from the mainframe is replicated in real time and made available for the consumer to check their most recent transactions and account balance on the ATM. Customer balance information, however, also still resides on the source system. Using the ODL to replicate and display information from the mainframe spares customers annoying delays while they wait for information to load from the mainframe. At the same time, risk management and regulatory reports can still be run against a mainframe as a batch “end of day” process. With an ODL in place, data can flow from the mainframe to a newer architecture, giving the ATM broader capabilities that expand customers’ banking experiences, such as the ability to pay invoices, change addresses, or even open additional accounts. 
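The read-serving and write-back patterns described above can be sketched as a tiny in-memory model. The class and method names are hypothetical and stand in for real replication machinery; the point is only the flow: reads never touch the legacy system, and writes land in the ODL and are propagated back to the source of record.

```python
# Sketch: a minimal ODL that serves reads from its own copy and writes
# through to the legacy system of record. Names are illustrative only;
# real deployments use replication tooling, not dicts.

class LegacySystem:
    """Stand-in for a mainframe system of record."""
    def __init__(self):
        self.records = {}

    def write(self, key, value):
        self.records[key] = value

class OperationalDataLayer:
    def __init__(self, source: LegacySystem):
        self.source = source
        self.copy = dict(source.records)  # initial bulk replication

    def read(self, key):
        # Reads are served from the ODL copy; the mainframe is never hit.
        return self.copy.get(key)

    def write(self, key, value):
        # Writes update the ODL and are written back to the source system.
        self.copy[key] = value
        self.source.write(key, value)

legacy = LegacySystem()
legacy.write("balance:C9", 1200.0)

odl = OperationalDataLayer(legacy)
print(odl.read("balance:C9"))        # served from the ODL copy
odl.write("balance:C9", 950.0)
print(legacy.records["balance:C9"])  # write-back kept the source in sync
```

The third stage mentioned in the text, where the ODL evolves into the system of record, amounts to retiring `LegacySystem` and letting the ODL own the data outright.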
Nightly batch, bulk load, or real-time updates: MongoDB is flexible enough to connect to any data source, be it classic DB2 for z/OS, Oracle, SQL Server, Hadoop-based legacy, or even Excel spreadsheets. MongoDB has the appropriate connectivity to ingest any data at any time from anywhere. Enrichment, data domains, and data marketplaces: With its document data model, MongoDB has the capability to bring data into data domains versus using convoluted table schemas and ETL processes. The domains emerge naturally based on the application and user community requirements. Security, schemas, and validation: MongoDB offers multiple layers of security, including password protection, encryption in flight and at rest, and granular field-level encryption, all with external key management. MongoDB can be used as an operational data layer. Take the next step in mainframe modernization Because many core banking capabilities are transactional and can be handled with daily batch processing, mainframes remain the backbone of our financial system. Mainframe modernization might sound daunting, but it doesn’t have to be. Banks can choose to proceed along a straightforward and predictable path that allows them to modernize iteratively. They can receive the benefits of modernization in one area of the organization even if other groups are earlier in their modernization path. It’s possible to do this while supporting increasingly complex data privacy regulations and, importantly, minimizing risk. Banks and other financial institutions that have successfully modernized have seen cost reductions, faster performance, simpler compliance practices, and rapid development cycles. New, flexible architectures have accelerated the creation of value-added services for consumers and corporate clients. If you’re ready to learn more about how you can accelerate your digital transformation and minimize risk, download “The 5 Phases of Banking Modernization” now.
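The schema-and-validation point above can be made concrete with MongoDB's `$jsonSchema` validator, which enforces document shape at the collection level. The sketch below only builds the validator document as a plain Python dict; the collection name, fields, and currency list are invented for illustration, and no database connection is made here.

```python
# Sketch: a MongoDB $jsonSchema validator for an ODL payments collection,
# expressed as a plain Python dict. Collection and field names are
# illustrative; in a real deployment this dict would be passed to
# create_collection() or a collMod command.

payment_validator = {
    "$jsonSchema": {
        "bsonType": "object",
        "required": ["customer_id", "amount", "currency"],
        "properties": {
            "customer_id": {"bsonType": "string"},
            "amount": {"bsonType": "double", "minimum": 0},
            "currency": {"bsonType": "string", "enum": ["EUR", "USD", "GBP"]},
        },
    }
}

# e.g. with pymongo (not executed in this sketch):
# db.create_collection("payments", validator=payment_validator)

print(sorted(payment_validator["$jsonSchema"]["required"]))
```

With such a validator in place, the database rejects malformed writes before they reach the ODL, which is one way the schema layer supports the compliance requirements discussed throughout the article.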

April 21, 2022