Boris Bialek


Maximizing Growth: The Power of AI Unleashed in Payments

Artificial Intelligence (AI) technologies are an integral part of the banking industry. In areas such as risk, fraud, and compliance, for example, the use of AI has been commonplace for years and continues to deepen. The success of these initiatives (and others), and the potential to unlock further benefits, is driving further investment in this area in 2024, with Generative AI attracting particular interest. Fintech research firm Celent created a report, commissioned by MongoDB and Icon Solutions, that dives into how AI is being used in the banking industry today, as well as some of the key use cases for AI adoption in payments to improve operational agility, automate workflows, and increase developer productivity.

Download Celent's report, Harnessing the Benefits of AI in Payments, to discover how you can make the most of your AI investments and unlock the possibilities that AI holds for the future of payments.

Unlocking a range of workflow and product enhancements

AI technologies are used today to address a wide range of workflows and customer-facing services: from process automation and optimization in the middle and back office, to areas such as real-time risk and liquidity management, cash flow forecasting, and service personalization in the front office. Virtual assistants and bots have also become an important part of the customer support process. In this blog, we'll cover some of the key findings from Celent's Harnessing the Benefits of AI in Payments report and what they mean for the banking and payments industry.

Advanced analytics, intelligent automation, and AI technologies lead the investment agenda in 2024

Over time, banks have steadily increased their investments in projects to make better and more efficient use of data.
In part, this has been driven by the need to respond to rising customer expectations over the speed and quality of digital services, but it also reflects a growing understanding of the true value of account and transaction data. Most important of all, though, has been enabling the technologies required to deliver use cases supported by AI and advanced analytics. It is no surprise that projects supported by data analytics and AI technologies are high on the agenda globally. Advanced analytics and machine learning investments are a leading technology priority for 33% of corporate banks, ranking higher than projects relating to robotics and automation (a focus for 31% of the market). Artificial intelligence and natural language processing (NLP) are not far behind, highlighted as a priority by 28% of banks.

Many are also exploring Generative AI

While the excitement around generative AI is understandable given its obvious potential, the conversation became more nuanced through the latter part of 2023. This is understandable given the complexities of applying large language models (LLMs) to potentially sensitive customer data, as well as broader regulatory concerns over the explainability (and potential auditability) of LLM outputs. That said, there are many areas in which generative AI is already being used to support advisors and relationship managers, and further innovation in such areas is expected. According to the report, 58% of banks are evaluating or testing generative AI in some capacity, while a further 23% have projects using this technology on their roadmap.

Emerging use cases for AI in payments and the potential revenue growth

A lack of developer capacity is one of the biggest challenges for banks when it comes to delivering payment product innovation. Banks believe the product enhancements they could not deliver in the past two years due to resource constraints would have supported a 5.3% growth in payments revenues.
With this in mind, and given the transformation that AI integration brings, financial institutions must consider how to free up developer resources to make the most of these opportunities. As the payments industry continues to evolve, the integration of AI is poised to reshape the landscape, offering innovative solutions that prioritize security, efficiency, and personalized user experiences. The emerging use cases for AI in payments are a testament to its transformative potential in shaping the future of financial transactions.

Leveraging modern technologies to make the most of AI adoption

In the rapidly evolving landscape of AI, constant technological advancements and evolving customer needs necessitate strategic investments. To stay competitive, banks and payment providers should not only focus on current product enhancements but also future-proof their capabilities through payment infrastructure modernization. When adopting advanced technologies like AI and ML, which require data as their foundation, organizations often grapple with the challenge of integrating these innovations into legacy systems, given their inflexibility and resistance to modification. For example, adding a new payment rail or a new customer access point can be very difficult. Establishing a robust data architecture on a modern data platform enables banks to enrich the payments experience by consolidating and analyzing data in any format in real time, driving value-added services and features for consumers.

The following recommendations will help ensure financial services organizations can unlock the transformative potential of generative AI at scale while ensuring privacy and security concerns are adequately addressed:

- Train AI/ML models on the most accurate and up-to-date data, thereby addressing the critical need for adaptability and agility in the face of evolving technologies.
By unifying data from backend payment processing to customer interactions, banks can surface insights in real time to create a seamless, connected, and personalized customer journey.

- Future-proof with a flexible data schema capable of accommodating any data structure, format, or source. This flexibility facilitates seamless integration with different AI/ML platforms, allowing financial institutions to adapt to changes in the AI landscape without extensive modifications to the infrastructure.
- Address security concerns with built-in security controls across all data. Whether managed in a customer environment or through MongoDB Atlas, a fully managed cloud service, MongoDB ensures robust security with features such as authentication (single sign-on and multi-factor authentication), role-based access controls, and comprehensive data encryption. These security measures act as a safeguard for sensitive financial data, mitigating the risk of unauthorized access from external parties and giving organizations the confidence to embrace AI and ML technologies.
- Launch and scale always-on, secure applications by integrating third-party services with APIs. MongoDB's flexible data model and ability to handle various types of data, both structured and unstructured, make it a great fit for orchestrating your open API ecosystem so that data can flow between banks, third parties, and consumers.

The MongoDB Atlas developer data platform puts powerful AI and analytics capabilities directly in the hands of developers and offers the capabilities to enrich payment experiences by consolidating, ingesting, and acting on any payment data type instantly. MongoDB Atlas is designed to help financial services organizations overcome data challenges. It features a flexible document data model and seamless third-party integration capabilities that are necessary to create composable payment systems that scale effortlessly and are always-on, secure, and ACID compliant.
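The flexible-schema point above can be sketched in a few lines. This is a hypothetical, driver-free illustration (plain Python dictionaries standing in for MongoDB documents, not the pymongo API): payment events from different rails carry different fields, yet can be stored and queried side by side without schema migrations.

```python
# Hypothetical payment events from different rails: each document has its
# own shape, yet all live in the same logical collection.
payments = [
    {"_id": 1, "rail": "card", "amount": 42.50, "currency": "USD",
     "card": {"network": "VISA", "last4": "4242"}},
    {"_id": 2, "rail": "sepa_instant", "amount": 100.00, "currency": "EUR",
     "iban": "DE89370400440532013000", "settled_ms": 812},
    {"_id": 3, "rail": "card", "amount": 9.99, "currency": "EUR",
     "card": {"network": "MC", "last4": "1337"}, "loyalty_points": 12},
]

def find(collection, **criteria):
    """Minimal stand-in for a document query: match on top-level fields."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in criteria.items())]

card_payments = find(payments, rail="card")
print([doc["_id"] for doc in card_payments])  # prints [1, 3]
```

The point of the sketch is that the query logic never has to know about every field in advance; a new rail with new attributes is just another document.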
Stay ahead of the curve: download Celent's report now and unlock the possibilities that AI holds for the future of payments. If you prefer a visual exploration, register for our upcoming webinar, a discussion featuring Celent, Icon Solutions, and MongoDB: Using AI to Unlock New Opportunities in Payments. If you would like to discover more about building AI-enriched payment applications with MongoDB, take a look at the following resources:

- Discover how the financial sector can make use of generative AI
- Deliver AI-enriched payment apps with the right security controls in place, and at the scale and performance users expect
- Sign up for our Atlas for Industries programme to get access to our solution accelerators and drive innovation

February 12, 2024

Temenos Banking Cloud Scales to Record High Transactions with MongoDB Atlas and Microsoft Azure

Banking used to be a somewhat staid, hyper-conservative industry, seemingly evolving over eons. But the emergence of fintech and pure digital players in the market, paired with alternatives in technology, is transforming the industry. The combination of MACH, BIAN, and composable designs enables true innovation and collaboration within the banking sector, and the introduction of cloud services makes these approaches even easier to implement. Just ask Temenos, the world's largest financial services application provider, which provides banking for more than 1.2 billion people. Temenos is leading the way in banking software innovation and offers a seamless experience for its client community in over 150 countries. Temenos embraces a cloud-first, microservices-based infrastructure built with MongoDB, giving customers flexibility while also delivering significant performance improvements. Financial institutions can embed Temenos components, like Pay-as-you-go, which delivers new functionality to their existing on-premises environments, on their own cloud deployments, or through a full banking-as-a-service experience with Temenos Transact powered by MongoDB on various cloud platforms. This new MongoDB-based infrastructure enables Temenos to rapidly innovate on its customers' behalf while improving security, performance, and scalability.

Fintech, payments, and core banking

Temenos and MongoDB joined forces in 2019 to investigate the path toward data in a componentized world. Over the past few years, our teams have collaborated on a number of new, innovative component services to enhance the Temenos product family, and several banking clients are now using those components in production. However, the approach we've taken allows banks to upgrade on their own terms.
By putting components "in front" of the Temenos Transact platform, banks can start using a componentization solution without disrupting their ability to serve existing customer requirements. From May 2023 onwards, banks will have the capability to deploy Temenos Infinity microservices, as well as the core banking Temenos Transact, exclusively on the developer data platform from MongoDB and derive even more value. Making the composable approach even more valuable, Temenos implemented its new data backend firmly based on JSON and the document model. MongoDB allows fully transparent access to data and the exploitation of additional features of the developer data platform. These features include Atlas Search, application-driven analytics, and AI through workload isolation. Customers also benefit from geographic distribution of data based solely on customer requirements, be it in a single country driven by sovereignty requirements or distributed across continents to ensure always-on operation and the best possible data access and speed for trading.

Improved performance and scale

In contrast to the retail-centric benchmark last year, the approach this time was to test broader functionality and include more diverse business areas, all while increasing the transaction volume by 50%. The benchmark scenario simulated a client with 50 million retail customers, 100 million accounts, and a banking-as-a-service (BaaS) offering for 10 brands and 50 million embedded finance customers on a single cloud instance. In the test, Temenos Banking Cloud processed 200 million embedded finance loans and 100 million retail accounts at a record-breaking 150,080 transactions per second. In doing so, Temenos proved its robust and scalable platform can support banks' business models for growth, whether through BaaS or distributing their products themselves.
The benchmark included not just core transaction processing, but a composed solution combining payments, financial crime mitigation (FCM), a data hub, and digital channels.

"No other banking technology vendor comes close to the performance and scalability of Temenos Banking Cloud. We consistently invest more in cloud technologies and have more banks live with core banking in the cloud than any of our peers. With global non-cash transaction volumes skyrocketing in response to fast-emerging trends like BaaS, banks need a platform that allows them to elastically scale based on business demand, provide composable capabilities on-demand at a low cost, while reducing their environmental impact. This benchmark with Microsoft and MongoDB proves the capability of Temenos' platform to power the world's biggest banks and their BaaS offerings with hundreds of millions of customers, efficiently and sustainably in the cloud." Tony Coleman, Chief Technology Officer, Temenos

This solution landscape reflects an environment where everyone on the planet runs two banking transactions a day on a single bank. This throughput should cater to any Tier 1 banking deployment, in size and performance, and cover any future growth plans they have. Below are the transaction details that comprise the actual benchmark mix. As mentioned above, it is a broad mix of different functionalities, behaving like a retail bank and a fintech institution that provides multiple product brands, e.g., cards for different retail brands. Beyond the sheer performance of the benchmark, the ESG footprint of the overall landscape shrank again versus last year's configuration, as the MongoDB Atlas environment was the sole database and no secondary systems were required.

Temenos Transact optimized with MongoDB

The JSON advantage

Temenos made significant engineering efforts to decapsulate the data layer, which was previously stored as PIC, and make JSON-formatted data available to its user community.
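To make the JSON alignment concrete, here is a small sketch. The account shape below is invented for illustration and is not Temenos' actual schema: the idea is simply that the application writes and reads one self-describing JSON document instead of rows spread across several tables.

```python
import json

# Hypothetical account record (not Temenos' actual schema): in a document
# model, the account and its embedded balance entries travel together as
# one JSON document, where a relational design would normalize them into
# separate tables joined at read time.
account = {
    "accountId": "CH93-0000-0000-0000-0000-1",
    "customer": {"name": "A. Example", "segment": "retail"},
    "balances": [
        {"type": "booked", "amount": 1250.00, "currency": "CHF"},
        {"type": "available", "amount": 1180.00, "currency": "CHF"},
    ],
}

doc = json.dumps(account)    # what the application writes
restored = json.loads(doc)   # what any downstream consumer reads back
print(restored["balances"][0]["amount"])  # prints 1250.0
```

Because the stored form and the application's in-memory form are the same shape, no object-relational mapping layer sits between Transact and the database.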
MongoDB was designed from its inception to be a database focused on delivering a great development experience. JSON's ubiquity made it the obvious choice for representing data structures in MongoDB's document data model. Below you can see how Temenos Transact stores data in Oracle or MSSQL versus MongoDB. Temenos and MongoDB have an aligned data store: Temenos Transact application code operates on documents (JSON) and MongoDB stores documents as JSON in one place, making it the perfect partnership. Through its concept of additional nodes in the replica set, MongoDB enables the user community to serve further secondary applications from the same database without interrupting or disturbing the transactional workload of Temenos Transact. The regularly occurring challenge with legacy relational database management systems (RDBMS), where secondary applications suddenly have unexpected consequences for the primary application, is a problem of the past with MongoDB.

Workload isolation with MongoDB

MongoDB Atlas will, in most cases, operate in three availability zones, where two zones are located in the same region for pure availability and a single node is located in a remote region for disaster recovery. This environment provides the often-required RPO/RTO of "0" while delivering unprecedented performance. Two nodes in each of the first two availability zones provision the transactional replica set and ensure the consistency and operation of the Temenos Transact application. In each availability zone, a third, isolated workload node is co-located with the same data set as the other two nodes but is excluded from transactional processing. These isolated workload nodes provide capacity for additional functionality. In the example above, one node provides access to MongoDB Atlas Data Federation and a second node provides the interface for MongoDB Atlas Search.
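The isolation idea can be sketched in a few lines of plain Python. This is a conceptual illustration only, not the MongoDB driver API; in a real deployment the same routing is expressed with replica set tags and read preferences, and the node names below are invented.

```python
# Conceptual sketch of workload-aware read routing (not the actual MongoDB
# driver API; real deployments use replica set tags and read preferences).
nodes = [
    {"name": "zone-a-1", "role": "transactional"},
    {"name": "zone-a-2", "role": "transactional"},
    {"name": "zone-a-search", "role": "isolated", "workload": "search"},
    {"name": "zone-b-federation", "role": "isolated", "workload": "federation"},
]

def route(workload):
    """Send named analytical workloads to their isolated node and everything
    else to a transactional node, so analytics never competes with Transact."""
    isolated = [n for n in nodes if n.get("workload") == workload]
    if isolated:
        return isolated[0]
    return next(n for n in nodes if n["role"] == "transactional")

print(route("search")["name"])    # prints zone-a-search
print(route("payments")["name"])  # prints zone-a-1
```

The design choice this models is the one described above: the isolated nodes hold the same data set but are simply never candidates for the transactional workload.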
As the nodes store data in near real time (replication is measured in sub-milliseconds, as they are in the same availability zone), this allows exciting new capabilities, like connecting a real-time large language model (LLM), e.g. ChatGPT, or machine learning in a Databricks lakehouse. The design is discussed in more detail in this article. The diagram below shows a typical configuration for such a cluster setup in the European market on Microsoft Azure: one availability zone in Zurich, one availability zone in Geneva, and an additional node outside both in Ireland. Additionally, we configured isolated workloads in Zurich and Geneva. MongoDB Atlas allows the creation of such a cluster within seconds, configured to the specific requirements of the solution deployed.

Typical configuration for a cluster setup for the European market on Microsoft Azure

Should the need arise, MongoDB can have up to 50 nodes in a single replica set, so for each additional isolated workload, one or more nodes can be made available when and where needed, even at locations beyond the initial three chosen. This benchmark used a MongoDB Atlas M600 cluster, which turned out to be oversized given CPU utilization of 20-60%, depending on the node type. In hindsight, a smaller MongoDB Atlas M200 would have been easily sufficient. Nonetheless, MongoDB Atlas delivered the needed database performance with one third of the resources of last year's result while delivering 50% more throughput. Additionally, MongoDB Atlas performed twice as fast in throughput per transaction (measured in milliseconds).

Signed, sealed, and delivered

This benchmark gives clients peace of mind that the combination of core banking with Temenos Transact and MongoDB is ready to support the needs of even the largest global banks.
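As a quick back-of-the-envelope check of the "two banking transactions a day for everyone on the planet" framing above, assuming the benchmark's 150,080 transactions per second were sustained around the clock:

```python
# If the benchmark's peak rate were held for a full 24 hours, how many
# transactions would that be, and how many people does it cover at two
# transactions per person per day?
tps = 150_080
per_day = tps * 60 * 60 * 24
print(f"{per_day:,}")               # prints 12,966,912,000 (about 13 billion)
print(round(per_day / 2 / 1e9, 1))  # prints 6.5 (billions of people covered)
```

Roughly 13 billion transactions a day, or two transactions each for about 6.5 billion people, which is the scale the post is describing.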
While thousands of banks rely on MongoDB for many parts of their operations, ranging from login management and online banking to risk and treasury management systems, Temenos' adoption of MongoDB is a milestone. It shows that there is significant value in moving from a legacy database technology to MongoDB, allowing faster innovation, eliminating technical debt along the way, and simplifying the landscape for financial institutions, their software vendors, and service providers.

PS: We know benchmarks can be deceiving, and every scenario in each organization is different. Having been in the benchmark business for a long time, I can tell you to never trust just any benchmark. In fact, my colleague, MongoDB distinguished engineer John Page, wrote a great blog about how to benchmark a database.

Thank you to Ainhoa Múgica and Karolina Ruiz Rogelj for their contributions to this post. If you would like to learn more about how you can use MongoDB to move towards a composable system, architecting for real-time adaptability, scalability, and resilience, take a look at the resources below:

- Componentized core banking built upon MongoDB
- Tony Coleman, CTO at Temenos, and Boris Bialek, Global Head, Industry Solutions at MongoDB, discuss the partnership at MongoDB World 2022
- Remodel your core banking systems with MongoDB

May 9, 2023

2023 Trends: Modernization Efforts in the Financial Services Sector

With a global recession looming, banks face tough economic conditions in 2023. Lowering costs will be vital for many organizations to remain competitive in a data-intensive, highly regulated environment. Any IT investment should therefore accelerate digital transformation with innovative technologies that break down data silos, increase operational efficiency, and create personalized customer experiences. Read on to learn about the areas where banks are looking to modernize in 2023 to build better customer experiences at lower cost and at scale.

Shaping a better banking future through flexible designs

With banks eager to modernize and innovate, institutions must move away from the legacy systems that limit their ability to progress. Placing consumers at the center of a banking experience composed of interconnected yet independent services gives technology-forward banks the opportunity to reshape their business models and, in turn, grow market share and profitability. These opportunities have given rise to a composable architecture design that enables faster innovation, greater operational efficiency, and new revenue streams by extending the portfolio of services and products. In this way, banks can adopt best-of-breed software that fits their needs by orchestrating strategic partnerships with the relevant fintechs and software providers. This new generation of providers can offer everything from know your customer (KYC) services to integrated booking, load services, and basic marketing and portfolio management functionality.

This approach is more cost-effective for institutions than building and maintaining the infrastructure themselves, and it is significantly faster in terms of time to market and time to revenue. Banks adopting this approach see fintechs less as competitors and more as part of an ecosystem to collaborate with to accelerate innovation and reach customers.

Operational efficiency through intelligent automation

Financial institutions will continue to focus on operational efficiency and cost control by automating manual, paper-based processes. Banks have made some progress in digitizing and automating what used to be almost exclusively paper-based manual processes. However, the main driver of this transformation has been compliance with local regulation rather than a holistic strategy to truly know the customer and achieve customer satisfaction. The market demands better automated, data-driven decisions, and legacy systems cannot keep pace. Building the hyper-personalized experiences customers demand, such as chatbots, self-service portals, and digital forensics, is difficult for institutions running on outdated technology. Moreover, a siloed data infrastructure prevents any truly integrated modern experience. Through a combination of robotic process automation (RPA), machine learning (ML), and artificial intelligence (AI), financial institutions can streamline processes, freeing employees to focus on tasks with greater impact for the customer and the business. Institutions should not digitize without considering the human interaction being replaced, as customers prefer a hybrid approach.

The ability to act on data in real time is the way forward to drive value and transform customer experiences, and it must go hand in hand with modernizing the underlying data architecture. The prerequisite for achieving this goal is decoupling data and sources into a holistic data landscape. Some call it a data mesh; others call it composable data sources or virtualized data.

Solving the challenges posed by ESG (environmental, social, and governance) data

Alongside high inflation, the cost-of-living crisis, energy turmoil, and rising interest rates, environmental, social, and governance (ESG) concerns are also in the spotlight. Regulators are applying growing pressure for ESG data to be made available, and investors for portfolios to be sustainable. The role of ESG data in performing market analysis, supporting asset allocation and risk management, and providing insight into the long-term sustainability of investments continues to grow. The nature and variability of many ESG metrics is a significant challenge companies face today. Unlike financial datasets, which are mostly numeric, ESG metrics can include quantitative and qualitative data to help investors and other stakeholders understand a company's actions and intentions. This complexity, combined with the lack of a universally applicable ESG reporting standard, means institutions must weigh different standards with different data requirements. Mastering ESG reporting, including the integration of the relevant KPIs, requires suitable, high-quality data at the right level of granularity, covering the required sectors and regions.

Given the volume and complexity of the data, financial institutions are building ESG platforms on modern data platforms capable of consolidating different types of data from multiple providers, creating custom views, modeling data, and operating without barriers.

Digital payments: Delivering an enriched experience

Driven by new technologies and global trends, the digital payments market is flourishing worldwide. Valued at more than $68 billion in 2021 and expected to see double-digit growth over the next decade, emerging markets are leading the relative expansion. This growth has been fueled by pandemic-induced cashless payments, e-commerce, government initiatives, and fintechs. Digital payments are transforming the payment experience. Where it was once enough for payment service providers to supply account information and orchestrate simple transactions, consumers now demand a far richer experience in which every transaction offers new insights and value-added services. Meeting these expectations is difficult, especially for companies that depend on outdated technology built long before transactions were completed with a few clicks on a mobile device. To meet customer needs, financial institutions are modernizing their payments data infrastructure to create personalized, secure, real-time payment experiences, all while protecting consumers from fraud. This modernization allows financial institutions to ingest any type of data, launch services faster and at lower cost, and have the freedom to run them in any environment, from on-premises to multi-cloud.

Security and risk management

Data is fundamental to every financial institution; it is recognized as an essential asset for driving customer growth and innovation. However, as the need to leverage data efficiently grows, according to 57% of decision-makers, the legacy technology that still underpins many organizations is too expensive and does not meet the requirements of modern applications. This legacy infrastructure is not only complex but also unable to meet today's security requirements. Given the sheer volume of sensitive customer and consumer data the financial services industry handles daily, and the strict regulations governing it, security must be the top priority. The perceived value of this data also makes financial services organizations a prime target for data breaches. Fraud protection, risk management, and anti-money laundering are top priorities for any new data platform, according to the Forrester study What's Driving Next-Generation Data Platform Adoption in Financial Services. To address these challenges, adoption of next-generation data platforms will continue to grow as financial institutions realize their full potential to manage costs, maximize security, and foster innovation. Download the full Forrester study, What's Driving Next-Generation Data Platform Adoption in Financial Services, to learn more.

March 23, 2023

Modernizing Core Banking: A Shift Toward Composable Systems

Modernizing core banking systems with MongoDB can bring many benefits, such as faster innovation, flexible deployment, and instant scalability. According to McKinsey & Company, it is critical for banks to modernize their core banking platforms with a "flexible back end" in order to stay competitive and adapt to new business models. With the emergence of better data infrastructure based on JSON and the ongoing evolution of software design, the next generation of composable core banking processes can be built on MongoDB's developer data platform, offering greater flexibility and adaptability than traditional systems.

The current market: Potential core banking solutions

Financial disruptors such as fintechs and challenger banks are growing their businesses and attracting customers by building on process-centric core banking systems, while traditional banks struggle with inflexible legacy systems. As seen in Figure 1 below, two potential solutions are the core banking "platform" and "suite". The platform solution involves a single vendor and several closely integrated modules, along with a single, large database and a single roadmap. The suite solution, on the other hand, involves multiple vendors, multiple loosely integrated modules, and multiple databases and roadmaps. However, both of these systems are inflexible and result in vendor lock-in, preventing the adoption of best-of-breed functionality from other vendors.

Figure 1: Core banking solutions: platform, suite, and composable ecosystem.

A new approach, known as a composable ecosystem, as seen on the far right of Figure 1, is being adopted by some financial institutions. This approach consists of distinct, independent services and functions, with the ability to incorporate best-of-breed functionality without major integration challenges, multiple loosely coupled roadmaps, and individual component deployment without vendor lock-in.
This allows for specialization and the development of advanced individual components that can be combined to deliver the best products and services, and it is better at adopting new technologies and approaches.

Composable ecosystems with MongoDB's developer data platform

MongoDB's developer data platform is the best choice for financial institutions to build a composable core banking ecosystem. Such an ecosystem is made up of four key building blocks, as seen below in Figure 2: JSON, BIAN, MACH, and data domains. JSON is a widely used data format in the financial industry, and MongoDB's BSON extension allows for the storage of additional data types. BIAN is a standard that defines a component business blueprint for banking, and MongoDB's technology supports BIAN and embodies MACH principles. MACH is a set of design principles for component-based architectures, and data domains enable the mapping of business capabilities to applications and data. By using MongoDB's developer data platform, financial institutions can implement flexible and scalable core banking systems that adapt to ever-changing market demands.

Figure 2: MongoDB, the developer data platform for your core banking system.

MongoDB in action: Core banking use cases

Companies such as Temenos and Current have utilized MongoDB's capabilities to deliver innovative services and improve performance. As Tony Coleman, CTO of Temenos, said, "Implementing a good data model is a great start. Implementing a great database technology that uses the data model correctly is vital. MongoDB is a really great fit for banking." MongoDB and Temenos have worked on a number of new, component-based services to enhance the Temenos product family. Financial institutions can embed Temenos components to deliver new functionality in their existing on-premises environments or through a full banking-as-a-service experience with Temenos T365, powered by MongoDB on various cloud platforms.
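As a hypothetical illustration of the data-domain building block (the service domain and collection names below are invented for illustration and are not an official BIAN artifact), each loosely coupled component can resolve its own data domain rather than sharing one monolithic schema:

```python
# Hypothetical mapping of BIAN-style service domains to data domains and
# collections (names invented for illustration, not a BIAN deliverable).
service_domains = {
    "PaymentOrder": {"data_domain": "payments", "collection": "payment_orders"},
    "CustomerOffer": {"data_domain": "customers", "collection": "offers"},
    "FraudDetection": {"data_domain": "risk", "collection": "fraud_events"},
}

def collection_for(service_domain):
    """Resolve which namespace a component reads and writes, keeping each
    loosely coupled component behind its own data domain."""
    entry = service_domains[service_domain]
    return f"{entry['data_domain']}.{entry['collection']}"

print(collection_for("PaymentOrder"))  # prints payments.payment_orders
```

Because components only meet at these named boundaries, one component can be swapped for a best-of-breed alternative without touching the others' data.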
Temenos has a cloud-first, microservices-based infrastructure built with MongoDB, which gives customers flexibility while improving performance. Current is a digital bank that was founded with the aim of providing its customers with a modern, convenient, and user-friendly banking experience. To achieve this, the company needed to build a robust, scalable, and flexible technology platform. Current decided to build its core technology ecosystem in-house, using MongoDB as the underlying database technology. "MongoDB gave us the flexibility to be agile with our data design and iterate quickly," said Trevor Marshall, CTO of Current. In addition, MongoDB's strong security features make it a secure choice for handling sensitive financial data. Overall, MongoDB's capabilities make it a powerful choice for driving innovation and simplifying landscapes in the financial sector.

Conclusion

The financial industry needs to modernize its core banking systems to stay competitive in the face of rising disruptors and new business models. A composable ecosystem, utilizing a developer data platform like MongoDB, offers greater flexibility and adaptability than traditional legacy systems. If you’d like to learn more about how MongoDB can optimize your core banking functionalities, take a look at our white paper: Componentized Core Banking: The next generation of composable banking processes built upon MongoDB.
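To make the document model discussed above concrete, here is a minimal sketch of how a customer account might be stored as a single, flexible JSON document. The field names and values are hypothetical and for illustration only; a real core banking schema would be far richer.

```python
import json

# Hypothetical customer account stored as a single, flexible JSON document.
# A new product (e.g., a buy-now-pay-later facility) can be added as a
# nested field without a schema migration across the whole core system.
account = {
    "accountId": "ACC-1001",
    "customer": {"name": "Jane Doe", "segment": "retail"},
    "currency": "EUR",
    "balance": 2450.75,
    "products": [
        {"type": "current_account", "openedOn": "2021-03-14"},
        {"type": "bnpl", "limit": 500, "openedOn": "2023-01-02"},
    ],
}

# Documents serialize directly to JSON for exchange between components
# of a composable ecosystem (MongoDB stores them internally as BSON).
payload = json.dumps(account)
restored = json.loads(payload)
print(restored["products"][1]["type"])  # → bnpl
```

Because each component reads and writes the same self-describing documents, loosely coupled services can evolve their own fields independently, which is the property the composable-ecosystem approach relies on.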

January 26, 2023

Predictions 2023: Modernization Efforts in the Financial Services Industry

As a global recession looms, banks are facing tough economic conditions in 2023. Lowering costs will be vital for many organizations to remain competitive in a data-intensive and highly regulated environment. Thus, it’s important that any IT investments accelerate digital transformation with innovative technologies that break down data silos, increase operational efficiency, and build personalized customer experiences. Read on to learn about areas in which banks are looking to modernize in 2023 to build better customer experiences at a lower cost and at scale.

Shaping a better banking future with composable designs

With banks eager to modernize and innovate, institutions must move away from the legacy systems that are restricting their ability to show progress. Placing consumers at the center of a banking experience made up of interconnected, yet independent services offers technology-forward banks the chance to reshape their business models and subsequently grow market share and increase profitability. These opportunities have brought to fruition a composable architecture design that allows faster innovation, improves operational efficiency, and creates new revenue streams by extending the portfolio of services and products. Thus, banks are able to adopt the best-of-breed and perfect-fit-for-purpose software available by orchestrating strategic partnerships with relevant fintechs and software providers. This new breed of suppliers can provide everything from know your customer (KYC) services to integrated booking, loan services, or basic marketing and portfolio management functionalities. This approach is more cost efficient for institutions than having to build and maintain the infrastructure themselves, and it is significantly faster in terms of time to market and time to revenue. Banks adopting such an approach are seeing fintechs less as competitors and more as part of an ecosystem to collaborate with to accelerate innovation and reach customers.
Operational efficiency with intelligent automation

Financial institutions will continue to focus on operational efficiency and cost control by automating previously manual, paper-driven processes. Banks have made some progress digitizing and automating what were once almost exclusively paper-based, manual processes. But the primary driver of this transformation has been compliance with local regulations rather than an overarching strategy for really getting to know the client and achieving true customer delight. The market is eager for better automated and data-driven decisions, and legacy systems can’t keep up. Creating the hyper-personalized experiences that customers demand, which include things like chatbots, self-service portals, and digital forensics, is difficult for institutions using outdated technology. And having data infrastructure in silos prohibits any truly integrated modern experience. Using a combination of robotic process automation (RPA), machine learning (ML), and artificial intelligence (AI), financial institutions are able to streamline processes, thereby freeing the workforce to focus on tasks that drive a bigger impact for the customer and business. Institutions must not digitize without considering the human interaction that will be replaced, as customers prefer a hybrid approach. The ability to act on real-time data is the way forward for driving value and transforming customer experiences, and it must be accompanied by the modernization of the underlying data architecture. The prerequisite for this goal is the de-siloing of data and sources into a holistic data landscape: some call it a data mesh, others composable data sources or virtualized data.

Solving ESG data challenges

Along with high inflation, the cost-of-living crisis, energy turmoil, and rising interest rates, environmental, social, and governance (ESG) is also in the spotlight.
There is growing pressure from regulators to provide ESG data and from investors to make sure portfolios are sustainable. The role of ESG data in conducting market analysis, supporting asset allocation and risk management, and providing insights into the long-term sustainability of investments continues to expand. The nature and variability of many ESG metrics is a major challenge facing companies today. Unlike financial datasets that are mostly numerical, ESG metrics can include both quantitative and qualitative data to help investors and other stakeholders understand a company’s actions and intentions. This complexity, coupled with the lack of a universally applicable ESG reporting standard, means institutions must consider different standards with different data requirements. To master ESG reporting, including the integration of relevant KPIs, institutions need appropriate, high-quality data that is at the right level of granularity and covers the required industries and regions. Given the data volume and complexity, financial institutions are building ESG platforms underpinned by modern data platforms that are capable of consolidating different types of data from various providers, creating customized views, modeling data, and performing operations with no barriers.

Digital payments: Unlocking an enriched experience

Pushed by new technologies and global trends, the digital payments market is flourishing globally. With a valuation of more than $68 billion in 2021 and expectations of double-digit growth over the next decade, emerging markets are leading the way in terms of relative expansion. This growth has been driven by pandemic-induced cashless payments, e-commerce, government push, and fintechs. Digital payments are transforming the payments experience.
While it was once enough for payment service providers to supply account information and orchestrate simple transactions, consumers now expect an enriched experience where each transaction offers new insights and value-added services. Meeting these expectations is difficult, especially for companies that rely on outdated technologies that were created long before transactions were carried out with a few taps on a mobile device. To meet the needs of customers, financial institutions are modernizing their payments data infrastructure to create personalized, secure, and real-time payment experiences — all while protecting consumers from fraud. This modernization allows financial institutions to ingest any type of data, launch services more quickly at a lower cost, and have the freedom to run in any environment, from on-premises to multi-cloud.

Security and risk management

Data is critical to every financial institution; it is recognized as a core asset to drive customer growth and innovation. As the need to leverage data efficiently increases, however, according to 57% of decision makers, the legacy technology that still underpins many organizations is too expensive and doesn’t fulfill the requirements of modern applications. Not only is this legacy infrastructure complex, it is unable to meet current security requirements. Given the huge amount of confidential client and customer data that the financial services industry deals with on a daily basis — and the strict regulations surrounding that data — security must be of the highest priority. The perceived value of this data also makes financial services organizations a primary target for data breaches. Fraud protection, risk management, and anti-money laundering are high priorities for any new data platform, according to Forrester’s What’s Driving Next-Generation Data Platform Adoption in Financial Services study.
To meet these challenges, adoption of next-generation data platforms will continue to grow as financial institutions realize these platforms' full potential to manage costs, maximize security, and foster innovation. Download Forrester’s full study — What’s Driving Next-Generation Data Platform Adoption in Financial Services — to learn more.

January 17, 2023

Forrester Study: How IT Decision Makers Are Using Next-Generation Data Platforms

Data is critical to every financial institution; it is recognized as a core asset to drive customer growth and innovation. As the need to leverage data efficiently increases, however, the legacy technology that still underpins many organizations is not built to support today’s requirements. Not only is this infrastructure costly and complex, it doesn’t support the diversity of workloads and functions that modern applications require. To overcome these challenges, organizations are increasingly adopting an integrated data platform that offers a seamless developer experience, runs anywhere, and scales to meet growing business needs. To better understand how such data platforms are being used, MongoDB commissioned Forrester Consulting to survey global IT decision makers at financial services and fintech organizations. In this article, we’ll share findings from the survey to help answer questions such as: What impact are legacy technologies having on financial services? What are the requirements for a data platform? And, for those already adopting next-generation data platforms, what benefits are they experiencing? According to the survey, the majority of decision makers are aware of issues related to legacy technologies: 57% of respondents said that their legacy technology is too expensive and doesn’t fulfill the requirements of modern applications; 50% said legacy technology cannot support the volume, variety, and velocity of transactional data; and 47% noted that their systems landscape struggled to handle the rate of change required to stay up to date with customer expectations. Download the full study: What’s Driving Next-Generation Data Platform Adoption in Financial Services

What is a next-generation data platform?
Within the context of this study, a next-generation data platform is defined as supporting flexible and versatile data models, offering multiple access patterns (e.g., document, relational, graph), and catering to the speed, scale, performance, integration, and security needs of small or large organizations for new development or modernization efforts. All of these features are included in a single platform that delivers real-time, consistent, and trusted data to support a business. Adoption of next-generation data platforms in the financial services and fintech space is already high, with nearly 90% of respondents saying they are already adopting. The benefits are already understood, with 74% of respondents acknowledging not only that there are technology benefits but also that a next-generation data platform frees up teams to focus on innovation and enables faster software builds and iterating at scale (76%).

The key to innovation: What's driving the adoption of next-gen data platforms?

Security and risk management are key use cases

Given the huge amount of confidential client and customer data that the financial services industry deals with on a daily basis — and the strict regulations — security must be of the highest priority. The perceived value of this data also makes financial services organizations a primary target for data breaches. Many organizations are still working to realize the full potential of adopting next-generation data platforms; however, it’s understood that such platforms are the only way to manage cost, maximize security, and continue to innovate. Fraud protection (51%), risk management (46%), and anti-money laundering (46%) are high priorities for any new data platform, according to respondents. And these findings directly correlate with 40% of respondents saying that their current database is unable to meet security requirements.
Multi-cloud is driving investment

Regardless of their size and business mix, most financial institutions have come to understand the benefits of cloud and multi-cloud services. Multi-cloud — the practice of leveraging cloud services from more than one provider — is no longer just a nice-to-have option. Regulators, however, are increasingly focused on cloud concentration risk, as so much of the technology underpinning global financial services relies on so few large cloud services providers. Regulators have so far offered financial institutions warnings and guidance rather than enacting new regulations, although they are increasingly focused on ensuring that the industry is considering contingency plans. An outage or cyberattack at a large public cloud provider, they worry, could derail the global financial system. Decision makers are finding that multiple clouds provide them with lower costs, higher performance, and greater flexibility. This is why, according to the survey, the top driver for investment when adopting next-generation data platforms is multi/hybrid cloud capabilities (49%), followed by scalability (44%).

Improving real-time analytics capabilities

The ability to perform real-time analytics is key for financial institutions, as they need to provide more personalized customer experiences, react more quickly to market trends, and detect and prevent potential threats. With legacy systems, few of these organizations can respond to changes in data minute by minute or second by second. Among survey respondents, real-time analytics was the top feature (54%) that organizations are interested in with regard to next-generation data platforms. With improved analytics capabilities, businesses can analyze any data in place and deliver insights in real time.

Legacy infrastructure is holding organizations back

To remain competitive and build experiences that retain customers, financial institutions need to master their data estate.
Specifically, they need to free themselves from the rigid data architectures associated with legacy mainframes and monolithic enterprise banking applications. Only then can developers build high-quality customer-facing applications rather than maintain legacy systems. High costs and data complexity are the top challenges driving organizations to modernize legacy workloads and unlock business agility. According to 57% of IT decision makers questioned, legacy technology is too expensive and does not fulfill the requirements of modern applications. This correlates with 79% of respondents seeking a data platform that will address multiple workloads — ranging from transactional to analytical — as data continues to expand.

What is the impact?

Financial organizations use next-generation data platforms to replace legacy technologies that fragment and duplicate data and cause internal silos. This change also addresses key needs like reducing costs, lowering complexity, improving customer onboarding, and meeting security requirements. Once in place, a next-generation data platform provides several advantages, including minimizing data inconsistencies (43%), expanding geographical coverage (42%), freeing up resources (40%), and reducing time-to-market for new ideas (37%). Other advantages include eliminating the impact of database downtime for upgrades, migrations, and schema changes. And additional benefits can be seen within the customer and employee experience as they engage with and access information. Based on these benefits, financial services organizations are looking to increase investment in next-generation data platforms by an average of one million dollars or more in the next one to three years. The volume and variety of data that financial services companies must deal with will only increase in the coming years. As such, figuring out how to leverage, protect, and innovate around that data will put organizations in good stead moving forward.
A next-generation data platform can be the key to making this happen.

About the study

MongoDB commissioned Forrester Consulting to conduct a study questioning global IT decision makers at financial services and fintech organizations to evaluate the impact they are experiencing when adopting next-generation data platforms. The study evaluates the benefits, challenges, and barriers of adoption that decision makers are experiencing, as well as the outcomes after adoption. To create this study, Forrester Consulting supplemented its research with custom survey questions asked of database/data platform strategy decision makers in finserv (73%) or fintech (27%) from North America (22%), Europe (39%), and APAC (39%). The organizations questioned had 1,000+ employees. The custom survey was fielded and completed in September 2022. Download the full study — What’s Driving Next-Generation Data Platform Adoption in Financial Services — to learn about organizations’ needs and plans for using next-generation data platforms.

December 13, 2022

4 Ways to Create a Zero Trust Environment in Financial Services

For years, security professionals protected their IT much like medieval guards protected a walled city — they made it as difficult as possible to get inside. Once someone was past the perimeter, however, they had generous access to the riches within. In the financial sector, this would mean access to personal identifiable information (PII), including a “marketable data set” of credit card numbers, names, social security information, and more. Sadly, such breaches have occurred in many cases, adversely affecting end users. A famous example is the Equifax incident, where a single point of compromise led to a massive breach and years of unhappy customers. Since then, the security mindset has changed: as users increasingly access networks and applications from any location, on any device, on platforms hosted in the cloud, the classic point-to-point security approach is obsolete. The perimeter has changed, so reliance on it as a protective barrier has changed as well. Given the huge amount of confidential client and customer data that the financial services industry deals with on a daily basis — and the strict regulations — security needs to be an even higher priority. The perceived value of this data also makes financial services organizations a primary target for data breaches. In this article, we’ll examine a different approach to security, called zero trust, that can better protect your assets.

Paradigm shift

Zero trust presents a new paradigm for cybersecurity. In a zero trust environment, the perimeter is assumed to have been breached; there are no trusted users, and no user or device gains trust simply because of its physical or network location. Every user, device, and connection must be continually verified and audited. Here are four concepts to know about creating a zero trust environment.

1. Securing the data

Although ensuring access to banking apps and online services is vital, the database, which is the backend of these applications, is a key part of creating a zero trust environment.
The database contains much of an organization’s sensitive, and regulated, information, along with data that may not be sensitive but is critical to keeping the organization running. Thus, it is imperative that a database be ready and able to work in a zero trust environment. As more databases become cloud-based services, an important aspect is ensuring that the database is secure by default — meaning it is secure out of the box. This approach takes some of the responsibility for security out of the hands of administrators, because the highest levels of security are in place from the start, without requiring attention from users or administrators. To allow access, users and administrators must proactively make changes — nothing is automatically granted. As more financial institutions embrace the cloud, securing data can get more complicated. Security responsibilities are divided between the clients’ own organization, the cloud providers, and the vendors of the cloud services being used. This approach is known as the shared responsibility model. It moves away from the classic model where IT owns hardening of the servers and security, then hardens the software on top — for example, the version of the database software — and then hardens the actual application code. In this model, the hardware (CPU, network, storage) is solely in the realm of the cloud provider that provisions these systems. The service provider for a Data-as-a-Service model then delivers the database hardened to the client with a designated endpoint. Only then do the actual client team and their application developers and DevOps team come into play for the actual solution. Security and resilience in the cloud are only possible when everyone is clear on their roles and responsibilities.
Shared responsibility recognizes that cloud vendors ensure that their products are secure by default, while still available, but also that organizations take appropriate steps to continue to protect the data they keep in the cloud.

2. Authentication for customers and users

In banks and finance organizations, there is a lot of focus on customer authentication, or making sure that accessing funds is as secure as possible. It’s also important, however, to ensure secure access to the database on the other end. An IT organization can use various methods to allow users to authenticate themselves to a database. Most often, the process includes a username and password. But, given the increased need to maintain the privacy of confidential customer information by financial services organizations, this step should only be viewed as a base layer. At the database layer, it is important to have transport layer security and SCRAM authentication, which enable traffic from clients to the database to be authenticated and encrypted in transit. Passwordless authentication should also be considered — not just for customers, but for internal teams as well. This can be done in multiple ways with the database; for example, auto-generated certificates may be required to access the database. Advanced options exist for organizations already using X.509 certificates that have a certificate management infrastructure.

3. Logging and auditing

In the highly regulated financial industry, it is also important to monitor your zero trust environment to ensure that it remains in force and encompasses your database. The database should be able to log all actions or have functionality to apply filters to capture only specific events, users, or roles. Role-based auditing lets you log and report activities by specific roles, such as userAdmin or dbAdmin, coupled with any roles inherited by each user, rather than having to extract activity for each individual administrator.
This approach makes it easier for organizations to enforce end-to-end operational control and maintain the insight necessary for compliance and reporting.

4. Encryption

With large amounts of valuable data, financial institutions also need to make sure that they are embracing encryption — in flight, at rest, and even in use. Securing data with client-side, field-level encryption allows you to move to managed services in the cloud with greater confidence. The database only works with encrypted fields, and organizations control their own encryption keys, rather than having the database provider manage them. This additional layer of security enforces an even more fine-grained separation of duties between those who use the database and those who administer and manage it. Also, as more data is transmitted and stored in the cloud — some of it highly sensitive workloads — additional technical options to control and limit access to confidential and regulated data are needed. However, this data still needs to be used. So, ensuring that in-use data encryption is part of your zero trust solution is vital. This approach enables organizations to confidently store sensitive data, meeting compliance requirements while also enabling different parts of the business to gain access and insights from it.

Conclusion

In a world where security of data is only becoming more important, financial services organizations rank among those with the most to lose if data gets into the wrong hands. Ditching the perimeter mentality and moving toward zero trust — especially as more cloud and as-a-service offerings are embedded in infrastructure — is the only way to truly protect such valuable assets. Learn more about developing a strategic advantage in financial services. Read the ebook now.
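To illustrate the transport security and SCRAM authentication points above, here is a minimal sketch of assembling a hardened MongoDB connection string. The host and credentials are hypothetical; the URI options shown (tls, authMechanism, authSource) are standard MongoDB connection string options, but a real deployment would also consider certificate-based and passwordless mechanisms as discussed.

```python
from urllib.parse import quote_plus, urlencode

# Hypothetical credentials and host, for illustration only.
user, password, host = "app_user", "s3cret!", "db.example.com"

# tls=true encrypts traffic in transit; SCRAM-SHA-256 authenticates the
# client; authSource names the database holding the user's credentials.
options = {
    "tls": "true",
    "authMechanism": "SCRAM-SHA-256",
    "authSource": "admin",
}
uri = (
    f"mongodb://{quote_plus(user)}:{quote_plus(password)}@{host}:27017/"
    f"?{urlencode(options)}"
)
print(uri)
```

Note that credentials are percent-encoded with quote_plus so that special characters in a password cannot corrupt the URI; in production the secret itself would come from a vault rather than source code.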

September 26, 2022

Open Banking: How to Future-Proof Your Banking Strategy

Open banking is on the minds of many in the fintech industry, leading to basic questions such as: What does it mean for the future? What should we do today to better serve customers who expect native open banking services? How can we align with open banking standards while they’re still evolving? In a recent panel discussion, I spoke with experts in the fintech space: Kieran Hines, senior banking analyst at Celent; Toine Van Beusekom, strategy director at Icon Solutions; and Charith Mendis, industry lead for banking at AWS. We discussed open banking standards, what the push to open banking means for innovation, and more. This article provides an overview of that discussion and offers best practices for getting started with open banking. Watch the panel discussion Open Banking: Future-Proof Your Bank in a World of Changing Data and API Standards to learn how you can future-proof your open banking strategy.

Fundamentals

To start, let’s answer the fundamental question: What is open banking? The central tenet of open banking is that banks should make it easy for consumers to share their financial data with third-party service providers and allow those third parties to initiate transactions on their behalf — adding value along the way. But, as many have realized, facilitating open banking is not so easy. At the heart of the open banking revolution is data — specifically, the infrastructure of databases, data standards, and open APIs that makes the free flow of data between banks, third-party service providers, and consumers possible. What does this practice mean for the banking industry? In the past, banks almost exclusively built their own products, which has always been a huge drain on teams, budgets, and infrastructure. With open banking, financial services institutions are now partnering with third-party vendors to distribute products, and many regulations have already emerged to dictate how data is shared.
Because open banking is uncharted territory, it presents an array of challenges — mostly regulatory — as well as opportunities for both established banks and disruptors to the space. Let’s dig into the challenges first.

Challenges

As open banking, and the technology practices that go along with it, evolve, related compliance standards are emerging and evolving as well. If you search for “open banking API,” you’ll find that nearly every vendor has their own take on open banking and that they are all incompatible to boot. As with any developing standard, open banking standards are not set in stone and will continue to evolve as the space grows. The fast-changing environment will hinder those banks that do not have a flexible data architecture that allows them to quickly adapt to provider standards as needed. An inflexible data architecture becomes an immediate roadblock with unforeseen consequences. Closely tied to the challenge of maintaining compliance with emerging regulations is the challenge that comes with legacy architecture. Established banks deliver genuine value to customers through time-proven, well-worn processes. In many ways, however, legacy operations and the technology that underpins them are doomed to stand in the way not only of open banking but also of operational efficiency goals and the ability to meet the customer experience expectations of a digital-native consumer base. To avoid the slowdown of clunky legacy systems, banks need an agile approach that ensures the flexibility to pivot to developing challenges.

Opportunities

The biggest opportunity for institutions transitioning into open banking is the potential for rapid innovation. Banking IP is headed in new and unprecedented directions. Pushing data to the cloud, untangling spaghetti architecture, or decentralizing your data by building a data mesh frees up your development teams to innovate, tap into new revenue streams, and achieve the ultimate goal: providing greater value to your customers.
As capital becomes scarce in banks, the ability to repeatedly invest in new pilots is limited. Instead of investing months' or years' worth of capital into an experiment, building new features from scratch, or going to the board to secure funding, banks need to succeed immediately, be able to scale from prototype to global operation within weeks, or fail fast with new technology. Without the limiting factors of legacy software or low levels of capital, experimentation powered by new data solutions is now both inexpensive and low risk.

Best Practices

Now that we’ve described the potential that open banking presents for established and emerging industry leaders, let’s look at some open banking best practices, as described in the panel discussion.

Start with your strategy. What’s your open banking strategy in the context of your business strategy? Ask hard questions like: Why do you want to transform? What’s wrong with what’s going on now? How can you fix current operations to better facilitate open banking? What new solutions do you need to make this possible? An entire shift for a business to open banking means an entirely new business strategy, and you need to determine what that strategy entails before you implement sweeping changes.

View standards as accelerators, not inhibitors. Standards can seem like a burden on financial institutions, and in most cases, they do dictate change that can be resource intensive. But you can also view changing regulations as the catalyst needed to modernize. While evolving regulations may be the impetus for change, they can also open up new opportunities once you’re aligned with industry standards.

Simplify and unify your data. Right now, your data likely lives all over the place, especially if you’re an established bank. Legacy architectures and disparate solutions slow down and complicate the flow of data, which in turn inhibits your adoption of open banking standards.
Consider how you can simplify your data by reducing the number of places it lives. Migrating to a platform like MongoDB makes it faster and easier to move data from your financial institution to third parties and back again.

Always consider scale. When it comes to open banking, your ability to scale up and scale down is crucial — and is also tied to your ability to experiment, which is also critical. Consider the example of “buy now, pay later” service offerings to your clients. On Black Friday, the biggest shopping day of the year, financial institutions will do exponentially more business than, say, a regular Tuesday in April. So, to meet consumer demand, your payments architecture needs to be able to scale up to meet the influx of demand on a single, exceptional day and scale back down on a normal day to minimize costs. Without the ability to scale, you may struggle to meet the expectations of customers.

Strive for real time. Today, everyone — from customers to business owners to developers — expects the benefits of real-time data. Customers want to see their exact account balance when they want to see it, which is already challenging enough. If you add the new layer of open banking to the mix, with data constantly flowing from banks to third parties and back, delivering data in real time to customers is more complex than ever. That said, with the right data platform underpinning operations, the flow of data between systems can be simplified and made even easier when your data is unified on a single platform. If you can unlock the potential of open banking, you can innovate, tap into new revenue streams, shake off the burden of legacy architecture, and ultimately achieve a level of differentiation likely to bring in new customers. Watch the panel discussion to learn more about open banking and what it means for the future of banks.
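As a toy illustration of the data-sharing idea at the heart of open banking, the sketch below filters an internal account record down to only the fields a customer has consented to share with a third party. The field names and consent scopes are hypothetical and are not taken from any published open banking standard.

```python
# Toy sketch: expose only the fields a customer has consented to share
# with a third-party provider. Scopes and field names are hypothetical.
SCOPE_FIELDS = {
    "accounts:balance": {"accountId", "currency", "balance"},
    "accounts:transactions": {"accountId", "transactions"},
}

def build_api_response(record: dict, granted_scopes: list) -> dict:
    """Return only the consented subset of an internal account record."""
    allowed = set()
    for scope in granted_scopes:
        allowed |= SCOPE_FIELDS.get(scope, set())
    return {k: v for k, v in record.items() if k in allowed}

internal_record = {
    "accountId": "ACC-1001",
    "currency": "GBP",
    "balance": 120.50,
    "internalRiskScore": 0.93,   # never shared externally
    "transactions": [{"amount": -12.0, "merchant": "Coffee Shop"}],
}

response = build_api_response(internal_record, ["accounts:balance"])
print(sorted(response))  # → ['accountId', 'balance', 'currency']
```

The design point this illustrates is the one made above: when data is unified on a single platform, the consented view for a third party is a simple projection rather than a cross-system integration project.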

May 19, 2022

From Core Banking to Componentized Banking: Temenos Transact Benchmark with MongoDB

Banking used to be a somewhat staid, hyper-conservative industry, seemingly evolving over eons. But banking in recent years has changed dramatically. Under pressure from demanding consumers and nimble new competitors, development cycles measured in years are no longer sufficient in a market expecting new products, such as Buy-Now-Pay-Later, to be introduced within months or even weeks. Just ask Temenos, the world's largest financial services application provider, whose software supports banking for more than 1.2 billion people. Temenos is leading the way in banking software innovation and offers a seamless experience for its client community. Financial institutions can embed Temenos components, which deliver new functionality, in their existing on-premises environments (or in their own cloud deployments), or consume a full banking-as-a-service experience with Temenos T365 powered by MongoDB on various cloud platforms. Temenos embraces a cloud-first, microservices-based infrastructure built with MongoDB, giving customers flexibility while also delivering significant performance improvements. This new MongoDB-based infrastructure enables Temenos to rapidly innovate on its customers' behalf, while improving security, performance, and scalability.

Architecting for a better banking future

Banking solutions often have a life cycle of 10 or more years, and some systems I am involved in upgrading date back to the 1980s. Upgrades and changes, often focused on regulatory or technical matters (for example, operating system versions), hardware refreshes, and new functionality, are bolted on. The fast pace of innovation, a mobile-first world, competition, crypto, and DeFi are demanding a massive change for the banking industry, too. Defining new products and rolling them out in weeks and months rather than years requires an equally drastic change in technology adoption. Banking is following a path similar to the retail industry.
Retail was built upon a static design approach: monolithic applications connected through ETL (Extract, Transform, and Load) and the “unloading of data,” an approach that was robust and built for its times. The accelerated move to omnichannel requirements brought a component-driven architecture design to fruition, allowing faster innovation and fit-for-purpose components to be added to (or discarded from) a solution. The codification of this is called MACH (Microservices, API-first, Cloud-native, and Headless), and a great example is the flexibility brought to bear through companies such as Commercetools. Temenos is taking the same direction for banking. Its concept of components that are seamlessly added to existing Temenos Transact implementations empowers banks to start an evolutionary journey from their existing on-premises environments to a flexible hybrid landscape delivering best-of-breed banking experiences. Key to this journey is a flexible data concept that meshes the existing environments with the requirements of fast-changing components available on premises and in the cloud. Temenos and MongoDB joined forces in 2019 to investigate the path toward data in a componentized world. Over the past few years, our teams have collaborated on a number of new, innovative component services to enhance the Temenos product family, and several banking clients are now using those components in production. Importantly, the approach we've taken allows banks to upgrade on their own terms. By putting components “in front” of the Temenos Transact platform, banks can start using a componentization solution without disrupting their ability to serve existing customer requirements. Similarly, Temenos offers MongoDB's critical data infrastructure with an array of deployment capabilities, from full-service multi-cloud or hybrid cloud offerings to on-premises self-managed, depending on local regulations and the client’s risk appetite.
In these and other ways, Temenos makes it easier for its banking clients to embrace the future without upsetting existing investments. From an architectural perspective, the diagram below shows how component services utilize the new event system of Temenos Transact and enable a new way of operating:

[Diagram: Temenos Transact optimized with MongoDB]

Improved performance and scale

All of which may sound great, but you may still be wondering whether this combination of MongoDB and Temenos Transact can deliver the high throughput needed by Tier 1 banks. Based on extensive testing and benchmarking, the answer is a resounding yes. Having been in the benchmark business for a long time, I know that you should never trust just ANY benchmark. (In fact, my colleague, MongoDB distinguished engineer John Page, wrote a great blog post about how to benchmark a database.) But Temenos, MongoDB, and AWS jointly felt the need to remove this nagging itch and deliver a true statement on performance, providing proof of a superior solution for the client community. Starting with the goal of reaching a throughput of 25,000 transactions per second, it quickly became obvious that this rather conservative goal could easily be smashed, so we decided to quadruple the number to 100,000 transactions per second using a more elaborate environment. The newly improved version of Temenos Transact in conjunction with component services proved to be a performance giant. One hundred thousand financial transactions per second with a MongoDB response time under 1ms was a major milestone compared with earlier benchmarks, such as a 79ms response time with Oracle. Naturally, this result is in large part due to the improved component behavior and the AWS Lambda functions that now run the business functionality, but the document model of MongoDB in conjunction with the idiomatic driver concept has proven superior to the outdated relational engine of the legacy systems. Below, I have included some details from the benchmark.
As Page once said, “You should never accept single benchmark numbers at face value without knowing the exact environment they were achieved in.”

Configuration:

  J-meter scripts:           3
  Balance services:          6 (GetBalance: 4, GetTransactions: 2)
  Transact services:         4
  MongoDB Atlas cluster:     M80 (2TB)
  Documents in Balance:      110M
  Documents in Transaction:  200M

Test results:

  Functional         TPS       API Latency (ms)   DB Latency (ms)
  Get Balance        46,751    79.45              0.36
  Get Transaction    22,340    16.58              0.36
  Transact Service   31,702    117.15             1.07
  Total              100,793   71.067             0.715

The underlying environment consists of 200 million accounts and 100 million customers, which shows the scale the configuration is capable of handling. This setup would be suitable for the largest Tier 1 banking organizations. The well-versed MongoDB user will notice that the cluster configuration used for MongoDB is small. The M80 cluster, with 32 vCPUs and 128GB RAM, is configured with 5 nodes. Many banking clients prefer these larger 5-node configurations for higher availability protection and better read distribution over multiple AWS Availability Zones and regions, which would improve performance even further. In the case of an Availability Zone outage, or even a regional outage, the MongoDB Atlas platform will continue to serve traffic via the additional region as backup. The low latency shows that the MongoDB Atlas M80 was not even fully utilized during the benchmark. The diagram shows a typical configuration for such a cluster setup for the American market: one East Coast location, one West Coast location, and an additional node outside both regions, in Canada. MongoDB Atlas allows the creation of such a cluster within seconds, configured to the specific requirements of the solution deployed. The total landscape is shown in the following diagram:

Signed, sealed, and delivered.
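As a quick sanity check, the per-service throughput figures in the results table do add up to the reported total (the latency columns are averages, so they are not expected to sum):

```python
# Per-service throughput (TPS) from the benchmark results table.
tps = {
    "Get Balance": 46_751,
    "Get Transaction": 22_340,
    "Transact Service": 31_702,
}

total_tps = sum(tps.values())
print(total_tps)  # 100793, matching the "Total" row
assert total_tps == 100_793
```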
This benchmark should give clients peace of mind that the combination of core banking with Temenos Transact and MongoDB is indeed ready for prime time. While thousands of banks rely on MongoDB for many parts of their operations, ranging from login management and online banking to risk and treasury management systems, Temenos' adoption of MongoDB is a milestone. It shows that there is significant value in moving from legacy database technology to MongoDB, allowing faster innovation, eliminating technical debt along the way, and simplifying the landscape for financial institutions, their software vendors, and service providers. If you would like to learn more about MongoDB in the financial services industry, take a look at our guide: The Road to Smart Banking: A Guide to Moving from Mainframe to Data Mesh and Data-as-a-Product.

May 18, 2022

Finance, Multi-Cloud, and The Elimination of Cloud Concentration Risk

Regardless of their size and business mix, most financial institutions have come to understand how cloud and multi-cloud computing services can benefit them. There is the cost-effective flexibility to scale, deploy new services, and innovate to stay aligned with rapidly changing customer expectations. There are security and resiliency benefits that can be difficult and expensive to replicate on-premises, especially for smaller institutions trying to keep pace with rapidly changing standards. And there is geographic access to new markets – from China to Canada – that require the deployment of local, in-country systems under emerging sovereignty laws. As the industry continues to embrace cloud services, regulators are becoming more aware of the challenges associated with cloud computing, especially those that could expose financial institutions to systemic risks potentially undermining the stability of the financial system. Oversight bodies such as the Financial Stability Board (FSB) and the European Banking Authority have urged regulators worldwide to review their supervisory frameworks to ensure that different types of cloud computing activities are fully scoped into industry guidelines. At the same time, public cloud provider outages have disproved the “never fail” paradigm, and there are growing calls for heightened diligence around cybersecurity risks. Regulators are increasingly focused on cloud concentration risk, or the potential peril created when so much of the technology underpinning global financial services relies on so few large cloud services providers. An outage or cyberattack, they worry, could derail the global financial system. This article tackles cloud concentration risk for financial services firms, examining how that risk came to be and how multi-cloud can be used to navigate it and prepare for future regulations.

Part 1: What is cloud concentration risk for financial services?
Part 2: Why financial services are evolving from hybrid to multi-cloud
Part 3: Solve cloud concentration risk with cross-cloud redundancy
Part 4: The limits of a single-vendor public cloud solution
Part 5: Commercial and technical benefits of multi-cloud for financial services

Part 1: What is cloud concentration risk for financial services?

The concern over infrastructure concentration and consolidation is twofold. First is the systemic risk of having too many of the world’s banking services concentrated on so few public cloud platforms. Historically, this problem did not exist, as each bank operated its own on-premises infrastructure; a failure in a data center was always limited to a single player in the market. Second is the vulnerability of individual institutions, including many smaller institutions, that outsource critical banking infrastructure and services to a few solution providers. These software-as-a-service “hyperscalers” also tend to run on a single cloud platform, creating cascading problems across thousands of institutions in the event of an outage. In both cases, performance, availability, and security-related concerns are motivating regulators who fear that a provider outage, caused either internally or by bad external actors, could cripple the financial systems under their authority. Such a service shock is much more than a hypothetical worry. In October 2021, Facebook suffered a huge global outage: more than 3.5 billion people who rely on the social network’s applications were without service for more than five hours after Facebook made changes to a single server component that coordinates its data center traffic. Like Facebook, the big three cloud service providers (CSPs), Microsoft Azure, AWS, and Google Cloud, have all suffered similar outages in recent years. For financial services companies, the stakes of a service interruption at a single CSP rise exponentially as they begin to run more of their critical functions in the public cloud.
Regulators have so far offered financial institutions warnings and guidance rather than enacting new regulations, though they are increasingly focused on ensuring that the industry is considering plans, such as “cloud exit strategies,” to mitigate the risk of service interruptions and their knock-on effects across the financial system. The FSB first raised formal public concern about cloud concentration risk in an advisory published in 2019, and has since sought industry and public input to inform a policy approach. In June 2021, the Monetary Authority of Singapore issued a sweeping advisory on financial institutions’ cybersecurity risks related to cloud adoption. Meanwhile, authorities are exploring expanding regulations, which could mean action as early as 2022. The European Commission has published a legislative proposal on Digital Operational Resilience aimed at harmonizing existing digital governance rules in financial services including testing, information sharing, and information risk management standards. The European Securities & Markets Authority warned in September 2021 of the risks of “high concentration” in cloud computing services providers, suggesting that “requirements may need to be mandated” to ensure resiliency at firms and across the system. Likewise, the Bank of England’s Financial Policy Committee said it believes additional measures are needed “to mitigate the financial stability risks stemming from concentration in the provision of some third-party services.” Those measures could include the designation of certain third-party service providers as “critical,” introducing new oversight to public cloud providers; the establishment of resilience standards; and regular resilience testing. They are also exploring controls over employment and sub-contractors, much like energy and public utility companies do today. 
Hoping to get out ahead of regulators, the financial services industry and the hyperscalers are taking steps to address the underlying issues.

Part 2: Why financial services are evolving from hybrid to multi-cloud

Looking at the existing banking ecosystem, a full embrace of the cloud is extremely rare. While they would like to be able to act like challenger and neo banks, many of the largest and most technology-forward established banks and financial services firms have adopted a hybrid cloud architecture – linking on-premises data centers to cloud-based services – as the backbone of an overarching enterprise strategy. Smaller regional and national institutions, while not officially adopting a cloud-centric mindset, are beginning to explore the advantages of cloud services by working with cloud-based SaaS providers through their existing ISVs and systems integrators. Typically, financial institutions already pair multiple external cloud providers with on-premises infrastructure in an enterprise-wide hybrid cloud approach to IT. In these scenarios, some functions are executed in legacy, on-premises data centers while others, such as mobile banking or payment processing, are operated out of cloud environments, bringing the benefits of speed and scalability. Moving to a hybrid approach has itself been an evolution. At first, financial institutions put non-core applications in a single public cloud provider to trial its capabilities. These included non-core systems running customer-facing websites and mobile apps, as well as new digital, data, and analytics capabilities. Some pursued deployments on multiple cloud vendors to handle different tasks, while maintaining robust on-premises primary systems, both to pair with public cloud deployments and to power core services. At MongoDB, we’re increasingly seeing customers, including many financial services companies, run independent workloads on different clouds.
However, we believe the real power of multi-cloud applications is yet to be realized. While a hybrid approach utilizing one or two separate cloud providers works for now, the next logical step (already taken by many fintech startups) is to fully embrace the cloud and, eventually, a multi-cloud approach, moving away from on-premises infrastructure entirely. Take Wells Fargo: the US-based bank recently announced a two-provider cloud infrastructure and data center strategy, adding that its long-term aspiration is to run most of its services in the public cloud, with an end goal of operating agnostically across providers and free of its own data centers.

Are you really multi-cloud?

Many large financial institutions will say they are already multi-cloud. For most, that means a hybrid cloud approach, using one or more public cloud service providers to handle distinct workloads while maintaining mission-critical services on-premises. In a hybrid cloud deployment, both public cloud and private, on-premises infrastructure function as a single unit, with orchestration tools used to deploy and manage workloads between the two components. In recent years, the line between the two cloud types has blurred, with significant advances in the strategy known as hybrid multi-cloud: “hybrid” referring to the presence of a private cloud in the mix, and “multi-cloud” indicating more than one public cloud from more than one service provider. As enterprises increasingly move in this direction, the hybrid multi-cloud (also known simply as hybrid cloud) looks set to become the predominant IT environment, at least for larger organizations.
The hybrid approach can be seen as a step on the way to harnessing the true potential of a multi-cloud deployment, where data and applications are distributed across multiple CSPs simultaneously, giving financial services firms the ability to:

Use data from an application running in one cloud and analyze that data on another cloud, without manually managing data movement
Use data stored in different clouds to power a single application
Easily migrate an application from one cloud provider to another

With multi-cloud clusters on MongoDB Atlas, data and applications are free to move across multiple clouds with ease. For financial services firms, the multi-cloud journey is one worth serious consideration, both because it holds the potential to increase performance and meet customer expectations, and because it reduces the risk of relying on a single cloud vendor.

Part 3: Solve cloud concentration risk with cross-cloud redundancy

For an industry as tightly regulated and controlled as financial services, and with so much sensitive data being moved and stored, security and resilience are critical considerations. Recent service disruptions at the top public cloud providers remind us that no matter how many data centers they run, single cloud providers remain vulnerable to weaknesses created by their own network complexity and interconnectivity across sites. One might argue that even a single cloud provider has better uptime statistics than an on-premises solution, but recent outages highlight the need for operational agility, given the high availability and performance requirements of critical applications. When an institution relies on a single provider for cloud services, it exposes its business to the risk of potential service shocks originating from that organization’s technical dependencies, cyberattacks, and vulnerabilities to natural disasters or even freak accidents.
Cross-cloud redundancy solves cloud concentration risk

Cloud disruptions vary in severity, from temporary capacity constraints to full-blown outages, and financial services companies need to mitigate as much risk as possible. By distributing data across multiple clouds, they can improve high availability and application resiliency without sacrificing latency. With multi-cloud clusters on MongoDB Atlas, financial services firms are able to distribute their data in a single cluster across Azure, AWS, and Google Cloud. MongoDB Atlas extends the number of locations available by allowing users to choose from over 80 regions across the major CSPs – the widest selection of any cloud database on the market. This is particularly relevant for financial services firms that must comply with data sovereignty requirements but have limited deployment options due to sparse regional coverage on their primary cloud provider. In some cases, only one in-country region is available, leaving users especially vulnerable to disruptions in cloud service. For example, AWS has only one region in Canada and Google Cloud has two. With multi-cloud clusters, organizations can take advantage of all three of those regions, and add additional nodes in the Azure Toronto and Quebec City regions for extra fault tolerance. Several MongoDB customers in the financial services sector have already taken steps toward a true multi-cloud approach by building nodes in a second CSP using MongoDB Atlas. These customers typically use a 5-and-1 architecture, with one CSP as the primary, majority provider, coupled with a secondary backup CSP. In this scenario, the primary CSP runs most of the operations the bank or financial institution needs for a specific solution, e.g., mobile banking, with the second CSP used for disaster recovery and regulatory compliance in case the first provider has a major outage or service interruption.
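To make the 5-and-1 pattern concrete, here is a minimal sketch of such a topology. The shape loosely echoes the region-configuration structure of Atlas multi-cloud clusters, but the providers, regions, node roles, and field names chosen here are illustrative assumptions, not a verbatim API payload:

```python
# Hypothetical 5-and-1 multi-cloud topology: five electable nodes on the
# primary CSP and one read-only disaster-recovery node on a secondary CSP.
# Providers, regions, and field names are illustrative assumptions.
region_configs = [
    {"provider": "AWS",   "region": "US_EAST_1",      "priority": 7,
     "electable_nodes": 3, "read_only_nodes": 0},
    {"provider": "AWS",   "region": "US_WEST_2",      "priority": 6,
     "electable_nodes": 2, "read_only_nodes": 0},
    {"provider": "AZURE", "region": "CANADA_CENTRAL", "priority": 0,
     "electable_nodes": 0, "read_only_nodes": 1},  # DR / compliance copy
]

electable = sum(r["electable_nodes"] for r in region_configs)
read_only = sum(r["read_only_nodes"] for r in region_configs)

# An odd number of voting members keeps elections decisive, while the
# single secondary-CSP node provides the cross-cloud backup copy.
assert electable == 5 and electable % 2 == 1
assert read_only == 1
```

The key design point is that the secondary CSP holds a full copy of the data but no voting majority, so the cluster's day-to-day operation stays with the primary provider.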
Often this secondary CSP also acts as a primary for other services at the firm.

How Bendigo and Adelaide Bank Simplified Their Architecture and Reached for the Cloud

Bendigo and Adelaide Bank, one of Australia’s largest banks, is planning for a multi-cloud future. “As we work to accelerate the transformation of our business, we believe the benefits of cloud will help our business systems by reducing disruption, improving velocity and consistency, and enhancing our risk and vulnerability management position,” said Ash Austin, Bendigo and Adelaide Bank’s cloud platforms service owner. For simplification and cloud centricity, MongoDB Atlas, MongoDB’s cloud database service, was a logical next step. “The fact that MongoDB Atlas supported the three major hyperscalers [Google Cloud, AWS, Azure] helped with portability and supports a multi-cloud future for us,” added Dan Corboy, a cloud engineer at Bendigo and Adelaide Bank. “It made it really easy for us to choose MongoDB because we didn’t have to then hedge our bets on a particular cloud provider or a particular process – we could be flexible.”

Part 4: The limits of a single-vendor public cloud solution

In Part 1, we explored the evolution of cloud adoption in the financial services sector and the growing attention on the infrastructure concentration risk created by hybrid cloud approaches incorporating only one or two isolated or loosely connected public cloud service providers. Beyond the looming regulatory issues, there are a number of practical business and technology limitations of a single-cloud approach that the industry must address to truly future-proof its infrastructure. Drawbacks to a single-cloud or hybrid approach include:

Geographic constraints

Not all cloud service providers operate in every business region.
Choosing a provider that satisfies today’s location needs seems sensible now, but could prove limiting in the future if an organization expands into new geographies that are underserved by its chosen cloud service provider. A multi-cloud strategy extends the geographic availability of data centers to the longer list of countries served by all the major providers. The availability of local cloud solutions grows increasingly important as more countries adopt data sovereignty and residency laws designed to govern how data is collected, stored, and used locally. Sovereignty rules mandate that data collected and stored within a country be subject to the laws, regulations, and best practices for data collection of that country. Data residency laws require that data about a country’s citizens be collected and stored inside the country, regardless of whether it ultimately gets replicated and sent abroad. For global financial services companies, this creates thorny technical, operational, and legal issues, and addressing them holistically through a single cloud provider is nearly impossible. The topic continues to draw the attention of lawmakers around the world, beyond the handful of countries, such as Russia and Canada, that drove initial action around these policies. The European Union, for one, is actively scoping a unified EU sovereignty policy and action plan to address its growing concerns about control over its data. Following the success of the General Data Protection Regulation, the Digital Markets Act is set to further shape data policy and regulation in the region.

Vendor lock-in

Aside from the technical risks of working with a single cloud provider, there is also commercial risk in placing all of an institution’s bets on one cloud provider.
The more integrated an institution’s applications are within a single cloud provider, and the more it relies on the third-party services of that single provider, the harder it becomes to negotiate the cost of cloud services or to consider switching to another provider. Over time, as services are customized and adapted to a single cloud provider's protocols and data structures, it becomes operationally challenging to migrate to a different cloud environment. The more intertwined a company’s technical architecture is with a single cloud provider, the more difficult it is to design an exit strategy without putting the business at risk of performance lags, heavy “un-customization” work, or price gouging. By locking in, institutions also lose the power to influence service quality should the vendor change the focus of its development, become less competitive, or run into operational problems. Eventually, innovation at the financial services firm slows to the speed of the chosen CSP. Even integrating external apps and services becomes a challenge, reminiscent of the monolithic architecture the new cloud environment was meant to replace.

Multi-cloud and a robust exit strategy

In addition to data portability and high availability, multi-cloud clusters on MongoDB Atlas offer financial services companies a robust set of viable exit strategies when moving workloads to the cloud. While other database services lock clients tightly to one cloud provider and provide little to no leeway to quickly terminate a commercial relationship, MongoDB Atlas can transition database workloads, with zero downtime, from one cloud provider to another. An exit can be made without requiring any application changes, bringing peace of mind for financial services companies planning business continuity and cloud exit scenarios in which either a non-stressed or stressed exit from a cloud vendor might be required.
Security homogeneity

Cloud service providers invest heavily in security features and are generally considered among the most sophisticated leaders in cybersecurity. They proactively manage threats through security measures deployed across customer connection points. For financial services, top cloud providers offer enhanced security to meet strict governance requirements. From a risk standpoint, monitoring and securing a single-cloud hybrid deployment is easier than managing threats across multiple clouds. From the perspective of a threat surface, a single cloud poses fewer risks because there are fewer pathways for would-be attackers. The challenge, though, is responding to an event in a single-cloud environment should an incident, intentional or otherwise, occur. In the event of an infrastructure meltdown or cyberattack, a multi-cloud environment gives organizations the ability to switch providers and to back up and protect their data.

Feature limitations

Cloud service providers develop new features asynchronously. Some excel in specific areas of functionality and constantly innovate, while others focus on a different set of core capabilities: Google Cloud’s AI Platform, for instance, Microsoft Azure’s Cognitive Services, and the AWS Lambda platform, which enables serverless, event-driven computing. By restricting deployments to one cloud services provider, institutions limit their access to best-of-breed features across the cloud. They are locked into using whatever is available on their platform, rather than being able to tap into advances across clouds. Over time, this can limit innovation and put organizations at a competitive disadvantage.
Part 5: Commercial and technical benefits of multi-cloud for financial services

As the financial services industry accelerates its cloud-first mindset, more institutions find that a multi-cloud strategy can better position them to meet the rapidly changing commercial, technical, and compliance demands on their business. What’s more, a fully formed multi-cloud strategy provides an opportunity to partner with the most sophisticated and well-resourced service providers, and to benefit from leading-edge innovation from all of them. The recognition that a single cloud provider is not only limiting but may be a hindrance is dawning on the leadership of many banks. As the CEO of one large investment bank told MongoDB, “Multi-cloud is an opportunity for us to unlock the full value of each location, not water things down with abstractions and accept the lowest common denominator.” In addition to facilitating access to leading-edge innovations, a multi-cloud approach offers financial services firms several further benefits.

Optimize performance

Rock-solid service availability and responsiveness are the cornerstones of performance planning in financial services. The goal of any architecture design is to limit downtime and minimize application lag while aligning processing resources to the specific needs of each application. While even single cloud providers log higher uptime than most on-premises solutions involving multiple data centers, a multi-cloud architecture offers the additional resiliency and flexibility to meet internal and client performance SLAs that previously only mainframe technology (so-called Sysplex clusters) could achieve, with 99.9999% availability. In a multi-cloud environment, institutions can dynamically shift workloads among cloud providers to speed up tasks, respond to service disruptions, reduce latency by serving traffic locally, and address regulatory concerns about single-provider vulnerability.
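To put availability figures like the 99.9999% cited above into concrete terms, an SLA percentage maps directly to an annual downtime budget:

```python
# Translate an availability percentage into an annual downtime budget.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31,557,600 seconds

def downtime_seconds_per_year(availability_pct: float) -> float:
    return (1 - availability_pct / 100) * SECONDS_PER_YEAR

# "Six nines" (99.9999%) leaves roughly half a minute of downtime per year,
print(round(downtime_seconds_per_year(99.9999), 1))
# while "four nines" (99.99%) allows nearly an hour per year.
print(round(downtime_seconds_per_year(99.99) / 60, 1))
```

The difference of two nines is roughly a factor of one hundred in allowed downtime, which is why the gap between a typical cloud SLA and mainframe-class availability matters so much to Tier 1 institutions.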
Optimizing for all of these factors yields the best customer experience and the most efficient and cost-effective approach to infrastructure.

Scale dynamically for task and geography

Scalability and locality are critical. Increasingly, customer demands on product experience are pushing financial services providers to meet new requirements that can sometimes be best delivered through geographic scaling and proximity to the end user. It’s no longer just about who has the greatest amount of storage or the fastest CPU; it may mean maximizing application responsiveness by running computing resources close to the end user. This is only becoming more relevant with the roll-out of 5G edge services and the growth in real-time edge computing it requires. Access to multiple clouds creates opportunities to dynamically balance task execution locally for maximum efficiency across geographies, be that California, New York, or Singapore. It also enables institutions to scale storage requirements up and down across providers based on need and cost. In a fast-paced commercial environment, financial institutions can quickly deploy applications at scale in the cloud. By running in multiple clouds, financial institutions have the opportunity to arbitrage cost and performance without compromising their business strategy.

Adapt to business changes

Financial services companies can stay nimble by building flexible multi-cloud capabilities that enable them to adapt quickly to new regulatory, competitive, and financial conditions. This is as true for challenger banks such as Illimity or Current as it is for established institutions such as Macquarie or NETS. An effective multi-cloud strategy can be a solution to managing regulatory, compliance, and internal policy changes by replacing a patchwork of solutions with a common framework across cloud providers.
The ability to move seamlessly among cloud providers lets institutions quickly address situations such as new data sovereignty laws or a merger by shifting workloads to a more advantageous provider.

Avoid vendor lock-in

With IT costs continuing to grow as a proportion of overall spending, a multi-cloud strategy can help institutions better manage technology outlays to third-party providers by helping them avoid vendor lock-in. Not all services are designed equally, and switching services between providers can have a multi-million-dollar impact on cloud provider bills. In any industry, overreliance on one supplier creates financial and operating risks. The more interconnected, or "sticky," a single-cloud solution becomes, the more challenging it is to unwind should it no longer meet the institution's needs. And by concentrating services with one provider, companies risk losing the financial leverage to negotiate contract terms. By taking a multi-cloud approach, institutions can choose among providers competitively, without being locked in either commercially or by a technical dependency. A multi-cloud approach also allows financial institutions to push harder on providers to develop for their particular needs.

Harness innovative features

The ability to tap into cloud capabilities such as artificial intelligence and machine learning is a major benefit of working with cloud service providers. Through a multi-cloud approach, developers can select features from across cloud providers and deploy the technical building blocks that best suit their needs. They can run their workloads using different tools on the same data set, without manual data replication. That means institutions can access popular services such as AWS Lambda, Google Tensorflow Cloud AI, and Azure Cognitive Services without cumbersome data migrations.
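To make the idea of a single data set spanning providers concrete, here is a minimal sketch of what a multi-cloud cluster definition can look like. It is modeled loosely on the MongoDB Atlas Admin API's cluster-creation payload, but the field names and region choices here are illustrative assumptions, not a definitive reference; consult the current Atlas documentation before using anything like this.

```python
# Illustrative sketch of a multi-cloud cluster definition, loosely modeled on
# the MongoDB Atlas Admin API cluster payload. Field names and regions are
# assumptions for illustration only.

multi_cloud_cluster = {
    "name": "fin-services-cluster",
    "clusterType": "REPLICASET",
    "replicationSpecs": [
        {
            "regionConfigs": [
                {   # primary region: highest priority is elected primary first
                    "providerName": "AWS",
                    "regionName": "US_EAST_1",
                    "priority": 7,
                    "electableSpecs": {"instanceSize": "M30", "nodeCount": 3},
                },
                {   # second provider for cross-cloud redundancy
                    "providerName": "GCP",
                    "regionName": "CENTRAL_US",
                    "priority": 6,
                    "electableSpecs": {"instanceSize": "M30", "nodeCount": 2},
                },
                {   # third provider: read-only nodes to serve traffic locally
                    "providerName": "AZURE",
                    "regionName": "US_EAST_2",
                    "priority": 0,
                    "readOnlySpecs": {"instanceSize": "M30", "nodeCount": 2},
                },
            ]
        }
    ],
}

# An odd number of electable (voting) nodes keeps replica-set elections clean.
electable = sum(
    rc.get("electableSpecs", {}).get("nodeCount", 0)
    for spec in multi_cloud_cluster["replicationSpecs"]
    for rc in spec["regionConfigs"]
)
print(electable)  # 5
```

The design point is that failover between providers is handled by the replica set itself: if the AWS region becomes unreachable, the remaining GCP voting nodes can still elect a primary, while the Azure read-only nodes continue serving local reads.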
As consumers increasingly demand premium product experiences from financial services institutions, those institutions can gain competitive advantages by deploying best-of-breed applications into user services.

Looking to learn more about how you can build a multi-cloud strategy, or what MongoDB can do for financial services? Take a look at the following resources: Get started with multi-cloud clusters on MongoDB Atlas. How to create a multi-cloud cluster with MongoDB Atlas. MongoDB's financial services hub.

February 25, 2022

Financial Services, Multi-Cloud, and the Elimination of Cloud Concentration Risk

Most financial institutions, regardless of size or business mix, now understand the benefits that cloud and multi-cloud computing services offer. For a start, they provide cost-effective flexibility to scale services, deploy new offerings, and innovate in order to keep pace with rapidly changing customer expectations. They also offer security and resiliency benefits that can be expensive and difficult to implement on-premises, particularly for smaller institutions struggling to keep up with evolving standards. And they provide geographic access to new markets, from China to Canada, where new sovereignty laws require in-region deployment of systems.

As financial markets continue to embrace cloud services, regulators are raising awareness of the issues associated with cloud computing, above all those that risk exposing financial institutions to threats that could undermine the stability of the financial system. Supervisory bodies such as the Financial Stability Board (FSB) and the European Banking Authority (EBA) have urged regulators worldwide to review their oversight frameworks so that the various types of cloud computing activity are adequately covered by industry guidelines. At the same time, outages at public cloud providers have shattered the "it never fails" paradigm, prompting calls for heightened vigilance around cybersecurity risk. Regulators warn of cloud concentration risk, the danger that arises when the technology underpinning global financial services depends too heavily on a small number of large cloud service providers, and fear that an outage or cyberattack could derail the global financial system.

This post examines cloud concentration risk for financial services firms: where the risk stands today, and the role of multi-cloud in navigating it and in shaping the regulation to come. Part 1: What is cloud concentration risk for financial services? Part 2: Why are financial services evolving from hybrid to multi-cloud? Part 3: Address cloud concentration risk with cross-cloud redundancy. Part 4: The limits of single-vendor public cloud solutions. Part 5: Commercial and technical benefits of multi-cloud for financial services.

Part 1: What is cloud concentration risk for financial services?

Concerns about infrastructure concentration and consolidation fall into two broad categories. The first is the risk of too many of the world's banking services being concentrated on a handful of public cloud platforms. In the past, every bank ran its own on-premises infrastructure, so the problem did not arise: a data center failure affected only that one firm. The second is the vulnerability of individual institutions, including the many smaller ones that outsource critical banking infrastructure and services to a handful of solution providers. These SaaS "hyperscalers" also tend to run on a single cloud platform, so an outage can cascade across many institutions at once. In both cases, concerns about performance, availability, and security alarm regulators, who fear that outages, whether internal or caused by malicious external actors, could damage the financial systems under their supervision.

These service shocks are far more than a hypothetical concern. In October 2021, Facebook suffered a massive global outage: after Facebook changed a single server component coordinating its data center traffic, the more than 3.5 billion users who rely on the social network's applications lost service for over five hours. Like Facebook, the big three cloud service providers (CSPs), Microsoft Azure, AWS, and Google Cloud, have all experienced similar outages in recent years. For financial services firms, as more of their critical functions move to the public cloud, the risk posed by an outage at a single CSP grows exponentially.
Regulators are pushing financial institutions harder to consider plans such as "cloud exit strategies" to mitigate outage risk and its cascading effects across the financial system, but so far they have focused mainly on warnings and guidance rather than new regulation. Since first raising formal concerns about cloud concentration risk in a 2019 advisory, the FSB has been gathering industry and public input to inform policymaking. In June 2021, the Monetary Authority of Singapore (MAS) issued a comprehensive advisory on the cyberattack risks associated with financial institutions' cloud adoption. Meanwhile, regulators are weighing expanded rules that could take effect as early as 2022. The European Commission has published a legislative proposal on Digital Operational Resilience aimed at harmonizing existing digital governance rules for financial services across borders, including standards for testing, information sharing, and information risk management. In September 2021, the European Securities and Markets Authority (ESMA) warned of "overcrowding" risk among cloud computing service providers and suggested making resilience requirements mandatory for both firms and the system as a whole. Similarly, the Bank of England's Financial Policy Committee stated that additional measures will likely be needed "to mitigate the financial stability risks stemming from concentration in the provision of some third-party services." Such measures could include designating certain third-party providers as "critical," introducing new oversight of public cloud providers, establishing resilience standards, and conducting regular resilience testing. Regulators are also exploring oversight of hiring and subcontracting, as is already practiced by energy and utility companies. To stay ahead of regulators, financial services firms and hyperscalers are taking steps to address these underlying issues.

Part 2: Why are financial services evolving from hybrid to multi-cloud?

Across the established banking ecosystem, wholesale adoption of the cloud remains rare. Large, technology-oriented banks and financial services firms that want to act like challengers or neobanks have adopted hybrid cloud architectures, pairing on-premises data centers with cloud-based services, as a cornerstone of their enterprise strategy. Smaller regional and national institutions, which have not formally adopted a cloud-first mindset, have begun exploring the benefits of cloud services by working with cloud-based SaaS providers through their existing ISVs and systems integrators. Typically, financial institutions already pair multiple external cloud providers with on-premises infrastructure in an enterprise-wide hybrid cloud approach to IT. In these scenarios, select functions, such as mobile banking or payment processing, are operated outside legacy, on-premises data centers to gain speed and scalability benefits.

The move to a hybrid approach was itself a major step. Institutions first tested performance by placing non-core applications with a single public cloud provider: non-core systems powering customer-facing websites and mobile apps, plus new digital, data, and analytics capabilities. Some explored deployments across multiple cloud vendors for different tasks while maintaining robust, high-quality on-premises systems, paired with public cloud deployments, to run core services. MongoDB sees customers, including many financial services firms, increasingly running independent workloads on different clouds. But we believe the true power of multi-cloud applications has yet to be realized.
While a hybrid approach with one or two separate cloud providers works for now (and is the one many fintech startups have adopted), the logical next step is to fully embrace the cloud, ultimately through a multi-cloud approach, and part ways with on-premises infrastructure for good. Consider Wells Fargo. The US-based bank recently announced a cloud infrastructure and data center strategy built on two cloud service providers, with the long-term goal of running most of its services in the public cloud and ultimately operating across providers, independent of its own data centers.

Is it truly multi-cloud?

Many large financial institutions say they already operate multi-cloud, but for most this means a hybrid cloud approach: keeping critical services on-premises while using one or more public cloud providers for discrete workloads. In a hybrid cloud deployment, public and private clouds and on-premises infrastructure function as a single unit, with orchestration tooling distributing and managing workloads between the components. In recent years the line between the two cloud types has blurred, and a strategy known as hybrid multi-cloud has gained significant ground. "Hybrid" indicates the inclusion of a private cloud; "multi-cloud" refers to multiple public clouds from multiple providers. As enterprises increasingly move in this direction, hybrid multi-cloud (or simply hybrid cloud) appears to be becoming the dominant IT environment, at least for large organizations.

The hybrid approach can be seen as a step toward realizing the true potential of multi-cloud deployments, in which data and applications are deployed across multiple CSPs simultaneously, giving financial services firms the ability to: use data from an application running in one cloud and analyze that data in another, without manually managing data movement; run a single application using data stored across different clouds; and easily migrate applications from one cloud provider to another. Multi-cloud clusters on MongoDB Atlas make it easy to move data and applications freely across clouds. For financial services firms, the journey toward multi-cloud is worth serious consideration: it has the potential to improve performance, meet customer expectations, and reduce dependence on any single cloud vendor.

Part 3: Address cloud concentration risk with cross-cloud redundancy

In a field as heavily regulated and controlled as financial services, where critical data is constantly moving and being stored, security and resilience are paramount. Recent outages at leading public cloud providers are a reminder that no matter how many data centers they operate, single cloud providers are not immune to vulnerabilities arising from the complexity of their networks and the interconnectedness of their sites. One could argue that even a single cloud provider delivers better uptime statistics than on-premises solutions, but recent outages underscore the need for operational agility, given the high-availability and performance requirements of critical applications. An institution that relies on a single provider for cloud services is exposed to potential service shocks stemming from its technical dependencies, cyberattacks, vulnerability to natural disasters, or even freak accidents.
Cloud outages range in severity from temporary capacity constraints to full-blown downtime, and financial services firms need to reduce as much of that risk as possible. Distributing data across multiple clouds improves high availability and application resiliency without sacrificing latency. Multi-cloud clusters on MongoDB Atlas let financial services firms deploy their data in a single cluster spanning Azure, AWS, and Google Cloud. MongoDB Atlas offers the broadest regional choice of any cloud database, supporting more than 80 regions across the major CSPs. This is especially relevant for financial services firms that must comply with data sovereignty requirements but have limited deployment options because of their primary cloud provider's sparse regional coverage. In some cases only a single in-country region is available, leaving users particularly vulnerable to cloud outages. For example, AWS has one region in Canada and Google Cloud has only two. With multi-cloud clusters, organizations can leverage all three, and gain additional fault tolerance with extra nodes in Azure's Toronto and Quebec City regions.

Several MongoDB customers in financial services are already moving toward a true multi-cloud approach by standing up nodes with a second CSP through MongoDB Atlas. These customers typically use a 5&1 architecture: one primary CSP paired with one secondary, backup CSP. In this scenario, the primary CSP handles most of the work the bank or institution needs to run a given solution, such as mobile banking, while the secondary CSP is used for disaster recovery and compliance purposes should the primary provider suffer an outage. For other services at the same firm, the roles are often reversed, with the secondary CSP acting as primary.

How Bendigo and Adelaide Bank simplified its architecture and embraced the cloud

Bendigo and Adelaide Bank, one of Australia's largest banks, is preparing for a multi-cloud future. "As we work to accelerate business transformation, we see a range of cloud benefits helping us reduce outages, increase speed and consistency, and strengthen risk and vulnerability management," said Ash Austin, head of cloud platform services at Bendigo and Adelaide Bank. For simplification and cloud-centricity, MongoDB Atlas, MongoDB's cloud database service, was the logical next step. "The fact that MongoDB Atlas supports all three major hyperscalers (Google Cloud, AWS, Azure) helps with portability and the multi-cloud future we envision," added Dan Corboy, cloud engineer at Bendigo and Adelaide Bank. "It meant we could stay flexible without depending on a specific cloud provider or a specific process, so choosing MongoDB was an easy decision."

Part 4: The limits of single-vendor public cloud solutions

In Part 1 we looked at how cloud adoption by financial services firms has evolved, and at the growing concern over infrastructure concentration risk arising from hybrid cloud approaches that involve only one or two separate or loosely connected public cloud providers. Looming regulatory issues aside, there are several commercial and technical limitations of a single-cloud approach that firms must address to properly future-proof their infrastructure. A single-cloud or hybrid approach has the following weaknesses:
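The failover math behind a "5 & 1" layout (five nodes with a primary CSP, one with a secondary) is worth spelling out. The sketch below is a hypothetical illustration of the majority rule that governs replica-set elections; the node counts and provider labels are invented, not any bank's actual topology.

```python
# Hypothetical sketch: what a "5 & 1" layout survives. Node counts and
# provider labels are illustrative, not any customer's actual topology.

def survives_provider_loss(nodes_by_provider: dict) -> dict:
    """For each provider, check whether losing all of its voting nodes
    still leaves a strict majority of the full replica set."""
    total = sum(nodes_by_provider.values())
    majority = total // 2 + 1
    return {
        lost: (total - count) >= majority
        for lost, count in nodes_by_provider.items()
    }

# 5 voting nodes with the primary CSP, 1 with the secondary: 6 total,
# so a majority is 4. Losing the secondary leaves 5 >= 4 (fine); losing
# the primary leaves 1 < 4, which is why the secondary CSP serves
# disaster-recovery and compliance purposes rather than riding out a
# full primary-CSP outage on its own.
print(survives_provider_loss({"primary_csp": 5, "secondary_csp": 1}))
# {'primary_csp': False, 'secondary_csp': True}
```

Spreading the voting nodes more evenly across providers (say 3 and 2) is what would let the cluster keep a writable primary through the loss of either CSP.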
Geographic constraints

Not every cloud service provider operates in every business region. Choosing a provider that serves the regions a customer wants may seem sensible at the time, but it can become a problem later if the customer expands into regions that provider cannot adequately cover. A multi-cloud strategy extends the geographic availability of data centers to the many countries served by all of the major cloud service providers. The availability of regional cloud solutions grows ever more important as more countries adopt data sovereignty and residency laws to control how data is collected, stored, and used within their borders. Sovereignty rules require that data collected and stored in a country follow that country's laws, regulations, and best practices for data collection. Data residency laws require that data about a country's citizens be collected and stored within that country, even if the data is ultimately replicated and transferred abroad. These requirements pose thorny technical, operational, and legal problems for global financial services firms, and addressing them comprehensively through a single cloud provider is nearly impossible. The issue continues to attract lawmakers' attention worldwide, not just in early movers such as Russia and Canada. The European Union is actively considering a unified EU sovereignty policy and implementation plan to address growing concerns over regional data control; following the success of the General Data Protection Regulation, it has also enacted the Digital Markets Act to further refine regional data policy and regulation.

Vendor lock-in

Beyond the technical risks of using a single cloud provider, there are commercial risks to running all of an institution's services through one provider. The more an institution's applications are concentrated with a single cloud provider, and the more dependent they become on that provider's third-party services, the harder it becomes to negotiate cloud service costs or consider switching providers. Over time, services adapt to the single provider's protocols and data structures, making a move to another cloud environment operationally harder. The more entangled a firm's technical architecture becomes with a single cloud provider, the harder it is to plan an exit strategy without risking performance lags, heavy de-customization work, or unreasonable price increases. Institutions in this locked-in state also lose leverage over service quality should the vendor shift its development focus, lose competitiveness, or run into operational problems. Ultimately, the financial services firm's pace of innovation slows to that of its CSP. Even integrating external apps and services becomes difficult, making the new cloud environment feel like the legacy monolithic architecture it was meant to replace.

Multi-cloud and a robust exit strategy

Beyond data portability and high availability, multi-cloud clusters on MongoDB Atlas give financial services firms the robust, viable exit strategy they need when moving workloads to the cloud. Where other database services lock customers into a single cloud provider, leaving little room to unwind the commercial relationship quickly, MongoDB Atlas can move database workloads from one cloud provider to another with zero downtime. Because this exit strategy requires no application changes at all, it offers reassurance around business continuity to financial services firms planning for cloud exit scenarios, whether leaving a cloud vendor proves easy or hard.

Security uniformity

Cloud service providers invest heavily in security capabilities and are generally recognized as leaders in cybersecurity.
They proactively manage threats through security features deployed at every customer connection point. For financial services, the major cloud providers offer enhanced security that can satisfy strict governance requirements. From a risk standpoint, monitoring and securing a single-cloud hybrid deployment is easier than managing threats across multiple clouds, and from a threat-surface standpoint, a single cloud reduces risk because it gives attackers fewer avenues to exploit. The problem, however, is responding to an event in a single-cloud environment when an incident, intentional or not, does occur. In the event of an infrastructure failure or cyberattack, a multi-cloud environment gives an organization the ability to switch service providers and to back up and protect its data.

Functional limitations

Cloud service providers develop new capabilities asynchronously. Some excel and innovate continuously in particular functional areas, while others concentrate on different core feature sets, such as Google Cloud's AI Platform, Azure's Cognitive Services, or AWS's Lambda platform for serverless, event-driven computing. By restricting deployment to a single cloud service provider, an institution forgoes the best-of-breed capabilities available across the clouds: instead of using the best features from several clouds, it can use only what is available within that one platform. Over time this limits innovation, and the organization can lose its competitive edge.

Part 5: Commercial and technical benefits of multi-cloud for financial services

As the financial services industry accelerates its cloud-first mindset, more institutions are realizing that a multi-cloud strategy can better position them to meet the rapidly changing commercial, technical, and compliance demands on their business. What's more, a fully formed multi-cloud strategy provides an opportunity to partner with the most sophisticated and well-resourced service providers and to benefit from leading-edge innovation from all of them. The recognition that a single cloud provider is not only limiting but may even be a hindrance is dawning on the leadership of many banks. As the CEO of one large investment bank told MongoDB, "Multi-cloud is an opportunity for us to unlock the full value of each location, not water things down with abstractions and accept the lowest common denominator." In addition to facilitating access to leading-edge innovation, a multi-cloud approach offers financial services firms several additional benefits.

Optimize performance

Rock-solid service availability and responsiveness are the cornerstones of performance planning in financial services. The goal of any architecture design is to limit downtime and minimize application lag while aligning processing resources to the specific needs of each application. While even single cloud providers log higher uptime than most on-premises solutions involving multiple data centers, a multi-cloud architecture offers additional resiliency and flexibility to meet internal and external performance SLAs that previously only mainframe technology (so-called Sysplex clusters) could achieve, at 99.9999% availability. In a multi-cloud environment, institutions can dynamically shift workloads among cloud providers to speed up tasks, respond to service disruptions, reduce latency by serving traffic locally, and address regulatory concerns about single-cloud-provider vulnerability. Optimizing for all of these factors yields the best customer experience and the most efficient, cost-effective approach to infrastructure.

Scale dynamically for task and geography

Scalability and locality are critical. As customer demands on product experience grow, financial services providers face new requirements that can sometimes best be met through geographic scaling and proximity to the end user.
It is no longer just a question of who has the most storage or the fastest CPU; it can also mean maximizing application responsiveness by running computing resources close to the end user. This is only becoming more relevant with the roll-out of 5G edge services and the growth in real-time edge computing it requires. Access to multiple clouds creates opportunities to dynamically balance task execution locally for maximum efficiency across geographies, be that California, New York, or Singapore. It also enables institutions to scale storage requirements up and down across providers based on demand and cost. In a fast-paced business environment, financial institutions can quickly deploy applications at scale in the cloud, and by running in multiple clouds they can arbitrage cost and performance without compromising their business strategy.

Adapt to business changes

Financial services companies can stay nimble by building flexible multi-cloud capabilities that enable them to adapt quickly to new regulatory, competitive, and financial conditions. This is as true for challengers such as Illimity or Current as it is for established institutions such as Macquarie or NETS. An effective multi-cloud strategy can be a solution to managing regulatory, compliance, and internal policy changes by replacing a patchwork of solutions with a common framework across cloud providers. The ability to move seamlessly among cloud providers lets institutions respond quickly to situations such as new data sovereignty laws or a merger by shifting workloads to a more advantageous provider.

Avoid vendor lock-in

With IT costs taking up a continually growing share of overall spending, a multi-cloud strategy can help institutions better manage technology outlays to third-party providers by avoiding vendor lock-in. Not all services are designed equally, and switching services between providers can have a multi-million-dollar impact on cloud bills. In any industry, overreliance on a single supplier creates financial and operating risks. The more interconnected, or "sticky," a single-cloud solution becomes, the harder it is to unwind should it no longer meet the institution's needs. And by concentrating services with one provider, companies risk losing the financial leverage to negotiate contract terms. By taking a multi-cloud approach, institutions can choose among providers competitively, without being locked in either commercially or by a technical dependency. Institutions that adopt a multi-cloud approach can also push providers harder to develop for their particular needs.

Harness innovative features

The ability to tap into cloud capabilities such as artificial intelligence and machine learning is a major benefit of working with cloud service providers. Through a multi-cloud approach, developers can select features from across cloud providers and deploy the technical building blocks that best suit their needs. They can run their workloads using different tools on the same data set without manually replicating data, which means institutions can access popular services such as AWS Lambda, Google Tensorflow Cloud AI, and Azure Cognitive Services without cumbersome data migrations. As consumers increasingly demand premium product experiences, financial services institutions can gain competitive advantages by deploying best-of-breed applications into user services.

Want to learn more about how to build a multi-cloud strategy, or about what MongoDB can do for financial services? Check out the following resources: Get started with multi-cloud clusters on MongoDB Atlas. How to create a multi-cloud cluster with MongoDB Atlas. MongoDB's financial services hub.

February 25, 2022

For Banks, KYC Should Mean More than Just "Knowing Your Client"

Banks in the loan or mortgage business believe they know their clients well, yet they struggle to offer services that capitalize on customer data or tailor the loan origination experience to the individual based on the volume of information they already hold. That's one of the key takeaways from Mortgages: A Digital Process to Be Mastered, a new report from MongoDB and FinTechFutures. The report, which surveyed 104 retail banking, business banking, and corporate banking executives, highlights that customer pain is particularly acute when, despite collecting reams of information about clients, banks and loan originators are still unable to turn around loan requests in a timely manner or offer personalized experiences. Click here to check out the Panel Discussion.

Do You Really Know Your Customer?

According to the report, 61% of financial executives said they have industry-leading Know Your Client (KYC) processes. But at the same time, 43% named poor digital experiences as a barrier to recruiting and retaining customers, with the inability to deliver personalized offers coming in second at 34%. Other commonly cited issues include the speed of innovation, the complexity of doing business, and the inability to serve customers in real time. So, do banks really know their customers? And if they do, what are they doing with that information if not using it to better serve those customers? What's holding the industry back? Outdated processes: agents and employees are forced to grapple with manual workflows, shuffling paper piles and building spreadsheets before they can get to work serving the customer. The market is crying out for better automated, data-driven decisions, and legacy systems can't keep up. Besides the obvious waste of human resources, the lack of a holistic digital offering also hurts business.
Customers increasingly cite easy-to-use, transparent mortgage processes, smooth onboarding, and digital workflows as factors in choosing a lender.

Behind the Numbers

Banks have made some progress digitizing and automating what were once almost exclusively paper-based, manual processes. But the primary driver of this transformation has been compliance with local regulations rather than an overarching strategy for really getting to know the client and achieving true customer delight. It's a missed opportunity for a couple of reasons. First, banks create a comprehensive client profile during the onboarding process. They have enough data to perform risk assessments and personalize offers, but instead default to "new client onboarding" processes that are 30+ years old. The simple fact that the consumer is already a long-term client with a detailed profile is ignored. The classic example is asking for pay slips when the bank can already see the actual monthly (or weekly) pay arriving in the account. Second, most bank customers stay with their chosen financial institution for their entire lives, and as the client relationship matures, the insights banks can bring to bear become deeper and richer. Yet a slow approval process (40%) and slow response times (39%) were two of the top areas in need of improvement cited in the survey. These are not the hallmarks of industry-leading KYC processes or deep client relationships, but of siloed data and a misaligned digital strategy. What we're seeing is the difference between the practice of ensuring compliance with local regulations and a strategic imperative for truly understanding the client. While modernization investments have helped automate much of the paper-pushing related to compliance, transforming customer experiences and making loan origination systems (LOS) more transparent have yet to be achieved. Most executives in the survey planned to leverage real-time analytics, AI/ML, and workflow software to improve processes.
These are all technologies that can take KYC processes beyond simple compliance use cases and lead to more value-added, personalized client relationships.
Boris Bialek, Global Head, Industry Solutions

Smart Money

Today's clients demand fully modern, mobile-first banking experiences. To meet those expectations, bank executives and IT leaders plan to invest in technologies that address some of their most glaring needs. Chief among them: getting away from manual processes like email and spreadsheets, better data analytics for decision making, and gaining access to real-time information at every touchpoint in the customer journey. The investments they're willing to make include real-time analytics, artificial intelligence, and machine learning (AI/ML). Along with digital customer experiences (for example, chatbots and personalized recommendations), these are the three areas that bank executives and IT leaders say will drive greater market share and profitability in the loans business. So, even though many banks have started the journey toward modernization, they still have further to go before they can meet the expectations of their clients. It's not about reducing the paper-pushing or satisfying the regulatory requirements of the LOS business. It's about personalization and real-time experiences, the hallmarks of true KYC.

Mortgages are a Digital Process to Be Mastered

If real-time data and AI/ML are the way forward for driving value and transforming customer experiences, they will have to be accompanied by modernization of the underlying data architecture. As bank executives and IT leaders in the survey acknowledge, the lack of a digitization strategy, speed to market, and costly legacy migration are their top three concerns when digitizing their mortgage processes. De-siloing data and introducing data mesh concepts enables the leap from modernizing legacy infrastructure to digital transformation and competitive advantage.
Banking innovators strive to be first to market, but legacy systems are holding them back, stymying digitization strategies. Overcoming these and other challenges requires the introduction of a modern data domain model that integrates the transactional and process workloads and augments customer data with information from other legacy and external systems. MongoDB Atlas is perfectly suited for this purpose. We have deep experience building customer 360 models that can be mapped to omnichannel interactions. In addition, MongoDB also has proven capabilities integrating risk and treasury functions (for mortgages this means funds transfer pricing and credit risk), with MongoDB Atlas being used by many banks and other financial service providers in the mortgage space, from building societies in the UK to special purpose lenders in Australia. Lastly, MongoDB’s ability to integrate mobile experiences, search capabilities, and real-time analytics (for example, scoring for consumer ratings while that consumer is on a web page) makes MongoDB the proven data platform for mortgage modernization and true digital transformation.
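To ground the customer-360 idea, here is a minimal sketch of a MongoDB aggregation pipeline that enriches a client profile with signals the bank already holds. The collection and field names (customers, transactions, customerId, amount) are invented for illustration; with pymongo such a pipeline would run as db.customers.aggregate(pipeline).

```python
# Hypothetical sketch of a customer-360 enrichment pipeline in MongoDB.
# Collection and field names are invented for illustration only.

pipeline = [
    # join the customer profile with their transaction history
    {"$lookup": {
        "from": "transactions",
        "localField": "_id",
        "foreignField": "customerId",
        "as": "txns",
    }},
    # derive simple KYC-relevant signals from data the bank already holds,
    # e.g. actual inflows instead of asking the client for pay slips
    {"$set": {
        "monthlyInflow": {
            "$sum": {
                "$map": {
                    "input": {
                        "$filter": {
                            "input": "$txns",
                            "cond": {"$gt": ["$$this.amount", 0]},
                        }
                    },
                    "in": "$$this.amount",
                }
            }
        },
        "txnCount": {"$size": "$txns"},
    }},
    # keep only what the personalization layer needs
    {"$project": {"name": 1, "monthlyInflow": 1, "txnCount": 1}},
]
```

The design point is that the enrichment runs where the data lives, so a real-time scoring call while the consumer is on a web page does not require copying data into a separate analytics store first.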

November 10, 2021