Improve the interactivity of banking applications by using MongoDB Atlas Vector Search and large language models.
Use cases: Gen AI, Personalization
Industries: Financial Services
Products: MongoDB Atlas Vector Search
Partners: Amazon Bedrock
Solution Overview
With interactive banking, financial services customers engage with digital platforms that anticipate and meet their needs in real time.
This approach uses Generative Artificial Intelligence (Gen AI) technologies like chatbots and virtual assistants to enhance basic banking operations. Banks can improve the customer experience by leveraging Gen AI to provide tailored, context-aware interactions via self-service digital channels. From AI-powered chatbots that resolve queries instantly to predictive analytics that offer personalized financial advice, interactive banking creates a more engaging and intuitive experience for users.
By integrating AI-driven advisors into the digital banking experience, banks can deliver instant and relevant answers. This results in smoother, more user-friendly interactions in which customers feel supported.
Reference Architectures
In this solution, Amazon Bedrock generates vector embeddings for bank documentation, such as terms and conditions, and MongoDB Atlas stores those embeddings alongside the source text within MongoDB documents. The following figure shows this solution's architecture:
Figure 1. AI-driven interactive banking architecture
MongoDB acts as a data store layer between the AI technology layer and the application layer. This streamlines data management by storing unstructured and structured data together and allowing organizations to operate with a more unified dataset. By breaking down data silos, businesses can deliver more consistent customer experiences across their digital platforms.
Data Model Approach
This solution uses MongoDB's flexibility to store both text chunks from PDFs and their embeddings within the same document. This simplifies queries and ensures high performance without requiring a separate vector store. It allows companies to build AI-enriched applications on MongoDB's modern, multi-cloud database platform, unifying real-time, unstructured, and AI-enhanced data.
The image below shows an example of the data used in this solution:
Figure 2. Single document with text chunks and their embeddings
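For reference, a document following this approach might look like the sketch below. The field names, values, and embedding length are illustrative assumptions, not the exact schema used in the demo.

```python
# Illustrative shape of a single document that stores a text chunk
# together with its embedding (field names are hypothetical).
sample_document = {
    "_id": "terms_and_conditions_chunk_0042",
    "source_file": "terms_and_conditions.pdf",  # original PDF the chunk came from
    "page": 7,                                   # page the chunk was extracted from
    "text": "The bank may revise these terms with 30 days' notice...",
    "embedding": [0.0123, -0.0456, 0.0789],      # truncated; cohere.embed-english-v3 returns 1024 dimensions
}
```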
Build the Solution
This solution has the following GitHub repositories:
To run the solution, see the README files in the repositories.
The architecture has the following workflow:
1. Document Preprocessing
First, text-based unstructured data, such as Terms & Conditions PDFs, is processed and split into chunks using a sliding window technique. Overlapping the end of one chunk with the beginning of the next preserves continuity and context across chunk boundaries.
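The following is a minimal sketch of sliding-window chunking over text extracted from a PDF; the window and overlap sizes are illustrative assumptions rather than the values used in the demo.

```python
def sliding_window_chunks(text: str, window_size: int = 1000, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks so context carries across chunk boundaries."""
    chunks = []
    step = window_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + window_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# Example: chunk the extracted text of a Terms & Conditions PDF.
pdf_text = "..."  # text extracted from the PDF with any PDF parsing library
chunks = sliding_window_chunks(pdf_text, window_size=1000, overlap=200)
```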
Once the unstructured data has been split into chunks, each chunk is passed through an embedding model to generate vector embeddings. You can select the embedding model based on your requirements. This demo uses Cohere's cohere.embed-english-v3 model on Amazon Bedrock.
Both the chunks and their corresponding vectors are stored in MongoDB Atlas. This demo uses the SuperDuper Python framework to integrate AI models and workflows with MongoDB.
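Continuing from the chunking sketch above, the following is a minimal sketch of this step that calls Amazon Bedrock and MongoDB directly through boto3 and PyMongo rather than through the SuperDuper framework used in the demo. The connection string, database, and collection names are placeholder assumptions.

```python
import json

import boto3
from pymongo import MongoClient

# Hypothetical connection details -- replace with your own.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
collection = MongoClient("<ATLAS_CONNECTION_STRING>")["bank"]["terms_chunks"]

def embed(texts: list[str], input_type: str = "search_document") -> list[list[float]]:
    """Generate embeddings for a batch of texts with Cohere's model on Amazon Bedrock."""
    response = bedrock.invoke_model(
        modelId="cohere.embed-english-v3",
        body=json.dumps({"texts": texts, "input_type": input_type}),
    )
    return json.loads(response["body"].read())["embeddings"]

# Store each chunk next to its embedding in the same document.
embeddings = embed(chunks)
collection.insert_many(
    [{"text": chunk, "embedding": vector} for chunk, vector in zip(chunks, embeddings)]
)
```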
2. Vector Search and Querying
After the chunks and embeddings are stored in MongoDB, you can use MongoDB Atlas Vector Search for semantic querying.
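Reusing the helpers from the previous sketch, a semantic query might look like the following. The index name, field names, and vector dimensions are assumptions; the index definition shown in the comment must exist in Atlas before the query will return results.

```python
# Assumes an Atlas Vector Search index named "vector_index" on the "embedding" field,
# defined with 1024 dimensions and cosine similarity, for example:
# {
#   "fields": [
#     {"type": "vector", "path": "embedding", "numDimensions": 1024, "similarity": "cosine"}
#   ]
# }

def semantic_search(question: str, k: int = 5) -> list[dict]:
    """Embed the question and return the k most similar chunks from Atlas."""
    query_vector = embed([question], input_type="search_query")[0]
    pipeline = [
        {
            "$vectorSearch": {
                "index": "vector_index",
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": 100,
                "limit": k,
            }
        },
        {"$project": {"_id": 0, "text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]
    return list(collection.aggregate(pipeline))
```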
3. Using the Chatbot
The chatbot in this solution is powered by MongoDB Atlas Vector Search and a pre-trained LLM. When a user inputs a question, the question is vectorized, and MongoDB Atlas Vector Search is used to find documents with similar embeddings.
After the relevant documents are retrieved, this data is sent to an LLM. This demo uses Anthropic's Claude, available through Amazon Bedrock. The LLM uses the retrieved documents as context to generate a more comprehensive and accurate response. This process is known as retrieval-augmented generation (RAG). RAG enhances the chatbot's ability to provide accurate answers by combining semantic search with language model generation.
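Building on the sketches above, the RAG step could look like the following. This uses the Bedrock Converse API as one possible approach; the Claude model ID, prompt wording, and inference settings are illustrative assumptions.

```python
def answer_question(question: str) -> str:
    """Retrieve relevant chunks and ask Claude to answer using them as context (RAG)."""
    context = "\n\n".join(doc["text"] for doc in semantic_search(question))
    prompt = (
        "Answer the customer's question using only the following excerpts from the "
        f"bank's terms and conditions.\n\nExcerpts:\n{context}\n\nQuestion: {question}"
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # any Claude model enabled in your account
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

print(answer_question("Can I cancel a scheduled payment after it has been authorized?"))
```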
Figure 3. Leafy Bank chatbot in action
Key Learnings
Chatbots enhance user experience: AI-driven technologies like chatbots simplify customer interactions by providing instant, context-aware responses, allowing users to complete banking operations independently without wading through complex terms and conditions.
Atlas Vector Search enables PDF search: By using data chunking and Atlas Vector Search, MongoDB enables efficient querying of dense legal documentation, ensuring customers receive accurate, context-rich answers.
MongoDB enables technology integration: MongoDB's integration with Vector Search, LLMs, and dedicated search infrastructure allows financial institutions to scale AI solutions, improving performance and responsiveness as customer demands grow.
Authors
Luis Pazmino Diaz, FSI Principal EMEA, MongoDB
Ainhoa Múgica, Senior Specialist, Industry Solutions, MongoDB
Pedro Bereilh, Specialist, Industry Solutions, MongoDB
Andrea Alaman Calderon, Senior Specialist, Industry Solutions, MongoDB