Transform customer service with MongoDB Atlas Vector Search and RAG. Convert call recordings into searchable insights for faster, more accurate responses.
Use Cases: Analytics, Gen AI, Modernization, Personalization
Industries: Insurance, Financial Services, Healthcare, Retail, Telecommunications
Products: MongoDB Atlas, MongoDB Atlas Search, MongoDB Atlas Vector Search
Partners: AWS, Cohere, LangChain
Solution Overview
A major challenge for many insurance companies is inefficient call centers, where agents struggle to quickly locate and deliver accurate information to customers. Studies have shown that companies with superior customer experiences outperform their peers; for instance, satisfied customers are 80% more likely to renew their policies, directly contributing to growth.
This solution shows how MongoDB can transform call center operations. It leverages AI and analytics to convert unstructured audio files into searchable vectors. This allows businesses to quickly access relevant information, identify successful resolution strategies and frequently asked questions, and improve the overall customer service experience.
Figure 1 shows how to transform raw audio recordings into vectors; a code sketch of these steps follows the figure. The pipeline works as follows:
Store raw audio files: Store past call recordings in their original audio format.
Process audio files: Use AI and analytic services, such as speech-to-text conversion, content summarization, and vectorization.
Store vectors and metadata: Store the generated vectors and their metadata, such as call timestamps and agent information, in an operational data store.
Figure 1. Customer service call insight extraction and vectorization flow
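The sketch below shows one way to implement this pipeline in Python with boto3 and PyMongo. It assumes the call has already been transcribed by Amazon Transcribe and that Cohere's embed-english-v3 model is enabled in Amazon Bedrock; the connection string, database, collection, and field names are placeholders for this example rather than part of the reference implementation.

```python
import json

import boto3
from pymongo import MongoClient

# Placeholder Atlas connection string and namespace -- replace with your own values.
ATLAS_URI = "mongodb+srv://<user>:<password>@<cluster>.mongodb.net"
collection = MongoClient(ATLAS_URI)["call_center"]["call_insights"]

# Bedrock runtime client used to call Cohere's embedding model.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def embed_text(text: str) -> list[float]:
    """Convert transcribed call text into a vector with Cohere on Amazon Bedrock."""
    response = bedrock.invoke_model(
        modelId="cohere.embed-english-v3",
        body=json.dumps({"texts": [text], "input_type": "search_document"}),
    )
    return json.loads(response["body"].read())["embeddings"][0]


def store_call_insight(transcript: str, metadata: dict) -> None:
    """Store the transcript, its embedding, and call metadata in the operational data store."""
    collection.insert_one(
        {"transcript": transcript, "embedding": embed_text(transcript), **metadata}
    )


# Example usage with a transcript already produced by Amazon Transcribe.
store_call_insight(
    "Customer asked how to add a second driver to an existing auto policy...",
    {"call_id": "12345", "agent": "A. Smith", "timestamp": "2024-05-01T10:30:00Z"},
)
```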
Once the data is stored in vector format within the operational data store, it becomes accessible for real-time applications. Atlas Vector Search can consume this data directly, or it can be integrated into a Retrieval-Augmented Generation (RAG) architecture, which combines Large Language Models (LLMs) with external knowledge sources to generate more accurate and informative outputs.
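As an illustration, the sketch below wires Atlas Vector Search into a simple RAG flow with LangChain: the vector store retrieves the most relevant stored call insights, and an LLM served through Amazon Bedrock answers using only that retrieved context. The namespace, index name, and model IDs are assumptions chosen for this example.

```python
from langchain_aws import BedrockEmbeddings, ChatBedrock
from langchain_mongodb import MongoDBAtlasVectorSearch

# Vector store backed by the collection populated in the ingestion step (placeholder names).
vector_store = MongoDBAtlasVectorSearch.from_connection_string(
    "mongodb+srv://<user>:<password>@<cluster>.mongodb.net",
    namespace="call_center.call_insights",
    embedding=BedrockEmbeddings(model_id="cohere.embed-english-v3"),
    index_name="vector_index",      # Atlas Vector Search index on the "embedding" field
    text_key="transcript",
    embedding_key="embedding",
)

# Any chat model available in your Bedrock account works here; this ID is an example.
llm = ChatBedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0")


def answer(question: str) -> str:
    """Retrieve the most similar past calls and let the LLM ground its answer in them."""
    docs = vector_store.similarity_search(question, k=3)
    context = "\n\n".join(doc.page_content for doc in docs)
    prompt = (
        "Answer the customer question using only the call-center context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm.invoke(prompt).content


print(answer("How do I add a second driver to my auto policy?"))
```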
Reference Architectures
The system architecture, shown in Figure 2, contains the following modules and functions (a query sketch follows the figure):
Amazon Transcribe receives the audio coming from the customer’s phone and converts it into text.
Cohere provides an embedding model through Amazon Bedrock, which converts the text from Transcribe into vectors.
Atlas Vector Search receives the query vector and returns a document that contains the most semantically similar FAQ in the database.
Figure 2. System architecture and modules
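To give a sense of what the Vector Search step looks like at the driver level, the hedged sketch below embeds the caller's question with Cohere on Bedrock and runs a $vectorSearch aggregation that returns the most semantically similar FAQ. The index name, collection, and field names are assumptions for this example and must match how the FAQ embeddings were indexed.

```python
import json

import boto3
from pymongo import MongoClient

# Placeholder Atlas connection string and FAQ collection.
faqs = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")["call_center"]["faqs"]
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def most_similar_faq(query: str) -> dict:
    """Embed the caller's question and return the single most semantically similar FAQ."""
    body = json.dumps({"texts": [query], "input_type": "search_query"})
    response = bedrock.invoke_model(modelId="cohere.embed-english-v3", body=body)
    query_vector = json.loads(response["body"].read())["embeddings"][0]

    pipeline = [
        {
            "$vectorSearch": {
                "index": "vector_index",   # Atlas Vector Search index on the "embedding" field
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": 100,
                "limit": 1,
            }
        },
        {
            "$project": {
                "_id": 0,
                "question": 1,
                "answer": 1,
                "score": {"$meta": "vectorSearchScore"},
            }
        },
    ]
    return next(faqs.aggregate(pipeline))


print(most_similar_faq("Can I pause my policy while my car is in storage?"))
```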
For complete implementation details, see the GitHub repository.
Key Learnings
Transform call centers with AI services: Integrate AI services, such as speech-to-text, vector embedding, and vector search, with MongoDB Atlas to turn traditional call centers' voice data into actionable insights.
Integrate a RAG-based architecture: Combine a RAG architecture with Vector Search to power faster agent responses, chatbots, and automated workflows.
Implement real-time agent assistance: Integrate agent assistance to boost business outcomes, such as higher customer satisfaction, stronger loyalty, and improved financial performance.
This solution serves as a foundation for advanced applications that require complex interactions, such as agentic workflows and multi-step processes with LLMs and hybrid search. It also enhances chatbot and voice bot capabilities, allowing them to deliver more relevant and personalized responses to customers.
Authors
Luca Napoli, MongoDB
Sebastian Rojas Arbulu, MongoDB