
Integrate MongoDB Atlas with AI Technologies

MongoDB and partners have developed specific product integrations to help you leverage MongoDB Atlas in your AI-powered applications and AI agents.

This page highlights notable AI integrations. You can also use MongoDB Atlas with popular AI providers and LLMs through their standard connection methods and APIs. For a complete list of integrations and partner services, see Explore MongoDB Partner Ecosystem.

You can use the following frameworks to store custom data in Atlas and implement features such as RAG with Atlas Vector Search.
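Under the hood, these integrations run Atlas Vector Search as an aggregation pipeline stage. The following sketch builds a `$vectorSearch` stage with a plain Python dictionary; the index name `vector_index`, the field name `embedding`, and the query vector are placeholder assumptions you would replace with your own values.

```python
# Sketch of an Atlas Vector Search aggregation stage. The index name
# "vector_index" and the field "embedding" are placeholder assumptions.
def build_vector_search_stage(query_vector, num_candidates=100, limit=5):
    """Return a $vectorSearch stage for an aggregation pipeline."""
    return {
        "$vectorSearch": {
            "index": "vector_index",          # Atlas Vector Search index name
            "path": "embedding",              # document field holding the vectors
            "queryVector": query_vector,      # embedding of the user's query
            "numCandidates": num_candidates,  # candidates considered by the ANN search
            "limit": limit,                   # number of results to return
        }
    }

pipeline = [build_vector_search_stage([0.1, 0.2, 0.3])]
# Against a live deployment you would run: collection.aggregate(pipeline)
```

The frameworks below generate a stage like this for you when you run a similarity search or a RAG retrieval step.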

LangChain is a framework that simplifies the creation of LLM applications through the use of "chains," which are LangChain-specific components that can be combined for a variety of use cases, including RAG.

To get started, see the following resources:
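To illustrate the "chain" idea in plain Python: components are composed so that each step's output feeds the next. This toy sketch mimics the concept only and is not LangChain's actual API; the retrieval and answer steps are stubs standing in for a vector store lookup and an LLM call.

```python
# Toy illustration of chaining: compose steps left to right so each
# step's output becomes the next step's input. Not LangChain's API.
from functools import reduce

def chain(*steps):
    """Compose callables left to right into a single callable."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

# A miniature RAG-style chain: retrieve context, build a prompt, "answer".
# The retriever and answerer are stubs for a vector store and an LLM.
retrieve = lambda q: {"question": q, "context": "Atlas supports vector search."}
build_prompt = lambda d: f"Context: {d['context']}\nQuestion: {d['question']}"
answer = lambda prompt: prompt.splitlines()[0].removeprefix("Context: ")

rag_chain = chain(retrieve, build_prompt, answer)
print(rag_chain("What does Atlas support?"))  # -> Atlas supports vector search.
```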

LangChainGo is a framework that simplifies the creation of LLM applications in Go. LangChainGo incorporates the capabilities of LangChain into the Go ecosystem. You can use LangChainGo for a variety of use cases, including semantic search and RAG.

To get started, see Get Started with the LangChainGo Integration.

LangChain4j is a framework that simplifies the creation of LLM applications in Java. LangChain4j combines concepts and functionality from LangChain, Haystack, LlamaIndex, and other sources. You can use LangChain4j for a variety of use cases, including semantic search and RAG.

To get started, see Get Started with the LangChain4j Integration.

LlamaIndex is a framework that simplifies how you connect custom data sources to LLMs. It provides several tools to help you load and prepare vector embeddings for RAG applications.

To get started, see Get Started with the LlamaIndex Integration.
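As a generic sketch of the "load and prepare" step that a framework like LlamaIndex automates: documents are split into overlapping chunks, each chunk is embedded, and the resulting records are stored in Atlas for indexing. The chunk sizes and the embedding stub below are illustrative assumptions, not LlamaIndex's API.

```python
# Generic sketch of preparing text for vector search: split a document
# into overlapping chunks, then embed each chunk. The embed function is
# a stand-in for a real embedding model.
def split_text(text, chunk_size=50, overlap=10):
    """Split text into overlapping character chunks."""
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

def fake_embed(chunk):
    """Toy stand-in for an embedding model call."""
    return [ord(c) / 255 for c in chunk[:4]]

doc = "MongoDB Atlas Vector Search lets you store and query vector embeddings alongside your data."
records = [{"text": c, "embedding": fake_embed(c)} for c in split_text(doc)]
# Each record could then be inserted into an Atlas collection and indexed.
```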

Microsoft Semantic Kernel is an SDK that allows you to combine various AI services with your applications. You can use Semantic Kernel for a variety of use cases, including RAG.

To get started, see the following tutorials:

Haystack is a framework for building custom applications with LLMs, embedding models, vector search, and more. It enables use cases such as question-answering and RAG.

To get started, see Get Started with the Haystack Integration.

Spring AI is an application framework that allows you to apply Spring design principles to your AI application. You can use Spring AI for a variety of use cases, including semantic search and RAG.

To get started, see Get Started with the Spring AI Integration.

You can use the following frameworks to build AI agents that use Atlas to implement features such as agentic RAG and agent memory.

LangGraph is a specialized framework within the LangChain ecosystem designed for building AI agents and complex multi-agent workflows. LangGraph's graph-based approach allows you to dynamically determine the execution path of your application, enabling advanced agentic applications and use cases. It also supports features like persistence, streaming, and memory.

To get started, see the following resources:
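The graph-based control flow described above can be sketched in plain Python: nodes transform a shared state, and a router chooses the next node dynamically from that state. This is a toy illustration of the concept, not LangGraph's API; the node names and routing rule are assumptions.

```python
# Toy illustration of graph-based agent control flow: nodes update a
# shared state dict, and a conditional edge picks the next node at
# runtime based on the state. Not LangGraph's actual API.
def retrieve(state):
    found = "atlas" in state["question"].lower()
    state["docs"] = ["Atlas stores vectors."] if found else []
    return state

def generate(state):
    state["answer"] = f"Based on {len(state['docs'])} doc(s): {state['docs'][0]}"
    return state

def fallback(state):
    state["answer"] = "I don't know."
    return state

def route(state):
    # Conditional edge: the execution path depends on the current state.
    return "generate" if state["docs"] else "fallback"

nodes = {"retrieve": retrieve, "generate": generate, "fallback": fallback}

def run_graph(state):
    state = nodes["retrieve"](state)
    return nodes[route(state)](state)

result = run_graph({"question": "What does Atlas store?"})
print(result["answer"])  # -> Based on 1 doc(s): Atlas stores vectors.
```

Persisting the state dict between runs (for example, in an Atlas collection) is the essence of the agent-memory feature mentioned above.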

You can also integrate Atlas Vector Search with the following enterprise platforms to build generative AI applications. These platforms provide pre-trained models and other tools to help you build AI applications and agents in production.

Amazon Bedrock is a fully managed platform for building generative AI applications. You can integrate Atlas Vector Search as a knowledge base for Amazon Bedrock to store custom data in Atlas, implement RAG, and deploy agents.

To get started, see Get Started with the Amazon Bedrock Knowledge Base Integration.

Vertex AI is a platform from Google Cloud for building and deploying AI applications and agents. The Vertex AI platform includes several tools and pre-trained models from Google that you can use with Atlas for RAG and other use cases such as natural language querying.

To get started, see Integrate Atlas with Google Vertex AI.

You can also integrate Atlas with the following AI tools.

Model Context Protocol (MCP) is an open standard for how LLMs connect to and interact with external resources and services. Use our official MCP Server implementation to interact with your Atlas data and deployments from your agentic AI tools, assistants, and platforms.

To learn more, see MongoDB MCP Server.
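MCP messages use the JSON-RPC 2.0 format. The sketch below shows the shape of a `tools/call` request an MCP client sends to a server; the tool name `find` and its arguments are hypothetical placeholders, not the MongoDB MCP Server's documented tool names.

```python
import json

# Shape of an MCP tool-call request (JSON-RPC 2.0). The tool name "find"
# and its arguments are hypothetical, for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "find",  # hypothetical tool name
        "arguments": {
            "database": "sample_mflix",
            "collection": "movies",
            "filter": {"year": 1999},
        },
    },
}
wire = json.dumps(request)  # serialized and sent to the MCP server
```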
