
MongoDB With Bedrock Agent: Quick Tutorial

Pavel Duchovny6 min read • Published Jul 01, 2024 • Updated Jul 01, 2024
MongoDB Atlas and Amazon Bedrock have joined forces to streamline the development of generative AI applications through their seamless integration. MongoDB Atlas, a robust cloud-based database service, now offers native support for Amazon Bedrock, AWS's managed service for generative AI. This integration leverages Atlas's vector search capabilities, enabling enterprise data to effectively augment the foundational models provided by Bedrock, such as Anthropic's Claude and Amazon's Titan. The combination ensures that generative AI models have access to the most relevant and up-to-date data, significantly improving the accuracy and reliability of AI-driven applications built with MongoDB.
This integration simplifies the workflow for developers aiming to implement retrieval-augmented generation (RAG). RAG helps mitigate hallucinations in AI models by allowing them to fetch and use specific data from a predefined knowledge base. In this case, MongoDB Atlas developers can easily set up this workflow by creating a vector search index in Atlas, which stores the vector embeddings and metadata of the text data. This setup not only enhances the performance and reliability of AI applications but also ensures data privacy and security through features like AWS PrivateLink.
This notebook demonstrates how to interact with a predefined agent using AWS Bedrock in a Google Colab environment. It utilizes the boto3 library to communicate with the AWS Bedrock service and allows you to input prompts and receive responses directly within the notebook.

Key features

  1. Secure handling of AWS credentials: The getpass module is used to securely enter your AWS Access Key and Secret Key.
  2. Session management: Each session is assigned a random session ID to maintain continuity in conversations.
  3. Agent invocation: The notebook sends user prompts to a predefined agent and streams the responses back to the user.


Prerequisites

  • AWS Access Key and Secret Key with appropriate permissions
  • Boto3 and Requests libraries for interacting with AWS services and fetching data from URLs

Setting up MongoDB Atlas

  1. Follow the getting started with Atlas guide and set up your cluster, allowing network access from the environment running this notebook.
  2. Predefine an Atlas Vector Search index on the bedrock database's agenda collection. This collection will host the AWS Summit agenda data and serve as the context store for the agent:
Index name: vector_index
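A minimal index definition for this setup might look as follows. This is a sketch: the 1024 dimensions assume the default output size of Titan Text Embeddings v2, and the filter fields match the field paths Bedrock expects later in the knowledge base wizard.

```json
{
  "fields": [
    {
      "type": "vector",
      "path": "embedding",
      "numDimensions": 1024,
      "similarity": "cosine"
    },
    { "type": "filter", "path": "metadata" },
    { "type": "filter", "path": "text" }
  ]
}
```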

Set up AWS Bedrock

We will use the us-east-1 AWS region for this notebook.
Follow our official tutorial to enable a Bedrock knowledge base against the created database and collection in MongoDB Atlas. This guide highlights the steps to build the knowledge base and agent.
For this notebook, we will perform the following tasks according to the guide:
Go to the Bedrock console and enable:
  • Amazon Titan Text Embedding model (amazon.titan-embed-text-v2:0)
  • Claude 3 Sonnet model (the LLM)
Upload the following source data about the AWS Summit agenda to your S3 bucket:
This will be our source data listing the events happening at the Summit.
Go to Secrets Manager in the AWS console and create credentials for our Atlas cluster via "Other type of secret":
  • key: username, value: <ATLAS_USERNAME>
  • key: password, value: <ATLAS_PASSWORD>
Follow the setup of the knowledge base wizard to connect Bedrock models with Atlas:
  • Click "Create Knowledge Base" and input:
    ◦ Choose "Create and use a new service role"
    ◦ Data source name: <NAME>
    ◦ S3 URI: browse for the S3 bucket hosting the two uploaded source files
    ◦ Embedding model: Titan Text Embeddings v2
  • In the "Vector database" section, select "Choose a vector store you have created" and pick MongoDB Atlas:
    ◦ Select your vector store: MongoDB Atlas
    ◦ Hostname: your Atlas SRV hostname
    ◦ Database name: bedrock
    ◦ Collection name: agenda
    ◦ Credentials secret ARN: the ARN of the secret created in Secrets Manager
    ◦ Vector search index name: vector_index
    ◦ Vector embedding field path: embedding
    ◦ Text field path: text
    ◦ Metadata field path: metadata
Click “Next,” review the details, and click "Create Knowledge Base."
Once the knowledge base shows "Status: Ready," go to the “Data source” section, choose the single data source we created, and click "Sync" in the upper-right corner. If everything was set up correctly, this operation loads the data into Atlas.

Setting up an agenda agent

We can now set up our agent, which will work with a set of instructions and our knowledge base.
  1. Go to the "Agents" tab in the Bedrock UI.
  2. Click "Create Agent" and give it a meaningful name (e.g., agenda_assistant).
  3. Input the following data in the agent builder:
  • Agent name: agenda_assistant
  • Agent resource role: Create and use a new service role
  • Select model: Anthropic - Claude 3 Sonnet
  • Instructions for the agent: "You are a friendly AI chatbot that helps users find and build agenda items for AWS Summit Tel Aviv. Elaborate as much as possible in your responses."
  • Knowledge bases: choose your knowledge base
  • Aliases: create a new alias
And now, we have a functioning agent that can be tested via the console. Let's move to the notebook.
Take note of the Agent ID and create an Agent Alias ID for the notebook.

Interacting with the agent

To interact with the agent, we need to install the AWS Python SDK:
Let's place the credentials for our AWS account.
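A minimal sketch of collecting the keys securely with the getpass module; the environment-variable fallback is a convenience added here so the cell can also run non-interactively, not part of the original flow:

```python
import os
import getpass

def get_aws_credentials():
    """Collect the AWS Access Key and Secret Key without echoing them.

    Environment variables take precedence over interactive prompts.
    """
    access_key = os.environ.get("AWS_ACCESS_KEY_ID") or getpass.getpass(
        "Enter your AWS Access Key: "
    )
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY") or getpass.getpass(
        "Enter your AWS Secret Key: "
    )
    return access_key, secret_key
```

Calling `get_aws_credentials()` in a notebook cell prompts for both values and returns them as a tuple.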
Now, we need to initialize the boto3 client and provide the Agent ID and Alias ID.
Let's build the helper function to interact with the agent.
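One way to write such a helper (a sketch: the chunk handling follows boto3's invoke_agent event stream, and the client is passed in explicitly so the function stays self-contained):

```python
import uuid

def invoke_agent(client, agent_id, agent_alias_id, prompt, session_id=None):
    """Send a prompt to a Bedrock agent and collect the streamed response.

    `client` is a boto3 "bedrock-agent-runtime" client.
    """
    # A random session ID maintains conversational continuity across calls.
    session_id = session_id or str(uuid.uuid4())
    response = client.invoke_agent(
        agentId=agent_id,
        agentAliasId=agent_alias_id,
        sessionId=session_id,
        inputText=prompt,
    )
    # The agent streams its answer back as a sequence of chunk events.
    text = ""
    for event in response["completion"]:
        if "chunk" in event:
            text += event["chunk"]["bytes"].decode("utf-8")
    return text
```

You can then call invoke_agent with your "bedrock-agent-runtime" client, the Agent ID and Alias ID noted earlier, and a user prompt, and print the returned text.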
We can now interact with the agent using the application code.
Here you go! You now have a powerful Bedrock agent backed by MongoDB Atlas. You can run this code via the following interactive notebook.


Conclusion

The integration of MongoDB Atlas with Amazon Bedrock represents a significant advancement in the development and deployment of generative AI applications. By leveraging Atlas's vector search capabilities and the powerful foundational models available through Bedrock, developers can create applications that are both highly accurate and deeply informed by enterprise data. This seamless integration facilitates the retrieval-augmented generation (RAG) workflow, enabling AI models to access and utilize the most relevant data, thereby reducing the likelihood of hallucinations and improving overall performance.
The benefits of this integration extend beyond just technical enhancements. It also simplifies the generative AI stack, allowing companies to rapidly deploy scalable AI solutions with enhanced privacy and security features, such as those provided by AWS PrivateLink. This makes it an ideal solution for enterprises with stringent data security requirements. Overall, the combination of MongoDB Atlas and Amazon Bedrock provides a robust, efficient, and secure platform for building next-generation AI applications.
If you have questions or want to share your work with other developers, visit us in the MongoDB Developer Community.
