MongoDB Support for LangGraph.js Long-Term Memory

May 8, 2026

What it is: LangGraph.js now supports MongoDB as the backend for long-term agent memory, adding to the short-term memory (Checkpointers) already available. The MongoDB Memory Store stores and retrieves cross-session data, with support for semantic memory search powered by either a client-side embeddings provider or MongoDB Atlas Automated Embeddings, which generates and indexes vector embeddings server-side via Voyage AI models.
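Semantic memory search follows LangGraph's familiar store pattern: write memories under a namespace with `put`, then retrieve them by meaning with `search`. The sketch below is a self-contained, in-memory illustration of that retrieval behavior only; `ToyStore`, the character-frequency `embed` function, and the sample data are all stand-ins invented for this example, not the MongoDB Memory Store API (a real deployment would use the MongoDB-backed store with a genuine embeddings provider or Atlas Automated Embeddings).

```typescript
// Illustrative only: a toy in-memory store mimicking the put/search shape
// of LangGraph memory stores, with a fake embedding to show how semantic
// search ranks memories by similarity to the query.

type Item = {
  namespace: string[];
  key: string;
  value: { text: string };
  vector: number[];
};

// Toy "embedding": a 26-bucket character-frequency vector. A real setup
// would use dense semantic vectors from an embeddings model instead.
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) v[i] += 1;
  }
  return v;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

class ToyStore {
  private items: Item[] = [];

  // Save a memory under a hierarchical namespace (e.g., per user).
  put(namespace: string[], key: string, value: { text: string }): void {
    this.items.push({ namespace, key, value, vector: embed(value.text) });
  }

  // Return the memories in the namespace most similar to the query.
  search(namespace: string[], query: string, limit = 2) {
    const q = embed(query);
    return this.items
      .filter((it) => namespace.every((seg, i) => it.namespace[i] === seg))
      .map((it) => ({ ...it, score: cosine(it.vector, q) }))
      .sort((x, y) => y.score - x.score)
      .slice(0, limit);
  }
}

const store = new ToyStore();
store.put(["users", "u1"], "m1", { text: "prefers vegetarian restaurants" });
store.put(["users", "u1"], "m2", { text: "works as a pilot" });

const hits = store.search(["users", "u1"], "vegetarian food", 1);
console.log(hits[0].key); // "m1" — the dining-preference memory ranks first
```

The point of the example is the access pattern: agents write memories as they learn about a user, and later turns query by meaning rather than exact keywords, so "vegetarian food" surfaces the restaurant-preference memory even though the wording differs.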

Who it's for: JavaScript and TypeScript developers building LangGraph agents who want a unified database for conversation history, long-term memory, and semantic search.

Why it matters: MongoDB is now a first-class option at every layer of LangGraph.js memory. Teams already running on MongoDB can now keep agent memory in the same database as their operational data: no additional infrastructure for conversation state, long-term storage, or vector search. Semantic memory search lets agents retrieve memories based on meaning, surfacing past context that matches the current conversation. Automated Embeddings removes the last piece of friction: instead of provisioning and calling a separate embedding service, MongoDB handles vectorization server-side, keeping application code focused on agent logic.
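To make the server-side vectorization concrete: with Automated Embeddings, the Atlas Vector Search index itself declares which text field to embed and which Voyage AI model to use, so the application never calls an embedding API. The definition below is an illustrative sketch only; the field path and model name are assumptions, and the exact schema should be taken from the Atlas Vector Search documentation.

```json
{
  "fields": [
    {
      "type": "text",
      "path": "content",
      "model": "voyage-3-large"
    }
  ]
}
```

With an index like this in place, inserted documents are embedded and indexed automatically, and semantic queries against the collection are vectorized server-side as well.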

How to get started: See the LangGraph.js Memory documentation for step-by-step examples covering MongoDB checkpointers, stores, and semantic search, or head directly to our official tutorial.

Related Content

Web

Memory - Docs by LangChain

Tutorial

Add Long-Term Memory to LangGraph.js Agents with MongoDB Atlas

Tutorial

How to Automatically Generate Vector Embeddings for Text Data in Your Collection and Queries