
Connectors

MongoDB Connector for Apache Kafka

The quickest way to connect Apache Kafka to MongoDB and Atlas, so you can work with your data and manage your data platform.
Download Now

Configure your connection

The official MongoDB Connector for Apache® Kafka® is developed and supported by MongoDB engineers and verified by Confluent. The Connector enables MongoDB to be configured as both a sink and a source for Apache Kafka.

Easily build robust, reactive data pipelines that stream events between applications and services in real time.
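For a concrete picture, here is a minimal sketch of registering the connector as a source through the Kafka Connect REST API. It assumes a Connect worker on localhost:8083 with the connector plugin installed; the database, collection, and connection details below are placeholders, not defaults.

    import requests

    # Hypothetical source-connector definition: stream changes from the
    # inventory.orders collection into Kafka topics prefixed with "mongo".
    source = {
        "name": "mongo-source",
        "config": {
            "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
            "connection.uri": "mongodb://localhost:27017",
            "database": "inventory",
            "collection": "orders",
            "topic.prefix": "mongo",               # topics become mongo.inventory.orders
            "publish.full.document.only": "true",  # emit the document, not the whole change event
        },
    }

    # Kafka Connect exposes a REST API for creating and managing connectors.
    resp = requests.post("http://localhost:8083/connectors", json=source)
    resp.raise_for_status()

Once registered, the connector watches the collection's change stream and publishes each insert or update to the corresponding Kafka topic.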

Why MongoDB and Apache Kafka?

MongoDB and Kafka are at the heart of modern data architectures. Kafka is designed for boundless streams of data, sequentially writing events to commit logs so they can move between your services in real time.
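As a minimal sketch of that model (the broker address, topic name, and payload are placeholders), a service appends an event to a topic's log, and every downstream consumer, including the MongoDB sink connector, reads the same events in the same order:

    import json
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "localhost:9092"})

    # Events are appended to the topic's commit log in order; consumers
    # (such as the MongoDB sink connector) read them back sequentially.
    event = {"order_id": 1001, "status": "shipped"}
    producer.produce("orders", key="1001", value=json.dumps(event))

    producer.flush()  # block until the broker acknowledges the write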

Configure as a Sink

Map and persist events from Kafka topics directly to MongoDB collections with ease. Ingested events are exposed to your services for efficient querying, enrichment, and analytics.
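A sink can be sketched the same way as the source example above; the topic, database, and collection names are again placeholders:

    import requests

    # Hypothetical sink-connector definition: persist events from the
    # "orders" topic into the analytics.events collection.
    sink = {
        "name": "mongo-sink",
        "config": {
            "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
            "connection.uri": "mongodb://localhost:27017",
            "topics": "orders",
            "database": "analytics",
            "collection": "events",
        },
    }

    resp = requests.post("http://localhost:8083/connectors", json=sink)
    resp.raise_for_status()

Each record arriving on the orders topic is then written as a document to analytics.events, where it can be queried like any other collection.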

Why MongoDB?

MongoDB customers across a wide range of industries have seen success with the Kafka Connector in a variety of use cases.

eCommerce and Customer Single View

ao.com, a leading online electrical retailer, uses Kafka to push all data changes from its source databases to MongoDB Atlas. This creates a single source of truth for all customer data to drive new and enhanced applications and business processes including customer service, fraud detection, and GDPR compliance. Employees with appropriate permissions can access customer data from one easy-to-consume operational data layer.


IoT

Josh Software, part of a project in India to house more than 100,000 people in affordable smart homes, pushes data from millions of sensors to Kafka, processes it in Apache Spark, and writes the results to MongoDB, which connects the operational and analytical data sets. By streaming sensor data in near real time, the project is creating truly smart homes, and citizens can access the data via a mobile app to better manage their homes.


Financial Services

AHL, a subsidiary of The Man Group, one of the world’s largest hedge fund investment firms, used MongoDB to create a single platform for all of its financial data. The system receives up to 150,000 ticks per second from multiple financial sources and writes them to Kafka. Kafka provides both consolidation and buffering of events before they are stored in MongoDB, where the data can be analyzed.


Customer Journey

comparethemarket.com, a leading price comparison provider, uses MongoDB as the default operational database across its microservices architecture. While each microservice uses its own MongoDB database, the company needs to maintain synchronization between services, so every application event is written to a Kafka topic. Relevant events are written to MongoDB to enable real-time personalization and optimize the customer experience.


Opinion and Polling

State, an intelligent opinion network connecting people with similar beliefs, writes survey data to MongoDB and leverages MongoDB Change Streams to push database changes into Kafka topics where they are consumed by its user recommendation engine. This engine suggests potentially interesting users and updates instantly as soon as a user contributes a new opinion.
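The source connector automates exactly this pattern, but the underlying mechanics can be sketched by hand with PyMongo's change streams feeding a Kafka producer. The database, collection, and topic names here are hypothetical, not State's actual configuration:

    import json
    from confluent_kafka import Producer
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    producer = Producer({"bootstrap.servers": "localhost:9092"})

    # Only forward changes that carry a document (inserts, updates, replaces).
    pipeline = [{"$match": {"operationType": {"$in": ["insert", "update", "replace"]}}}]

    # Watch the collection's change stream and push each changed document
    # to a Kafka topic, requesting the full document for updates as well.
    with client["surveys"]["opinions"].watch(pipeline, full_document="updateLookup") as stream:
        for change in stream:
            doc = change["fullDocument"]
            producer.produce("opinion-events", value=json.dumps(doc, default=str))
            producer.poll(0)  # serve delivery callbacks without blocking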


Ready to get started?

Get the MongoDB Connector for Apache Kafka.
Try It Now
Contact sales