Build Event-Driven Apps Locally with MongoDB Atlas Stream Processing
July 1, 2025
Building event-driven architectures (EDAs) often poses challenges, particularly when you’re integrating complex cloud components with local development services. For developers, working directly from a local environment provides convenience, speed, and flexibility. Our demo application demonstrates a unique development workflow that balances local service integration with cloud stream processing, showcasing portable, real-time event handling using MongoDB Atlas Stream Processing and ngrok.
With MongoDB Atlas Stream Processing, you can streamline the development of event-driven systems while keeping all of the supporting components local. By pairing this service's capabilities with ngrok, the demo application shows a secure way to interact with cloud services directly from your laptop, so you can build, test, and refine applications with minimal friction and maximum efficiency.
Using MongoDB Atlas Stream Processing
MongoDB Atlas Stream Processing is a powerful feature of MongoDB Atlas that enables you to process data streams in real time using the familiar MongoDB Query API and aggregation pipeline syntax. It integrates seamlessly with MongoDB Atlas clusters, Apache Kafka, AWS Lambda, and external HTTP endpoints.
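For a concrete flavor of that syntax, here is a minimal sketch of a stream processor pipeline written as a Python list of aggregation stages; the connection, database, collection, and field names are illustrative assumptions, not the demo's actual configuration.

```python
# A minimal, hypothetical Atlas Stream Processing pipeline expressed as
# aggregation stages. All connection/database/collection/field names below
# are assumptions for illustration only.
pipeline = [
    # Read events from a source connection registered in the stream
    # processing instance (e.g., an Atlas cluster or a Kafka topic).
    {"$source": {"connectionName": "shopping_cart_source",
                 "db": "shop", "coll": "cart_events"}},
    # Keep only events that represent a checked-out cart.
    {"$match": {"fullDocument.status": "checked_out"}},
    # Write matching events to a collection in an Atlas cluster.
    {"$merge": {"into": {"connectionName": "order_cluster",
                         "db": "orders", "coll": "pending_orders"}}},
]
```

In mongosh, a pipeline like this would typically be registered against a stream processing instance with `sp.createStreamProcessor("order_router", pipeline)`.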
Key takeaway #1: Build event-driven apps more easily with MongoDB Atlas Stream Processing
One of the primary goals of MongoDB Atlas Stream Processing is to simplify the development of event-driven applications. Instead of managing separate stream processing clusters or complex middleware, you can define your processing logic directly within MongoDB Atlas. This means:
- A unified platform: Keep your data storage and stream processing within the same ecosystem.
- Familiar syntax: Use the MongoDB Query API and aggregation pipelines you already know.
- Managed infrastructure: Let MongoDB Atlas handle the underlying infrastructure, scaling, and availability for your stream processors.
Key takeaway #2: Develop and test locally, deploy globally
A significant challenge in developing event-driven systems is bridging the gap between your local development environment and cloud-based services. How do you test interactions with services running on your laptop? You can configure MongoDB Atlas Stream Processing to connect securely to HTTP services and even Apache Kafka instances running directly on your development machine!
You can typically achieve this using a tunneling service like ngrok, which creates secure, publicly accessible URLs for your local services. MongoDB Atlas Stream Processing requires HTTPS for HTTP endpoints and specific Simple Authentication and Security Layer protocols for Apache Kafka, making ngrok an essential tool for this local development workflow.
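As a rough sketch of that tunneling step, the snippet below uses the pyngrok package to open tunnels for a hypothetical local HTTP service and a local Kafka broker; the demo documents its own ngrok setup in the README.md, so treat the ports and the use of pyngrok here as assumptions.

```python
# Sketch: expose local services through ngrok tunnels with pyngrok
# (assumption: the demo may instead drive the ngrok CLI directly).
from pyngrok import ngrok

# HTTPS tunnel for a local order-processing HTTP service on port 8000.
http_tunnel = ngrok.connect(8000, "http")
print("Public HTTPS URL for the local service:", http_tunnel.public_url)

# TCP tunnel for a local Kafka broker on port 9092; the Kafka connection
# in Atlas Stream Processing still needs a suitable SASL configuration.
kafka_tunnel = ngrok.connect(9092, "tcp")
print("Public TCP address for the local Kafka broker:", kafka_tunnel.public_url)
```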
Introducing the real-time order fulfillment demo
To showcase these capabilities in action, we’ve built a full-fledged demo application available on GitHub.

This demo simulates a real-time order fulfillment process using an event-driven architecture orchestrated entirely by MongoDB Atlas Stream Processing.
What the demo features
- A shopping cart service: Generates events when cart items change (a hypothetical event is sketched after this list).
- An order processing service: Handles order creation and validation (running locally as an HTTP service).
- A shipment service: Manages shipment updates.
- Event source flexibility: Can ingest events from either a MongoDB capped collection or an Apache Kafka topic (which can also run locally).
- Processors from Atlas Stream Processing: Act as the central nervous system, reacting to events and triggering actions in the different services.
- An order history database: Centralizes status updates for easy tracking.
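To make the event-source option concrete, here is a minimal sketch of writing a hypothetical shopping cart event into a MongoDB capped collection with PyMongo; the demo's shopping_cart_event_generator.py defines the real event shape, so every name below is an assumption.

```python
# Sketch: emit a hypothetical shopping cart event into a capped collection
# that a stream processor can read from. Names and fields are assumptions.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
db = client["shop"]

# Create the capped collection once; size is the maximum bytes it retains.
if "cart_events" not in db.list_collection_names():
    db.create_collection("cart_events", capped=True, size=1024 * 1024)

db["cart_events"].insert_one({
    "cart_id": "cart-123",
    "event_type": "item_added",
    "item": {"sku": "SKU-42", "qty": 2},
    "created_at": datetime.now(timezone.utc),
})
```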

How the demo uses MongoDB Atlas Stream Processing and local development
- Event orchestration: MongoDB Atlas Stream Processing instances listen for shopping cart events (from MongoDB or Kafka).
- Local service interaction: An Atlas Stream Processing (ASP) processor calls the Order Processing Service running locally on localhost via an ngrok HTTPS tunnel (a minimal service sketch follows this list).
- Kafka integration (optional): Demonstrates ASP connecting to a local Kafka broker, also tunneled via ngrok.
- Data enrichment & routing: Processors enrich events and route them appropriately (e.g., validating orders, triggering shipments).
- Centralized logging: All services write status updates to a central MongoDB collection that functions as a continuously materialized view of order status and history.
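As a rough idea of the locally running HTTP piece, here is a minimal Flask endpoint that accepts an order event; the framework, route, port, and payload handling are assumptions rather than the demo's actual order_processing_service.py.

```python
# Sketch: a minimal local order-processing HTTP service (assumed Flask app;
# the demo's order_processing_service.py may differ in framework and routes).
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/orders", methods=["POST"])
def process_order():
    event = request.get_json(force=True)
    # Hypothetical validation: require a cart_id before accepting the order.
    if not event or "cart_id" not in event:
        return jsonify({"status": "rejected", "reason": "missing cart_id"}), 400
    # The real service would create and validate the order here.
    return jsonify({"status": "accepted", "cart_id": event["cart_id"]}), 200

if __name__ == "__main__":
    # ngrok can then expose http://localhost:8000 as a public HTTPS URL.
    app.run(port=8000)
```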
This demo practically illustrates how you can build sophisticated, event-driven applications using ASP while performing key development and testing directly on your laptop, interacting with local services just as you would in a deployed environment.
What the demo highlights
- Real-world EDA: Provides a practical example of asynchronous service communication.
- Orchestration powered by MongoDB Atlas Stream Processing: Shows how this service manages complex event flows.
- Local development workflow: Proves the concept of connecting this service to local HTTP and Apache Kafka services via ngrok.
- Flexible event ingestion: Supports both MongoDB and Apache Kafka sources.
- Centralized auditing: Demonstrates easy status tracking via a dedicated history collection.
Get started with the demo!
MongoDB Atlas Stream Processing significantly lowers the barrier to entry for building robust, real-time EDAs. Its ability to integrate seamlessly with MongoDB Atlas, external services, and, crucially, your local development environment (thanks to tools like ngrok) makes it a powerful addition to the developer toolkit.
Explore the demo project, dive into the code, and see for yourself how ASP can simplify your next event-driven architecture, starting right from your own laptop!
Ready to see it in action? Head over to the GitHub repository!
The repository’s README.md file contains comprehensive, step-by-step instructions to get you up and running. In summary, you’ll:
- Clone the repository.
- Set up a Python virtual environment and install dependencies.
- Crucially, set up ngrok to expose your local order-processing service (and Apache Kafka, if applicable) via secure tunnels. (Details in the README.md appendix!)
- Configure your .env file with MongoDB Atlas credentials, API keys, and the ngrok URLs.
- Run scripts to create the necessary databases, collections, and the MongoDB Atlas Stream Processing instance, connections, and processors.
- Start the local order_processing_service.py.
- Run the shopping_cart_event_generator.py to simulate events.
- Query the order history to see the results (a minimal query sketch follows this list)!
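For that last step, a PyMongo query along these lines shows the accumulated status updates; the database, collection, and field names are assumptions, and the repository may include its own query script.

```python
# Sketch: read back order status history with PyMongo (names are assumed).
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
history = client["orders"]["order_history"]

# Latest status updates for a single hypothetical order, newest first.
for doc in history.find({"order_id": "order-123"}).sort("updated_at", -1):
    print(doc["updated_at"], doc["status"])
```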
For detailed setup guidance, especially regarding ngrok configuration for multiple local services (HTTP and TCP / Apache Kafka), please refer to the appendix of the project's README.md.