Databricks: Data, analytics, and AI on one platform
Deliver real-time analytics, a scaled-out data warehouse, and AI/ML-enhanced applications using MongoDB and Databricks.
Connect your Lakehouse to MongoDB Using a Databricks Notebook
Databricks now features MongoDB as a data source. Create a unified, real-time processing layer by integrating Databricks Lakehouse with MongoDB Atlas.
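A minimal sketch of reading an Atlas collection into a Spark DataFrame from a Databricks notebook, assuming the MongoDB Spark Connector (v10+, source name "mongodb") is installed on the cluster; the connection string, database, and collection names below are placeholders.

```python
# Build the option map the MongoDB Spark data source expects.
# All identifiers here (URI, database, collection) are placeholders.
def mongo_read_config(uri: str, database: str, collection: str) -> dict:
    """Options passed to spark.read for the MongoDB source (connector v10+)."""
    return {
        "connection.uri": uri,
        "database": database,
        "collection": collection,
    }

opts = mongo_read_config(
    "mongodb+srv://user:pass@cluster0.example.mongodb.net",  # placeholder URI
    "sales",
    "orders",
)

# In a Databricks notebook with the connector installed, the read would be:
# df = spark.read.format("mongodb").options(**opts).load()
# display(df)
```

Keeping the options in one place like this makes it easy to switch the same notebook between development and production clusters by swapping the connection string.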
MongoDB Spark Connector and Databricks
Make operationalizing ML-enhanced applications easy by leveraging Databricks' real-time data ingestion and processing capabilities for all types of data. Then activate those analytics in MongoDB by serving the results to user-facing applications.
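Serving results back to applications typically means writing model output from Spark into an Atlas collection. A sketch under the same assumptions as above (Spark Connector v10+ on the cluster; the database and collection names are illustrative):

```python
# Options for writing a DataFrame of model output back to MongoDB so
# operational applications can query it. Names below are illustrative.
def mongo_write_config(uri: str, database: str, collection: str) -> dict:
    """Options passed to df.write for the MongoDB source (connector v10+)."""
    return {
        "connection.uri": uri,
        "database": database,
        "collection": collection,
    }

opts = mongo_write_config(
    "mongodb+srv://user:pass@cluster0.example.mongodb.net",  # placeholder URI
    "ml",
    "churn_scores",
)

# In the notebook, append predictions to the serving collection:
# (predictions_df.write
#      .format("mongodb")
#      .mode("append")
#      .options(**opts)
#      .save())
```

Using "append" mode keeps earlier scores in place; "overwrite" would replace the collection's contents on each run.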
Import/Export Data via Object Store
With the MongoDB aggregation pipeline and $out, you can pre-process and transform data before exporting it in an analytics-optimized columnar format to object stores (such as Amazon S3) for seamless ingestion into Databricks. This supports bi-directional movement of large, targeted datasets between the two platforms.
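A sketch of such a pipeline: filter and pre-aggregate, then export the result to S3 as Parquet with $out. Writing $out to an S3 bucket requires an Atlas Data Federation instance with the bucket configured as a data store; the field, bucket, and file names below are illustrative.

```python
# Aggregation pipeline that pre-processes data in MongoDB and exports the
# result to S3 in a columnar (Parquet) format for ingestion by Databricks.
# Requires Atlas Data Federation; bucket and field names are illustrative.
pipeline = [
    # 1. Filter to the slice of data the Lakehouse actually needs.
    {"$match": {"status": "complete"}},
    # 2. Pre-aggregate before export to reduce transfer volume.
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
    # 3. Export the result to an S3 bucket as Parquet.
    {"$out": {
        "s3": {
            "bucket": "analytics-export",        # illustrative bucket name
            "filename": "orders/region_totals",  # illustrative object prefix
            "format": {"name": "parquet"},
        }
    }},
]

# Run against the federated collection, e.g. with PyMongo:
# db.orders.aggregate(pipeline)
```

Databricks can then read the exported files directly, e.g. spark.read.parquet("s3://analytics-export/orders/"), closing the loop from operational data to the Lakehouse.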
How to Integrate Databricks and MongoDB
Linking your MongoDB Atlas database to data stored in the Databricks Lakehouse has never been easier. Learn more about the Databricks MongoDB Notebook.