
Docs Home → View & Analyze Data → MongoDB Spark Connector

Batch Mode


In batch mode, you can use the Spark Dataset and DataFrame APIs to process data at a specified time interval.

The following sections show you how to use the Spark Connector to read data from MongoDB and write data to MongoDB in batch mode:

  • Read from MongoDB in Batch Mode
  • Write to MongoDB in Batch Mode
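As a minimal sketch of a batch job, the example below reads a collection into a DataFrame and writes the result back out in a single batch. It assumes a locally running MongoDB deployment and a Spark session launched with the connector package (for example, via `spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.12:<version>`); the host, database, and collection names are placeholders.

```python
# Sketch: batch read from and write to MongoDB with the Spark Connector.
# The URI, database, and collection names below are placeholders.

def mongo_uri(host: str, database: str, collection: str) -> str:
    """Build a MongoDB connection URI for the given namespace."""
    return f"mongodb://{host}/{database}.{collection}"

def run_batch_job():
    # Requires pyspark and the MongoDB Spark Connector JAR on the classpath.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("batch-example")
        .config("spark.mongodb.read.connection.uri",
                mongo_uri("localhost:27017", "test", "source"))
        .config("spark.mongodb.write.connection.uri",
                mongo_uri("localhost:27017", "test", "sink"))
        .getOrCreate()
    )

    # Read the source collection into a DataFrame in one batch.
    df = spark.read.format("mongodb").load()

    # ... transform df with the DataFrame API ...

    # Write the result back to MongoDB in one batch.
    df.write.format("mongodb").mode("append").save()

    spark.stop()
```

The `spark.mongodb.read.connection.uri` and `spark.mongodb.write.connection.uri` settings configure the read and write connections separately, so a job can read from one namespace and write to another.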
Apache Spark Documentation

To learn more about using Spark to process batches of data, see the Spark Programming Guide.
