In batch mode, you can use the Spark Dataset and DataFrame APIs to process data at a specified time interval.
The following sections show you how to use the Spark Connector to read data from MongoDB and write data to MongoDB in batch mode.
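As a minimal sketch of a batch read and write, the example below uses the Spark Connector's `mongodb` data source with PySpark. The connection URI, database name (`sample_db`), and collection names (`source_coll`, `target_coll`) are placeholder assumptions; running it requires a Spark session configured with the MongoDB Spark Connector package and access to a MongoDB deployment.

```python
from pyspark.sql import SparkSession

# Assumes the MongoDB Spark Connector is on the classpath, e.g. started with:
# spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.12:10.3.0 ...
spark = SparkSession.builder.appName("batch-example").getOrCreate()

# Read an entire collection into a DataFrame (batch read).
# The URI, database, and collection names here are illustrative placeholders.
df = (
    spark.read.format("mongodb")
    .option("connection.uri", "mongodb://localhost:27017")
    .option("database", "sample_db")
    .option("collection", "source_coll")
    .load()
)

# Transform with the DataFrame API, then write the result back (batch write).
(
    df.filter(df["status"] == "active")  # assumes a "status" field exists
    .write.format("mongodb")
    .option("connection.uri", "mongodb://localhost:27017")
    .option("database", "sample_db")
    .option("collection", "target_coll")
    .mode("append")
    .save()
)
```

Because this is a batch job, the read captures the collection's contents at the time `load()` runs; rerun the job (for example, on a schedule) to process data at a chosen interval.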
Apache Spark Documentation
To learn more about using Spark to process batches of data, see the Spark Programming Guide.