MongoDB Spark Connector

The following example creates a DataFrame from a JSON file and saves it to the MongoDB collection specified in SparkConf:

// Read the JSON file into a DataFrame
val df = spark.read.format("json").load("example.json")
// Write to the collection configured in SparkConf, replacing any existing data
df.write.format("mongodb").mode("overwrite").save()
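The snippet above assumes a SparkSession whose write settings were supplied via SparkConf. A minimal sketch of that setup follows; the URI, database, and collection names here are placeholders, not values from this page:

```scala
import org.apache.spark.sql.SparkSession

// Placeholder connection details — substitute your own deployment's values.
val spark = SparkSession.builder()
  .appName("MongoDBWriteExample")
  .config("spark.mongodb.write.connection.uri", "mongodb://127.0.0.1/")
  .config("spark.mongodb.write.database", "test")
  .config("spark.mongodb.write.collection", "example")
  .getOrCreate()
```

With these settings in place, `df.write.format("mongodb")` needs no per-call database or collection options.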

The MongoDB Connector for Spark supports the following save modes:

  • `append`

  • `overwrite`

To learn more about save modes, see the Spark SQL Guide.
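The two modes differ in how they treat documents already in the target collection. A brief sketch, assuming `df` is the DataFrame from the example above:

```scala
// Add df's rows as new documents, keeping existing documents in place
df.write.format("mongodb").mode("append").save()

// Replace the collection's contents with df's rows
df.write.format("mongodb").mode("overwrite").save()
```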

© 2023 MongoDB, Inc.
