MongoDB Spark Connector
The following example creates a DataFrame from a JSON file and saves it to the MongoDB collection specified in the SparkConf:
val df = spark.read.format("json").load("example.json")
df.write.format("mongodb").mode("overwrite").save()
The MongoDB Connector for Spark supports the following save modes:
append
overwrite
To learn more about save modes, see the Spark SQL Guide in the Apache Spark documentation.
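As an illustration of the append mode, the sketch below adds new documents to the target collection instead of replacing it. It assumes a SparkSession (`spark`) whose SparkConf already specifies the MongoDB connection URI, database, and collection, and a hypothetical input file `more-data.json`:

// Read additional records and append them to the collection
// configured in the SparkConf, leaving existing documents in place.
val newDocs = spark.read.format("json").load("more-data.json")
newDocs.write
  .format("mongodb")
  .mode("append")
  .save()

By contrast, `mode("overwrite")`, as in the first example, replaces the existing contents of the target collection with the DataFrame being saved.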