I used these settings to create the Spark session (these config keys and the "mongo" format are for the MongoDB Spark Connector 2.x/3.x):
# Requires the MongoDB Spark Connector on the classpath,
# e.g. spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.12:3.0.1
spark = SparkSession.builder \
    .appName(appName) \
    .config("spark.mongodb.input.uri", "mongodb+srv://user:password@cluster.url.net/databasename?retryWrites=true&w=majority") \
    .config("spark.mongodb.output.uri", "mongodb+srv://user:password@cluster.url.net/databasename?retryWrites=true&w=majority") \
    .getOrCreate()
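One gotcha with these URIs: if the password contains reserved characters such as @, :, or /, the connection string will fail to parse. A small sketch of percent-encoding the credentials with Python's standard library before building the URI (the user/password values here are hypothetical):

```python
from urllib.parse import quote_plus

# Hypothetical credentials; '@' and '/' must be percent-encoded
# before they can appear inside a MongoDB connection URI.
user = quote_plus("user")
password = quote_plus("p@ss/word")

uri = (
    f"mongodb+srv://{user}:{password}@cluster.url.net/"
    "databasename?retryWrites=true&w=majority"
)
# uri -> mongodb+srv://user:p%40ss%2Fword@cluster.url.net/databasename?retryWrites=true&w=majority
```

You can then pass `uri` to both `spark.mongodb.input.uri` and `spark.mongodb.output.uri` in the builder above.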
And these for writing:
# Write a dataframe named df to MongoDB
df.write.format("mongo") \
    .option("spark.mongodb.output.collection", "collection_name") \
    .mode("append") \
    .save()
And these for reading:
# Read data from MongoDB into a dataframe
df = spark.read.format("mongo") \
    .option("spark.mongodb.input.collection", "collection_name") \
    .load()
df.printSchema()
df.show()
Hope this helps!