Dear Community,
We are evaluating the MongoDB Spark Connector, version 10.1.1, to stream data into Spark, but could not find options for the scenarios below and would appreciate your suggestions. We are using PySpark on Databricks with Structured Streaming.
1. How to stream data from multiple collections of a database, e.g.:
   `.option("spark.mongodb.read.collection", "collection1,collection2,...,collectionN")`
2. How to stream data from multiple databases, e.g.:
   `.option("spark.mongodb.read.database", "DB1,DB2,...,DBn")`
3. How to read the existing data of a collection first and then start streaming, similar to a `copy.existing` option, which would copy the existing data first and then stream the changes.
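For context, here is a minimal sketch of how we read a single collection today with the 10.x connector; the connection URI, database, and collection names are placeholders. We are looking for the multi-collection and multi-database equivalents of this:

```python
# Minimal single-collection streaming read with the MongoDB Spark
# Connector 10.x. URI, database, and collection names are placeholders;
# requires the mongo-spark-connector package on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mongo-stream").getOrCreate()

stream_df = (
    spark.readStream.format("mongodb")
    .option("spark.mongodb.connection.uri", "mongodb+srv://<user>:<password>@<cluster>/")
    .option("spark.mongodb.read.database", "DB1")
    .option("spark.mongodb.read.collection", "collection1")
    .load()
)

# Write the change stream somewhere (console here, just for testing).
query = (
    stream_df.writeStream.format("console")
    .option("checkpointLocation", "/tmp/mongo-stream-checkpoint")
    .start()
)
```

Ideally we could pass a list (or comma-separated string) of collections or databases in the `.option(...)` calls above, but the documentation for 10.1.1 does not seem to mention this.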
Thanks in anticipation!
- Ravi