MongoDB connection timeout with Databricks

I can connect using pymongo and Compass. I installed the library org.mongodb.spark:mongo-spark-connector_2.13:10.4.1 (the latest one), but I have never been able to connect to the same Mongo cluster (sharded) using the primary as default.

This is the Scala code (I have tested in Python as well):

```scala
val connstr = "mongodb://user:xxxxxxx@cluster/dbxxx?tls=true&tlsInsecure=true&authSource=admin"

val df = spark.read.format("mongodb")
  .option("database", "dbdbdbdbdb")
  .option("spark.mongodb.read.connection.uri", connstr)
  .option("collection", "cccccccccc")
  .load().limit(5)
```
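For what it's worth, the URI itself parses cleanly, so the problem doesn't look like a malformed connection string. A minimal sketch using only Python's standard library (with the same placeholder credentials, host, and database as above) to confirm which options the string actually carries:

```python
from urllib.parse import urlsplit, parse_qs

# Same placeholder URI as in the Scala snippet above.
uri = "mongodb://user:xxxxxxx@cluster/dbxxx?tls=true&tlsInsecure=true&authSource=admin"

parts = urlsplit(uri)
# Flatten the query string into a plain dict of option -> value.
opts = {k: v[0] for k, v in parse_qs(parts.query).items()}

print(opts)
# {'tls': 'true', 'tlsInsecure': 'true', 'authSource': 'admin'}
```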

Also, I can telnet to the cluster successfully.

MongoDB version: 7.0.14-8 Community

Any clues?