Hi, I am unable to connect to a MongoDB Atlas cluster from Databricks. I have enabled "Allow access from anywhere" in the Network Access section, and I am able to connect to MongoDB from a MongoDB Compass client on my machine, so I am not sure why it is not working from Databricks. Below is the code I am using; I have installed the required library as specified in this url:
val connectionString1 = "mongodb+srv://phanivyr:MyPassword@cluster0.awdeasr.mongodb.net/sample_airbnb?retryWrites=true&w=majority&appName=Cluster0"
val database = "sample_airbnb"
val collection = "listingsAndReviews"

val df = spark.read
  .format("mongodb")
  .option("database", database)
  .option("spark.mongodb.input.uri", connectionString1)
  .option("collection", collection)
  .load()

df.printSchema()
As this looks like a Databricks issue, I would recommend reaching out to the Databricks public forum for more accurate information.
Having said that, could you share the error message that you are seeing while making the connection?
MongoTimeoutException: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused (Connection refused)}}]
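One thing stands out in that stack trace: the driver is trying to reach localhost:27017, not the Atlas cluster, which suggests the connection string is never being picked up. `spark.mongodb.input.uri` is the option key for the legacy 3.x connector; the 10.x connector expects `connection.uri` (or `spark.mongodb.read.connection.uri`), so with the old key it falls back to the default localhost. A minimal sketch of the corrected read, with `<user>` and `<password>` as placeholders and assuming a Databricks cluster that already has mongo-spark-connector 10.x installed:

```scala
// Sketch only: <user>/<password> are placeholders; run inside a Spark session
// (e.g. a Databricks notebook) with mongo-spark-connector 10.x on the classpath.
val connectionString = "mongodb+srv://<user>:<password>@cluster0.awdeasr.mongodb.net/sample_airbnb?retryWrites=true&w=majority&appName=Cluster0"

val df = spark.read
  .format("mongodb")
  .option("connection.uri", connectionString) // 10.x key; "spark.mongodb.input.uri" is the 3.x key
  .option("database", "sample_airbnb")
  .option("collection", "listingsAndReviews")
  .load()

df.printSchema()
```

If the URI is picked up correctly, the cluster state in any subsequent error should at least show the Atlas hostnames rather than localhost:27017.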
@Phani_Kanakamedala @Aasawari any update here? I am getting the same issue when trying to connect to my MongoDB from Databricks. I did everything based on the documentation, but I am still getting the same error.
Note: My Databricks compute configuration is as below
Spark: 3.5.3
scala - 2.13
mongo-connector: org.mongodb.spark:mongo-spark-connector_2.13:10.4.0 (installed from Maven Central)
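With the 10.x connector you can also set the URI once at the cluster level instead of per-read, which avoids the option-key confusion entirely. In Databricks this goes under the cluster's Spark config (Advanced options); roughly like this, with placeholder credentials:

```
spark.mongodb.read.connection.uri mongodb+srv://<user>:<password>@cluster0.awdeasr.mongodb.net/sample_airbnb
spark.mongodb.write.connection.uri mongodb+srv://<user>:<password>@cluster0.awdeasr.mongodb.net/sample_airbnb
```

With these set, `spark.read.format("mongodb")` only needs the `database` and `collection` options.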