Connect to MongoDB Atlas through SSH Tunnel from Spark

Hi, I can connect to MongoDB Atlas through an SSH tunnel using MongoDB Compass, but when I try to connect from pyspark, setting up the tunnel with the sshtunnel library, it fails.

Without the tunnel I can connect fine, but not through it. The tunnel itself works with other data sources such as MySQL, but it doesn't work when I try to connect to Atlas.

My code is:

    import paramiko
    from sshtunnel import SSHTunnelForwarder

    # Open an SSH tunnel forwarding a local port to the Atlas host
    mypkey = paramiko.RSAKey.from_private_key_file(ssh_file)
    tunnel = SSHTunnelForwarder(
        (ssh_host, int(ssh_port)),
        ssh_username=ssh_user,
        ssh_pkey=mypkey,
        remote_bind_address=(jdbcHostname, int(jdbcPort)),
    )
    tunnel.start()

    # Connection string passed to the Spark MongoDB connector
    jdbcUrl = "mongodb+srv://{0}:{1}@{2}/?retryWrites=true&w=majority".format(
        jdbcUsername, jdbcPassword, jdbcHostname
    )

The error message is a timeout. It seems the driver never finds the redirected local port, but mongodb+srv does not allow specifying a port:

mongodb+srv://***:***@mongodb_host/?retryWrites=true&w=majority
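For comparison, a plain mongodb:// URI (without +srv) does accept an explicit host and port, so in principle it could point at the tunnel's local end. This is only a hypothetical sketch, not code from my setup: the helper function and its arguments are illustrative, and `local_bind_port` is the attribute sshtunnel exposes for the forwarded local port.

```python
# Hypothetical sketch: build a non-SRV URI targeting the tunnel's local end.
# Unlike mongodb+srv://, a mongodb:// URI may carry an explicit host:port.
def local_uri(user, password, local_port):
    # directConnection=true asks the driver not to discover other replica-set
    # members, whose addresses would bypass the tunnel (assumption on my part).
    return "mongodb://{0}:{1}@localhost:{2}/?directConnection=true".format(
        user, password, local_port
    )

# In practice the port would come from the running tunnel, e.g.
# local_uri(jdbcUsername, jdbcPassword, tunnel.local_bind_port)
print(local_uri("user", "pass", 27017))
```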

com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.ConnectException: Connection refused (Connection refused)}}]

Any help would be appreciated. Thanks in advance!