Streaming data using the Databricks Spark connector

When trying to execute the code from Streaming Data with Apache Spark and MongoDB | MongoDB, I receive an error message stating:
"org.apache.spark.SparkClassNotFoundException: [DATA_SOURCE_NOT_FOUND] Failed to find data source: mongodb. Please find packages at"

Any thoughts on what is going wrong here? The MongoDB instance is MongoDB Atlas, and the Spark engine runs on Databricks.
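For context, the failing call is a structured-streaming read of the kind the tutorial describes. A minimal sketch is below; the connection URI, database, and collection names are placeholders, not values from the tutorial:

```python
# Sketch of the streaming read that raises DATA_SOURCE_NOT_FOUND
# when the MongoDB Spark connector is not on the cluster classpath.
# Assumes a PySpark session on Databricks; URI and names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream_df = (
    spark.readStream
    .format("mongodb")  # this lookup fails if the connector jar is missing
    .option("spark.mongodb.connection.uri",
            "mongodb+srv://<user>:<password>@<cluster>/")
    .option("spark.mongodb.database", "sample_db")
    .option("spark.mongodb.collection", "sample_coll")
    .load()
)
```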

Hi Srinivasan,

Have you installed the MongoDB Spark Connector in your Databricks environment?

Here are the steps:
1. Once the cluster is up and running, click “Install New” from the Libraries menu.
2. There are several ways to add a library, including uploading a JAR file or pulling the Spark connector from Maven. In this example, we will use Maven and specify org.mongodb.spark:mongo-spark-connector_XXX: as the coordinates.
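On Databricks the Libraries UI above is the usual route, but when you control session creation yourself (for example via spark-submit) you can attach the connector through Spark configuration instead. A sketch, assuming placeholder Atlas credentials; the coordinate version shown is an example, so match it to your cluster's Scala and connector versions:

```python
# Attach the MongoDB Spark connector via spark.jars.packages.
# Only effective when this builder actually creates the session
# (e.g. spark-submit), not in a notebook where a session already exists.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.jars.packages",
            "org.mongodb.spark:mongo-spark-connector_2.12:10.1.1")  # example version
    .config("spark.mongodb.read.connection.uri",
            "mongodb+srv://<user>:<password>@<cluster>/")  # placeholder URI
    .getOrCreate()
)

# With the connector on the classpath, format("mongodb") resolves
# and a quick batch read verifies the installation:
df = (
    spark.read.format("mongodb")
    .option("database", "sample_db")      # placeholder names
    .option("collection", "sample_coll")
    .load()
)
df.printSchema()
```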

The remaining steps are described here: Exploring Data with MongoDB Atlas, Databricks, and Google Cloud | MongoDB Blog


This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.