Unable to read from Kafka and publish to MongoDB using mongo-spark-connector

Hi there!

I am not able to read from a Kafka topic and sink the stream to MongoDB using the mongo-spark-connector.

I have the following code:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession
      .builder
      .master("local[*]")
      .appName("KafkaStreamProcessor")
      .config("spark.mongodb.write.connection.uri", "mongodb://mongodb:27017")
      .config("spark.mongodb.write.database", "streaming")
      .config("spark.mongodb.write.collection", "multiplayer_score_event")
      .config("spark.mongodb.write.convertJson", "any")
      .getOrCreate()

    spark.sparkContext.setLogLevel("ERROR")

    // Read the Kafka topic as a streaming DataFrame
    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")
      .option("subscribe", "kafka-spark-streaming")
      .option("startingOffsets", "earliest")
      .load()

    // Write the stream to MongoDB and block until the query terminates
    df.writeStream
      .format("mongodb")
      .option("checkpointLocation", "/tmp/")
      .option("forceDeleteTempCheckpointLocation", "true")
      .outputMode("append")
      .start()
      .awaitTermination()

When I run it, I get this error:

    Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (be89ae268e8c executor driver): java.lang.NoSuchMethodError: 'org.apache.spark.sql.catalyst.encoders.ExpressionEncoder org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(org.apache.spark.sql.types.StructType)'
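
In case it helps, the relevant dependencies are declared roughly like this in my build.sbt (the version numbers below are placeholders, not necessarily the exact ones in my project):

    // build.sbt (sketch; the versions shown are placeholders, not my exact ones)
    libraryDependencies ++= Seq(
      "org.apache.spark"  %% "spark-sql"             % "3.5.1" % "provided",
      "org.apache.spark"  %% "spark-sql-kafka-0-10"  % "3.5.1",
      "org.mongodb.spark" %% "mongo-spark-connector" % "10.3.0"
    )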

Could you help me with this? Thank you!
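
P.S. Once the raw stream goes through, I expect to cast the Kafka payload before it reaches MongoDB, roughly like the sketch below (the column alias is just an assumption on my part); the error above happens even without this step.

    // Sketch only: cast the raw Kafka value bytes to a JSON string so that the
    // convertJson write setting can interpret it; "payload" is an assumed alias.
    val jsonDf = df.selectExpr("CAST(value AS STRING) AS payload")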