Reading from MongoDB using Spark-MongoDB connector failed

Hi,
I'm using the Spark-MongoDB connector to read from MongoDB and write the result to Hudi. The job throws the following exception:

20/07/24 17:51:57 ERROR yarn.Client: Application diagnostics message: User class threw exception: org.apache.spark.sql.avro.IncompatibleSchemaException: Unexpected type NullType.
at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:182)
at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)

I looked into this exception, and it seems that the schemas of some nested fields are inferred as NullType.

How can I solve this problem? Any feedback is appreciated.