Custom schema with the MongoDB Spark Connector is not being applied

from pyspark.sql.types import StructType, StructField, StringType

# Custom schema I want the resulting DataFrame to use
schema = StructType([
  StructField("_class", StringType(), True),
  StructField("_id", StringType(), True)
])

mongo_df = spark.read.format("mongodb")\
    .option("connection.uri", "mongodb://***:***@*****:27017")\
    .option("database", "orderdb")\
    .option("collection", "order")\
    .option("spark.mongodb.input.ssl", False)\
    .option("schema", schema)\
    .option("sql.inferSchema.mapTypes.enabled", False)\
    .load()

I am trying to load the data with the custom schema specified above, but the DataFrame still comes back with the schema inferred from the original MongoDB documents.
How can I load the collection with my custom schema applied?
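
For reference, here is a minimal sketch of what I understand to be the intended approach: passing the schema through the DataFrameReader's schema() method rather than as an option (the connection details are placeholders, same as above). Is this the correct way with the 10.x connector?

from pyspark.sql.types import StructType, StructField, StringType

schema = StructType([
  StructField("_class", StringType(), True),
  StructField("_id", StringType(), True)
])

# Schema goes through .schema() on the reader, not .option("schema", ...)
mongo_df = spark.read.format("mongodb")\
    .schema(schema)\
    .option("connection.uri", "mongodb://***:***@*****:27017")\
    .option("database", "orderdb")\
    .option("collection", "order")\
    .load()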

I am using the latest mongodb-spark-connector, version 10.2.2.