PySpark reads a timestamp field from MongoDB as a struct data type instead of a timestamp

I am reading a timestamp field from MongoDB Atlas using PySpark. In MongoDB the field holds a timestamp value, but when I pull the same data with PySpark and write it to S3 as a Parquet file, the timestamp field is converted to a struct data type, and instead of the timestamp value the Parquet file contains epoch time.
For example, say the column name is x and its value in MongoDB is "2010-02-21T21:26:14Z", but in the Parquet file in S3 its value is {"$date": 1497546159188}.
Note: 1497546159188 does not correspond to "2010-02-21T21:26:14Z"; it is just a reference value. My point is that the column's data type on the MongoDB server is timestamp, but when the data is fetched by Spark and written to S3 in Parquet format, it changes to a struct data type.
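For context, my understanding (an assumption, not something I have confirmed in the connector docs) is that the {"$date": ...} struct is MongoDB's extended JSON form, where the number is epoch milliseconds. A plain-Python sketch of what such a number decodes to, using the placeholder value from above rather than a real record:

```python
from datetime import datetime, timezone

# MongoDB extended JSON wraps a BSON date as {"$date": <epoch milliseconds>}.
doc = {"$date": 1497546159188}  # placeholder value from above, not real data

# Decode the epoch-milliseconds payload into a UTC datetime.
dt = datetime.fromtimestamp(doc["$date"] / 1000, tz=timezone.utc)
print(dt.isoformat())  # an ISO-8601 timestamp in UTC
```

So the information is still there, just stored as a nested struct with an epoch-milliseconds field instead of a Parquet timestamp column.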

Connector: mongo-spark-connector_2.11-2.4.4.jar
Spark version: 2.4.2