Time series collections: what exactly is the data loss when the granularity differs from the frequency at which data is received?

Hi all,
I have read in many places that data loss can occur when you set the granularity of a MongoDB time series collection to "hours" but receive data at a finer interval, such as seconds.
However, I am not able to reproduce this. In the example below, I create a sales time series collection with "hours" granularity and insert data at one-second intervals.

    "sales",
    {
       timeseries: {
          timeField: "timeStamp",
          metaField: "shop",
          granularity: "hours"
       }
    },
)
    db.sales.insertMany( [
       {
          "shop": "ABC",
          "timeStamp": ISODate("2023-04-24T00:00:00.000Z"),
          "sales": 50
       },
       {
          "shop": "ABC",
          "timeStamp": ISODate("2023-04-24T00:00:01.000Z"),
          "sales": 100
       },
       {
          "shop": "ABC",
          "timeStamp": ISODate("2023-04-24T00:00:02.000Z"),
          "sales": 150
       },
       {
          "shop": "ABC",
          "timeStamp": ISODate("2023-04-24T00:00:03.000Z"),
          "sales": 200
       },
       {
          "shop": "ABC",
          "timeStamp": ISODate("2023-04-24T00:00:04.000Z"),
          "sales": 100
       },
       {
          "shop": "ABC",
          "timeStamp": ISODate("2023-04-24T00:00:05.000Z"),
          "sales": 200
       }
    ] )
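
In case it helps anyone answering: my understanding is that the measurements are grouped into bucket documents in a backing collection (system.buckets.sales here; whether it is directly visible may depend on the server version), so the bucketing itself can be inspected:

    // Peek at the bucket documents behind the time series collection.
    // With "hours" granularity, all six seconds-apart measurements
    // should land in a single bucket rather than separate ones.
    db.getCollection( "system.buckets.sales" ).find()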

I can see that all the data points are stored in the collection: a .find() operation returns every document, and even aggregation operators like $sum and $avg return the correct results. So what exactly is the data loss that is being talked about?
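
For reference, the checks I ran look roughly like this (the expected values follow from the six documents above: total 800, average ≈ 133.33):

    // Every inserted document comes back, in timestamp order
    db.sales.find().sort( { timeStamp: 1 } )

    // The count matches the number of inserted documents (6)
    db.sales.countDocuments()

    // Sum and average over the raw measurements:
    // expect totalSales: 800, avgSales: ~133.33
    db.sales.aggregate( [
       {
          $group: {
             _id: "$shop",
             totalSales: { $sum: "$sales" },
             avgSales: { $avg: "$sales" }
          }
       }
    ] )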
Can someone please share an example that demonstrates the data loss?

Thanks in advance.