As a start, I'd convert the field to a date once; there's no reason to do it three times. Convert it before you use it in the $facet. I assume you have three grouping facets, so extract the day, month, and year BEFORE you hit the $facet so all stages can share the computed values.
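Roughly what I mean, as a sketch (field names like `timestamp`, `y`, `m`, `d` are assumptions, adjust to your schema): one `$addFields` stage computes the date parts once, and every facet groups on the precomputed values instead of re-deriving them.

```javascript
// Compute the date parts ONCE, up front, then let each facet reuse them.
const pipeline = [
  {
    $addFields: {
      y: { $year: "$timestamp" },
      m: { $month: "$timestamp" },
      d: { $dayOfMonth: "$timestamp" },
    },
  },
  {
    $facet: {
      byYear:  [{ $group: { _id: "$y", n: { $sum: 1 } } }],
      byMonth: [{ $group: { _id: { y: "$y", m: "$m" }, n: { $sum: 1 } } }],
      byDay:   [{ $group: { _id: { y: "$y", m: "$m", d: "$d" }, n: { $sum: 1 } } }],
    },
  },
];
```

This assumes `timestamp` is already a real BSON date; if it's still ticks you'd need the conversion first.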
You're also not going to be able to make efficient use of proper date functions while you store the time as ticks; I'd look at storing it as a proper date. Do you use this query a lot?
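If those ticks are .NET-style (100-nanosecond intervals since 0001-01-01 UTC), a one-off migration could convert them client-side with something like this sketch; `EPOCH_TICKS` is the standard .NET tick offset of the Unix epoch:

```javascript
// .NET ticks are 100 ns intervals since 0001-01-01T00:00:00 UTC.
// Tick count at the Unix epoch (1970-01-01T00:00:00 UTC):
const EPOCH_TICKS = 621355968000000000n;
const TICKS_PER_MS = 10000n;

// Convert a tick count (as a BigInt) to a JavaScript Date.
function ticksToDate(ticks) {
  return new Date(Number((ticks - EPOCH_TICKS) / TICKS_PER_MS));
}

// Example: the Unix epoch itself.
// ticksToDate(621355968000000000n) → 1970-01-01T00:00:00.000Z
```

Once the field is a real date, `$year`/`$month`/`$dayOfMonth` work directly and date-range queries can use an ordinary index on the field.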
The fact that you have an index is of no use in this scenario: you're calculating a value, and that computed value is not indexed.
Ideally you want to hit the groups with sorted data, but with the data in its current form I'm not sure what other improvements you can make, to be honest.
I'm tied up with work at the moment so I can't repro locally, but from the screenshot you're running this over roughly 10M docs?
Computing and grouping over that many records is going to be slow. If this is a common query, perhaps look at an alternative storage layout, the bucket pattern for example, so you can count and group cheaply at a higher, pre-aggregated level.
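A rough sketch of the bucket idea (collection and field names here are hypothetical): keep one pre-aggregated document per day and `$inc` it on every write, so the grouping query scans a few thousand bucket docs instead of 10M raw ones. Simulated in plain JavaScript:

```javascript
// Bucket pattern, simulated in-memory. In MongoDB each event would be an upsert:
//   db.daily.updateOne({ _id: day }, { $inc: { n: 1 } }, { upsert: true })
// ("daily" is an assumed collection name.)

// One bucket per calendar day, keyed "YYYY-MM-DD".
function bucketKey(date) {
  return date.toISOString().slice(0, 10);
}

// Fold a stream of event timestamps into per-day counts.
function buildBuckets(dates) {
  const buckets = new Map();
  for (const d of dates) {
    const k = bucketKey(d);
    buckets.set(k, (buckets.get(k) ?? 0) + 1);
  }
  return buckets;
}

const events = [
  new Date("2024-05-01T09:00:00Z"),
  new Date("2024-05-01T17:30:00Z"),
  new Date("2024-05-02T08:15:00Z"),
];
const buckets = buildBuckets(events);
// buckets: Map { "2024-05-01" => 2, "2024-05-02" => 1 }
```

The trade-off is one extra write per event in exchange for reads that no longer touch the raw documents at all.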
Since I had other fields in the collection (year, month, day, hour, minute, in addition to the raw timestamp), I changed the query to use these numerical fields. But what bothers me is that I always see a big difference between the sum of the times reported by explain and the total execution time. Do you know why that is? What is that hidden difference in time (33 seconds vs. 1.5 seconds)?