I was aggregating some data using the $group stage and noticed that some result sets contain very long decimal expansions, even though the underlying values are stored as Decimal128.
$priceData.price is stored as Decimal128 in the database. Could you tell me what could be causing this and how to resolve it, please? Thank you.
That’s a very nice catch, thank you. Running the following query:
{ "priceData.price": { $not: { $type: 19 } } }
revealed some old documents that have the price value stored as a string instead of Decimal128.
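For reference, I ran the check roughly like this from mongosh (the collection name is a placeholder for the real one):

```js
// Find or count documents whose price is not stored as Decimal128
// (BSON type 19, alias "decimal"). "prices" is a placeholder name.
// Note: $not also matches documents where the field is missing.
db.prices.find({ "priceData.price": { $not: { $type: 19 } } })
db.prices.countDocuments({ "priceData.price": { $not: { $type: "decimal" } } })
```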
What do you think would be the right course of action? Two ideas come to mind:
1. Repair the incorrect fields by running an update query that converts them to the correct type (string → Decimal128). However, I am not sure what a correct and safe update query would look like.
2. As an immediate workaround, I have used the $round: ["$price", 2] operator in a $project stage to reduce the number of decimal places. What are your thoughts on that?
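For reference, the workaround stage looks roughly like this (the collection name and the earlier stages are placeholders for my real pipeline):

```js
// Workaround only: rounds the value in the aggregation output;
// the stored documents are left untouched.
db.prices.aggregate([
  // ...earlier stages, e.g. the $group stage that produces "price"...
  { $project: { price: { $round: ["$price", 2] } } }  // keep at most 2 decimal places
])
```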
Due to type bracketing the strings won't be included in the sum, so it would have to be another number type. Another number type would create this result, or a decimal that was created by passing a long as an argument (this was accepted in earlier versions).
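For example, with a hypothetical mix of types you can see that $sum simply skips the string but happily combines a double with the decimals:

```js
// Hypothetical data to illustrate the point: the string is ignored by $sum,
// while the double is promoted to Decimal128 from its exact binary value,
// which is where a long decimal expansion can come from.
db.prices.insertMany([
  { item: "a", price: NumberDecimal("9.99") },  // correct type
  { item: "a", price: "9.99" },                 // string: skipped by $sum
  { item: "a", price: 9.99 }                    // double: included, imprecise
])

db.prices.aggregate([
  { $group: { _id: "$item", total: { $sum: "$price" } } }
])
// total comes back as a Decimal128 with a long fractional part
```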
If there is no use case that requires the data to stay in the unexpected type, then I would suggest updating it to the expected Decimal128 with the correct precision.
It could be as simple as using $toDecimal, $round and $merge, or it could be more complex if strings or nulls are present, in which case $convert and some $cond expressions might have to be used.
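A rough sketch of what that could look like, using $convert (which subsumes $toDecimal) together with $cond and $merge; the collection name "prices" is a placeholder, the field path is the one from your post, and anything that cannot be converted is left untouched for manual review:

```js
// Sketch only: rewrite price as a Decimal128 rounded to 2 decimal places.
// Values that cannot be converted (e.g. non-numeric strings) are kept as-is
// so they can be inspected separately. Try it on a copy of the data first.
db.prices.aggregate([
  {
    $set: {
      "priceData.price": {
        $let: {
          vars: {
            converted: {
              $convert: {
                input: "$priceData.price",
                to: "decimal",
                onError: null,   // unparseable value
                onNull: null     // missing or null value
              }
            }
          },
          in: {
            $cond: [
              { $eq: ["$$converted", null] },
              "$priceData.price",              // keep the original value
              { $round: ["$$converted", 2] }   // normalised Decimal128
            ]
          }
        }
      }
    }
  },
  { $merge: { into: "prices", on: "_id", whenMatched: "merge", whenNotMatched: "discard" } }
])
```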