I am running into an issue and am not sure how to proceed. I have an aggregation pipeline that links a couple of collections based on an id field in the document. However, one of those fields (Files) has multiple entries that are similar, like so:
Looking closer at your issue, I think what you are experiencing is a welcome optimization: $lookup is not performed repeatedly on duplicate items, so each matching sub-document comes back only once.
Doing multiple $lookup passes over the same item and returning the same sub-document multiple times is potentially harmful.
First, let me say that you do not need to use let for an array lookup. A simple $lookup, like the one sketched below, is sufficient to gather all the information.
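A minimal sketch of that simple form, assuming the parent documents keep arrays of ids in files and materials fields that reference the _id of files and materials collections (all hypothetical names, adapt to your schema):

```js
// Plain localField/foreignField $lookup: no "let", no sub-pipeline needed.
// Duplicate ids in the local array produce each matching document only once.
db.items.aggregate([
  {
    $lookup: {
      from: "files",           // hypothetical foreign collection
      localField: "files",     // array of ids, possibly with duplicates
      foreignField: "_id",
      as: "files_lookup"       // deduplicated matches land here
    }
  },
  {
    $lookup: {
      from: "materials",
      localField: "materials",
      foreignField: "_id",
      as: "material_lookup"
    }
  }
])
```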
It is then the role of your application to find the correct sub-documents in files_lookup and material_lookup and reconstruct the whole thing, duplicates included.
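For example, a minimal client-side sketch in JavaScript, assuming each id in the original files array matches an _id in files_lookup (hypothetical field names):

```js
// Re-expand the deduplicated lookup back into the original, duplicated order.
function expandDuplicates(ids, lookup) {
  // Index the looked-up sub-documents by _id for O(1) access.
  const byId = new Map(lookup.map(d => [String(d._id), d]));
  // Map each original id (duplicates included) to its full sub-document.
  return ids.map(id => byId.get(String(id)));
}

doc.files = expandDuplicates(doc.files, doc.files_lookup);
doc.materials = expandDuplicates(doc.materials, doc.material_lookup);
```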
If you insist on having the server feed the duplicates, you could write an $addFields stage that uses $map over files together with $indexOfArray and $arrayElemAt to rebuild your tree of duplicated objects. But more work and more data sent by the server means less free CPU and less free network bandwidth to serve other clients.
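A hedged sketch of that server-side variant, appended after the $lookup stages above (same assumed field names):

```js
// Rebuild the duplicated array on the server from the deduplicated lookup.
{
  $addFields: {
    files: {
      $map: {
        input: "$files",        // original id array, duplicates included
        as: "fid",
        in: {
          $arrayElemAt: [
            "$files_lookup",
            // Position of this id among the looked-up sub-documents.
            // $indexOfArray returns -1 when nothing matched; guard for
            // that if dangling references are possible in your data.
            { $indexOfArray: ["$files_lookup._id", "$$fid"] }
          ]
        }
      }
    }
  }
}
```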
In both cases it is the same code logic: mapping one array to another by looking up complex objects in a second array. I think it is easier to scale that logic at the application level than at the database level.