How to replicate (insert / update) a large collection into a new one with Filters

I have a large collection, `catalog`, with about 200 million documents.
I want to schedule a job that replicates it into a new collection, `catalog_filter_replicated`, with a filter like `clientId: "123"`.
Each time the scheduled script executes, new documents should be inserted into the target and existing ones should be updated.

Performance is an important factor here. Please suggest the best way to replicate (insert / update) a large collection into a new one with filters.
Thanks

Would that replica of the original collection be read-only? Depending on the use cases for the replica, there are different options.

You could use a standard view

or a materialized view
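A standard view is just a named aggregation evaluated at read time, so it is always current but re-filters the 200M documents on every query. A minimal sketch of the filter pipeline, using the collection and field names from the question (the view name and database name are my assumptions); with PyMongo a view is created through `create_collection`:

```python
# The view's pipeline: only documents for one client.
# "catalog" and "clientId" come from the question; the view
# name "catalog_filter_view" is an assumed placeholder.
view_pipeline = [
    {"$match": {"clientId": "123"}},
]

# With PyMongo (needs a running mongod, so left commented out):
# from pymongo import MongoClient
# db = MongoClient()["mydb"]
# db.create_collection("catalog_filter_view",
#                      viewOn="catalog",
#                      pipeline=view_pipeline)
```

A materialized view would instead be a real collection refreshed on a schedule, which is where `$merge` comes in below.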

You could also create a collection, run an aggregation, and use $out or $merge. You wrote

> new rows should be inserted

and

> the old one should update the target one

but what about the deleted ones? A $merge won’t take care of the deleted documents, but dropping the collection and re-running $out will recreate a fresh replica.
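The $merge route maps directly onto the insert/update requirement and can be run from the scheduler. A sketch of both pipelines, assuming the collection names from the question (`whenMatched`/`whenNotMatched` control the update/insert behaviour):

```python
# Scheduled insert/update copy: $match narrows to one client,
# $merge upserts the result into the target collection.
merge_pipeline = [
    {"$match": {"clientId": "123"}},
    {"$merge": {
        "into": "catalog_filter_replicated",
        "on": "_id",                 # join key; _id is unique by default
        "whenMatched": "replace",    # existing target docs get updated
        "whenNotMatched": "insert",  # new source docs get inserted
    }},
]

# $out instead of $merge replaces the whole target collection,
# which also removes documents deleted from the source:
out_pipeline = [
    {"$match": {"clientId": "123"}},
    {"$out": "catalog_filter_replicated"},
]

# With PyMongo (server required):
# db.catalog.aggregate(merge_pipeline)
```

An index on `clientId` in the source collection would matter a lot at 200M documents, since the `$match` runs on every refresh.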

Another way would be to use a Change Stream to replicate the updates of catalog for clientId: "123".
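A change-stream tail propagates writes continuously instead of on a schedule. One caveat: delete events carry no fullDocument, so they cannot be filtered by clientId directly; the sketch below lets all deletes through and applies them to the target by _id (PyMongo API; the watch loop needs a live replica set, so it is commented out):

```python
# Filter the stream to one client's inserts/updates/replaces.
# Delete events have no fullDocument, so they are passed through
# unfiltered and matched against the target by _id instead.
stream_filter = {"$match": {"$or": [
    {"fullDocument.clientId": "123"},
    {"operationType": "delete"},
]}}

# With PyMongo (requires a replica set, so commented out):
# with db.catalog.watch([stream_filter],
#                       full_document="updateLookup") as stream:
#     for change in stream:
#         key = change["documentKey"]  # {"_id": ...}
#         if change["operationType"] == "delete":
#             db.catalog_filter_replicated.delete_one(key)
#         else:
#             db.catalog_filter_replicated.replace_one(
#                 key, change["fullDocument"], upsert=True)
```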

It all depends on the use cases for the replica.