How to estimate the performance of a pipeline that includes a $lookup and a $merge?

Hello!

We need to move one field from collection A to collection B in our system. Each document in collection B has exactly one corresponding document in collection A, and the field in B that holds the reference to A (referenceToA) is indexed. The field needs to be moved for around 500K documents.
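
For context, the index is the plain single-field one (assuming nothing beyond what we described above):

// Existing index on B's reference field (single-field, ascending).
db.B.createIndex({ referenceToA: 1 });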

Our question is how to estimate whether it's feasible to run this in a single pass and how it would affect DB load, or whether we need to run it in batches. This is what the pipeline looks like:

db.B.aggregate([
  {
    // Join each B document with its single counterpart in A.
    $lookup: {
      from: 'A',
      localField: 'referenceToA',
      foreignField: '_id',
      as: 'docInfo',
    }
  },
  {
    // docInfo is a one-element array, so unwind it to a plain subdocument.
    $unwind: '$docInfo'
  },
  {
    // Keep only the key and the field we are copying over.
    $project: {
      _id: 1,
      fieldToMove: '$docInfo.fieldToMove'
    }
  },
  {
    // Write back into B; the default whenMatched: 'merge' adds
    // fieldToMove to the existing document without replacing it.
    $merge: {
      into: 'B',
      on: '_id'
    }
  }
]);
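
Is explaining just the read portion a reasonable way to size this up front? We are not sure explain tells us anything useful about the $merge stage itself, so we would drop it and do something like:

// Rough sizing of the read side only: explain the same pipeline
// without the final $merge stage (an assumption on our part that
// this is a fair proxy for the scan/join cost).
db.B.explain('executionStats').aggregate([
  { $lookup: { from: 'A', localField: 'referenceToA', foreignField: '_id', as: 'docInfo' } },
  { $unwind: '$docInfo' },
  { $project: { _id: 1, fieldToMove: '$docInfo.fieldToMove' } }
]);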

This time, we decided to run it in batches of 3K documents, but we'd like some insight into how this would scale (I remember running a similar pipeline on about 100K documents in the past).
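
For reference, the batched run looks roughly like the simplified sketch below (mongosh; paging by _id and the 3K batch size are our own choices, not anything prescribed):

const batchSize = 3000; // 3K documents per pass
let lastId = null;

while (true) {
  // Walk B in _id order and collect the ids of the next batch.
  const filter = lastId === null ? {} : { _id: { $gt: lastId } };
  const ids = db.B.find(filter, { _id: 1 })
                  .sort({ _id: 1 })
                  .limit(batchSize)
                  .toArray();
  if (ids.length === 0) break;
  lastId = ids[ids.length - 1]._id;

  // Same pipeline as above, restricted to the current batch.
  db.B.aggregate([
    { $match: { _id: { $in: ids.map(d => d._id) } } },
    { $lookup: { from: 'A', localField: 'referenceToA', foreignField: '_id', as: 'docInfo' } },
    { $unwind: '$docInfo' },
    { $project: { _id: 1, fieldToMove: '$docInfo.fieldToMove' } },
    { $merge: { into: 'B', on: '_id' } }
  ]);
}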

Thanks!