MongoDB aggregation: merge a document's nested array on update and return the result

Hi! I am trying to create an aggregation query to update an array inside a document, like so:

Document:

{
   _id: ObjectId("65141e3d8fc74027df89e79d"),
   version: "14c613f7-84f9-4df6-a9c2-8c8923b40c31",
   name: "decoder-facd32602f9faa54",
   map: [
      {
         prefix: 101,
         name: "Digital Input",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      },
      {
         prefix: 102,
         name: "Digital Output",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      }
   ]
}

Update:

name: "decoder-rename-update",
map: [
      {
         prefix: 101,
         name: "Digital Input UPDATE",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      },
      {
         prefix: 103,
         name: "Analog Output",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      }
]
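
To be clear, elements are matched on prefix: a payload element whose prefix already exists in the document (101 here) should be merged over the stored element, and one with a new prefix (103) should be appended at the end.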

Expected result:

{
   _id: ObjectId("65141e3d8fc74027df89e79d"),
   version: "14c613f7-84f9-4df6-a9c2-8c8923b40c31",
   name: "decoder-rename-update",
   map: [
      {
         prefix: 101,
         name: "Digital Input UPDATE",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      },
      {
         prefix: 102,
         name: "Digital Output",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      },
      {
         prefix: 103,
         name: "Analog Output",
         scale: 1,
         size: 1,
         range: [0, 255],
         value_type: "uint8"
      }
   ]
}

I have tried this approach:

const decoder = await this.decoderModel.aggregate( [
      {
        $match: {
          _id
        }
      },
      {
        $set: { name }
      },
      {
        $project: {
          map: {
            $concatArrays: [
              {
                $map: {
                  input: '$map',
                  as: 'map',
                  in: {
                    cond: [
                      {
                        $in: [
                          "$$map.prefix",
                          map.map( d => d.prefix )
                        ]
                      },
                      {
                        $mergeObjects: [
                          "$$map",
                          {
                            $arrayElemAt: [
                              {
                                $filter: {
                                  input: map,
                                  cond: {
                                    $eq: [
                                      '$$this.prefix',
                                      '$$map.prefix'
                                    ]
                                  }
                                }
                              },
                              0
                            ]
                          }
                        ]
                      },
                      '$$map'
                    ]
                  }
                }
              },
              {
                $filter: {
                  input: map,
                  cond: {
                    $not: {
                      $in: [
                        '$$this.prefix',
                        '$$map.prefix'
                      ]
                    }
                  }
                }
              }
            ]
          }
        }
      }
    ] );

And I’ve received this error:

MongoServerError: Invalid $project :: caused by :: FieldPath field names may not start with '$'. Consider using $getField or $setField.
    at Connection.onMessage (E:\Gitlab\babylon-data-registery\node_modules\mongodb\src\cmap\connection.ts:413:18)
    at MessageStream.<anonymous> (E:\Gitlab\babylon-data-registery\node_modules\mongodb\src\cmap\connection.ts:243:56)
    at MessageStream.emit (node:events:513:28)
    at processIncomingData (E:\Gitlab\babylon-data-registery\node_modules\mongodb\src\cmap\message_stream.ts:193:12)
    at MessageStream._write (E:\Gitlab\babylon-data-registery\node_modules\mongodb\src\cmap\message_stream.ts:74:5)
    at writeOrBuffer (node:internal/streams/writable:392:12)
    at _write (node:internal/streams/writable:333:10)
    at MessageStream.Writable.write (node:internal/streams/writable:337:10)
    at TLSSocket.ondata (node:internal/streams/readable:766:22)
    at TLSSocket.emit (node:events:513:28)
    at addChunk (node:internal/streams/readable:324:12)
    at readableAddChunk (node:internal/streams/readable:297:9)
    at TLSSocket.Readable.push (node:internal/streams/readable:234:10)
    at TLSWrap.onStreamRead (node:internal/stream_base_commons:190:23)

What am I missing?

In the meantime, I've achieved the desired result on the application side, like so:

const { _id, map, name } = update;
const decoder = await this.decoderModel.findById( _id );

if ( !decoder ) {
  throw new NotFoundException( "Decoder not found" );
}

decoder.name = name || decoder.name;

// Merge each incoming element over the stored element with the same
// prefix, or append it when the prefix is new.
map?.forEach( ( item ) => {
  const existingIndex = decoder.map.findIndex( ( d ) => d.prefix === item.prefix );

  if ( existingIndex !== -1 ) {
    decoder.map[ existingIndex ] = { ...decoder.map[ existingIndex ], ...item };
  } else {
    decoder.map.push( item );
  }
} );

const updatedDecoder = await this.decoderModel.findByIdAndUpdate( _id, decoder.toObject(), { new: true } );

return { message: "Decoder update success", decoder: updatedDecoder };

Yet this is resource-heavy depending on how many elements are sent in the map array, since every update fetches the document, mutates it in memory, and writes it back. I would like to let the MongoDB instance handle the mutations if possible. Does anyone have an idea for the query?
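
For reference, this is roughly the shape I imagine the server-side version taking — an untested sketch, assuming MongoDB 4.2+ (which accepts an aggregation pipeline as the update argument of findByIdAndUpdate), and keeping in mind that aggregate() on its own never writes anything back. The as: 'item' variable name and the prefixes constant are mine; everything else mirrors the attempt above, with $cond written as an operator:

const { _id, map, name } = update;
const prefixes = map.map( ( d ) => d.prefix ); // prefixes carried by the payload

const decoder = await this.decoderModel.findByIdAndUpdate(
  _id,
  [
    { $set: { name } },
    {
      $set: {
        map: {
          $concatArrays: [
            {
              // Walk the stored array: merge the payload element with the
              // same prefix over the stored one, otherwise keep it as-is.
              $map: {
                input: '$map',
                as: 'item',
                in: {
                  $cond: [
                    { $in: [ '$$item.prefix', prefixes ] },
                    {
                      $mergeObjects: [
                        '$$item',
                        {
                          $arrayElemAt: [
                            {
                              $filter: {
                                input: map,
                                cond: { $eq: [ '$$this.prefix', '$$item.prefix' ] }
                              }
                            },
                            0
                          ]
                        }
                      ]
                    },
                    '$$item'
                  ]
                }
              }
            },
            {
              // Append payload elements whose prefix is not stored yet;
              // '$map.prefix' resolves to the array of stored prefixes.
              $filter: {
                input: map,
                cond: { $not: { $in: [ '$$this.prefix', '$map.prefix' ] } }
              }
            }
          ]
        }
      }
    }
  ],
  { new: true }
);

Since none of the strings in my payload start with '$', embedding map directly in the pipeline should be safe; otherwise I assume it would need to be wrapped in $literal. Is this the right direction, or is there a cleaner way?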