Real-time analytics with change streams, aggregation pipelines, and sockets

I am implementing a real-time analytics dashboard for our application using change streams with Socket.io. (The application has a React FE, MongoDB Atlas 6.0, and Node.js with Mongoose as our ODM on the backend.) When a user opens the dashboard, they are added to the Socket.io room for that dashboard, or the room is created if it doesn't exist yet. Stats are calculated when the dashboard is opened and act as the starting point.
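
For context, the room wiring is roughly like this (a simplified sketch; the event names and the calculateInitialStats helper are illustrative, not our exact code):

io.on('connection', (socket) => {
  socket.on('join-dashboard', async (dashboardId) => {
    const room = `dashboard:${dashboardId}`;
    socket.join(room); // joins the room, creating it if it doesn't exist yet

    // initial stats are computed once and become the baseline for incremental updates
    const stats = await calculateInitialStats(dashboardId);
    roomStatsMap.set(room, stats);
    socket.emit('stats:init', stats);
  });
});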

For incremental changes, things are working well and we see our updates in real time. However, this dashboard also has to capture batch updates, and these are overwhelming our implementation. We can manage increment/decrement operations, but for operations where we need to compare, add, or remove documents, we can't process the change events fast enough to keep up, so the stats quickly get out of sync. I'm trying to store the current stats object locally on the backend (in a map), but that doesn't really work for this use case.
This is our code =>

export const roomStatsMap = new Map(); // holds intermediate results

export const statsChangeStream = async (io) => {
  try {
    // defining pipeline, options, etc.
    const db = await connectToDb();

    // stats build off each other - the iterator structure SHOULD ensure that changes are processed one by one???
    const changeStreamIterator = await makeChangeStream(db, pipeline, newOptions);

    for await (const change of changeStreamIterator) {
      await handleStats(change, io);
    }
  } catch (error) {
    console.log('ERROR => ', error);
  }
};
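
In case it helps: the ChangeStream returned by the driver's watch() is itself an async iterable, which is why it can be consumed with for await. A simplified sketch of what a helper like makeChangeStream boils down to (the collection name is illustrative, and any extra setup is omitted):

const makeChangeStream = async (db, pipeline, options) => {
  // collection.watch() returns a ChangeStream, which implements the async iterator protocol
  return db.collection('events').watch(pipeline, options);
};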

In the handleStats function, we increment/decrement the stats object and update the value in our map. However, before we finish processing one change event and updating the stats object, we have already started processing the next one.
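
For reference, handleStats is along these lines (heavily simplified; roomForChange and recalculateAffectedStats are placeholder names for the real logic):

const handleStats = async (change, io) => {
  const room = roomForChange(change); // maps a change event to its dashboard room
  const stats = roomStatsMap.get(room) ?? {};

  if (change.operationType === 'insert') {
    stats.total = (stats.total ?? 0) + 1; // simple increment
  } else if (change.operationType === 'delete') {
    stats.total = (stats.total ?? 0) - 1; // simple decrement
  } else {
    // updates/replaces need to compare documents, which involves extra async work
    await recalculateAffectedStats(stats, change);
  }

  roomStatsMap.set(room, stats);
  io.to(room).emit('stats:update', stats);
};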

Do I need to add something like a queue, or would a different syntax be more appropriate?
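
The kind of queue I have in mind would just buffer incoming change events and drain them strictly one at a time, something like this (sketch only, not production code):

const queue = [];
let draining = false;

const enqueueChange = (change, io) => {
  queue.push(change);
  if (!draining) drainQueue(io);
};

const drainQueue = async (io) => {
  draining = true;
  while (queue.length > 0) {
    const change = queue.shift();
    await handleStats(change, io); // the next change is not touched until this one resolves
  }
  draining = false;
};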