Node.js | GridFS: Executor error during find command :: caused by :: Sort exceeded memory limit

I have uploaded a 400 MB zip file to MongoDB using GridFS. I then try to download it using the following code:

let mongoGridFsBucket = new mongodb.GridFSBucket(Mongoose.connection.db, {
  chunkSizeBytes: 1024,
});

let gridFsDownloadStream = mongoGridFsBucket.openDownloadStreamByName(filename);

gridFsDownloadStream.on('error', console.error);
gridFsDownloadStream.on('end', function () {
  console.log('downloaded');
});


and get this error:

MongoServerError: Executor error during find command :: caused by :: Sort exceeded memory limit of 33554432 bytes, but did not opt in to external sorting.

The above code works fine for smaller files (e.g. 9 MB); I’ve already tested that successfully.

However, in this case the file is too big. I looked for a solution online, and apparently there is an allowDiskUse flag that I need to set somewhere, but I don’t know where or how.

There is no place in the above code where I could set allowDiskUse to true, so I don’t know what else to do to make this work.

So did you find a solution to this problem? I’m still looking :confused: Please share

I did some research and I’ve found a solution.

Apparently, when you download a file by streaming it with GridFS, the chunk documents it is made up of are first sorted. According to this blog post, when performing a sort, MongoDB first attempts to retrieve the documents in the order specified by an index. When no suitable index is available, it loads the documents into memory and sorts them there.
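To illustrate why a sort is involved at all (a simplified sketch with made-up data, not the driver’s actual implementation): each GridFS chunk document carries an `n` field giving its position in the file, and the chunks must be concatenated in ascending `n` order to reassemble it:

```javascript
// Illustrative chunk documents, as GridFS stores them in the
// '<bucket>.chunks' collection (the 'data' field is really binary).
const chunks = [
  { n: 2, data: 'baz' },
  { n: 0, data: 'foo' },
  { n: 1, data: 'bar' },
];

// The chunks must be read back in ascending 'n' order. Without an
// index on 'n', MongoDB performs this sort in memory, subject to
// the 32 MB limit that triggers the error above.
const file = chunks
  .sort((a, b) => a.n - b.n)
  .map((chunk) => chunk.data)
  .join('');

console.log(file); // 'foobarbaz'
```

With a 1 KB chunk size, a 400 MB file is split into roughly 400,000 such documents, which is why the in-memory sort blows past the limit here while the 9 MB file was fine.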

The catch is that MongoDB is configured by default to abort any in-memory sort that exceeds 32 MB, in which case you run into the “Sort exceeded memory limit” error described above. To solve this, create an index on the ‘n’ field of the chunks collection that holds the file you want to download:

    // Create an index on the 'n' field so MongoDB can stream the chunks in order.
    db.collection('media.chunks').createIndex({n: 1});

With the index in place, MongoDB no longer needs to sort the chunks in memory, and the download code above works unchanged.