Hi everyone,
I’m using MongoDB as the backend for FiftyOne, and I’m running into an issue when adding a large amount of data to the database. Specifically, I encounter the following error:
OperationFailure: PlanExecutor error during aggregation :: caused by :: Total size of documents in 'collection' matching pipeline's $lookup stage exceeds 104857600 bytes, full error: {'ok': 0.0, 'errmsg': "PlanExecutor error during aggregation :: caused by :: Total size of documents in frames.samples.648aedc2e54c5d1599ee16b1 matching pipeline's $lookup stage exceeds 104857600 bytes", 'code': 4568, 'codeName': 'Location4568'}
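For context, here is a minimal sketch of where I hit it; the dataset name is a placeholder, and the real dataset is a video dataset with frame-level labels (which is why the frames.samples.* collection shows up in the error):

```python
import fiftyone as fo

# Placeholder name for a large video dataset with dense frame-level labels
dataset = fo.load_dataset("my_large_video_dataset")

# Iterating the samples (which attaches the frames collection via a
# $lookup behind the scenes) is enough to raise the error above
for sample in dataset.iter_samples(progress=True):
    pass
```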
From what I understand, the $lookup stage has a 100 MB limit on the total size of documents it processes. Is there any way to increase this limit, either through configuration settings or by adjusting how the aggregation pipeline is structured?
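For example, I came across the internalLookupStageIntermediateDocumentMaxSizeBytes server parameter, whose default matches the 104857600 bytes in the error. Would raising it along these lines be a supported approach, or is it discouraged? This is only a sketch; I haven't confirmed the parameter can actually be changed at runtime:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # the database FiftyOne uses

# Attempt to raise the $lookup buffering limit from 100 MB to 200 MB;
# sketch only -- I'm not sure this is runtime-settable or safe to change
client.admin.command(
    "setParameter",
    1,
    internalLookupStageIntermediateDocumentMaxSizeBytes=200 * 1024 * 1024,
)
```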
If not, are there any best practices to work around this issue when handling large datasets?
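One workaround I'm considering is processing the dataset in smaller slices so that each underlying aggregation only joins a bounded number of frame documents, roughly like this; is that a recommended pattern, or is there a better approach?

```python
# Process the dataset in fixed-size slices so each aggregation touches
# fewer frame documents
batch_size = 100
for skip in range(0, len(dataset), batch_size):
    view = dataset.skip(skip).limit(batch_size)
    for sample in view.iter_samples():
        ...  # work with the sample
```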
Thanks in advance for any insights!