Queries perform poorly when retrieving ~5K documents


I’ve been dealing with a problem that I can’t seem to get my head around. We’re running a MongoDB database with a single collection of ~5K documents. When exported to JSON, this collection is around 200 MB in size.

I’m running a query against this collection that filters on a text projectId field. The query should return all ~5K documents, as we only have documents from one project at the moment.
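For reference, here is a minimal sketch of what such a query would look like in mongosh. The collection name and projectId value are placeholders, not taken from the original post:

```javascript
// Hypothetical reconstruction of the query described above:
// a plain equality match on the projectId field.
db.documents.find({ projectId: "my-project-id" })
```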

The issue is that this query takes far too long, and the time it takes varies a lot from run to run. The database was originally hosted in Atlas, but I’ve cloned it to both AWS and my dev environment with similar performance. Sometimes the load times are as slow as 20 minutes.

I’ve also tried creating a text index on the projectId field, with no luck whatsoever.
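One likely reason the text index doesn’t help: in MongoDB, a text index is only used by queries that contain the $text operator, not by plain equality filters. An equality match on projectId would instead be served by an ordinary single-field index, sketched below (collection name is a placeholder):

```javascript
// A text index only serves $text searches; a plain equality filter
// on projectId is matched by a regular single-field index instead.
db.documents.createIndex({ projectId: 1 })

// This query can now use the index above to locate matching documents.
db.documents.find({ projectId: "my-project-id" })
```

Note that even with a suitable index, returning all ~5K documents still means transferring the full ~200 MB of data to the client, so an index alone may not account for the whole difference.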

Does anyone have any idea why this could be happening? We had this data in an SQL database before, and there the projects could be queried in a matter of milliseconds, so I’m sure we’re doing something wrong here, but I can’t seem to find what.

Thanks in advance, and sorry if this is something trivial. I’ve just started using MongoDB and I’m fairly new to most of its concepts.

Hello @Oscar_Arranz ,

Welcome to The MongoDB Community Forums! :wave:

Can you please share more details for me to understand your use case better?

  • MongoDB version
  • Query being executed
  • Sample documents
  • Indexes created
  • Run your query with explain in executionStats mode (e.g. `db.collection.explain('executionStats').aggregate(…)`) and share the output.
