Hi, I’m in Chapter 2: Basic Aggregation - Utility Stages: Cursor-like stages: Part 2, where the instructor speaks about the 100 MB memory limit of a pipeline and the use of the allowDiskUse option, and I was wondering:
- How can we know how much memory our pipeline will use while we are developing it, and whether it is going to exceed the 100 MB limit?
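For context, the kind of check I’ve been trying is something like this in mongosh (the `movies` collection and `year` field are just hypothetical examples; I believe the executionStats output reports, per stage, whether a `$sort` spilled to disk, but I’m not sure this is the right way to measure memory):

```javascript
// Hypothetical collection "movies" — ask the server for execution statistics
// instead of running the pipeline blind.
db.movies.explain("executionStats").aggregate([
  { $sort: { year: -1 } }  // blocking stage subject to the 100 MB limit
])
// I'd expect the $sort stage stats to indicate whether disk was used,
// but I don't know how to read an absolute memory figure from this.
```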
- How can we know whether that memory usage will grow as the database grows? We may have an issue in the future if we don’t detect that during development.
- I’m thinking that the 100 MB limit applies to the whole pipeline. Am I right? Or is it per `$sort` stage? In other words, is it A) a single 100 MB limit for the whole pipeline process, or B) a fresh 100 MB allowance for every `$sort` stage?
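To make the question concrete, here is a sketch of the pipeline shape I have in mind (collection and field names are made up; `allowDiskUse: true` is the option the instructor mentioned for letting a blocking stage spill to disk rather than fail at the limit):

```javascript
// Hypothetical collection "movies" with two $sort stages —
// does each one get its own 100 MB, or do they share a single budget?
db.movies.aggregate(
  [
    { $match: { year: { $gte: 2000 } } },
    { $sort: { imdb_rating: -1 } },   // blocking stage #1
    { $project: { title: 1, imdb_rating: 1, votes: 1 } },
    { $sort: { votes: -1 } }          // blocking stage #2
  ],
  { allowDiskUse: true }              // fallback if memory is exceeded
)
```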
Also, he was speaking about placing `$sort` as the first stage, or at least near the beginning and before a `$project` stage, in order to improve performance. Does this mean it’s going to be both faster and use less memory, or only use less memory? And how much “less memory” is that?