Question on .Limit and .Sort

I am taking the M220N class and we are building aggregations in Visual Studio 2019. I was working on a ticket that used $facet, and I have a question that I would like to discuss.

My original post is M220N Advanced Reads. Please read it for the background, but basically my question concerns the original query:

movies = await _moviesCollection.Find<Movie>(Builders<Movie>.Filter.Empty)
                .Limit(moviesPerPage)
                .Skip(pageNumber * moviesPerPage)
                .ToListAsync();
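For illustration, the Skip/Limit arithmetic this query performs can be sketched in plain Python over an in-memory list (the function name and values are mine, not from the course):

```python
def page_slice(docs, page_number, per_page):
    """Mimic Skip(pageNumber * perPage).Limit(perPage) over a sorted list."""
    skip = page_number * per_page
    return docs[skip:skip + per_page]

movies = list(range(100))           # stand-in for the movies collection
first_page = page_slice(movies, 0, 20)   # documents 0..19
third_page = page_slice(movies, 2, 20)   # documents 40..59
```

Each page is just a fresh slice computed from the page number, which is why incrementing the page re-runs the query.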

The query will be executed again each time the page is incremented. This seems inefficient for applications that will have many users running paged searches.

In containerised or service-based designs, is there a way to bring the original result set into memory, to avoid executing another query every time the page is incremented? It seems like a poor optimisation, and it can lead to system lag as the number of queries grows.


What makes you think it is inefficient to have the most powerful machine in the system, and the one closest to the data, do the most complicated work?

Your other choices would be to do it in the application server or in the client. In both cases you would remove the Limit() and Skip(): you make fewer requests, but you send much more data over the wire, and the farther you are from the source, the slower it gets. Imagine downloading 100K documents onto a smartphone when you can only see a couple at a time.

Okay then, let's handle it at the application level. It can be done, but then you have to maintain a list of documents for each connected client. Clients might leave your application, abandoning their lists in your app server. When do you clean up? When do you update your documents when, for example, an item's price changes or its inventory quantity changes? You might also need a more powerful app server because of all the extra work it has to do.
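To put rough numbers on the data-over-the-wire point, here is a back-of-the-envelope sketch (all sizes are assumptions I picked for illustration, not measurements from the course cluster):

```python
# Assumed figures for illustration only.
DOC_SIZE = 2_000       # ~2 KB per movie document (assumption)
TOTAL_DOCS = 100_000   # documents matching the query (assumption)
PAGE_SIZE = 20         # moviesPerPage
PAGES_VIEWED = 5       # a typical user only looks at a few pages

# Server-side paging: each request transfers only one page of documents.
server_side_bytes = PAGES_VIEWED * PAGE_SIZE * DOC_SIZE   # -> 200000 (~200 KB)

# App/client-side paging: the whole result set crosses the wire once.
client_side_bytes = TOTAL_DOCS * DOC_SIZE                 # -> 200000000 (~200 MB)
```

Under these assumptions, letting the database page with Skip/Limit moves about a thousand times less data for a user who views five pages, even though it issues five queries instead of one.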

Just some thoughts.