Best way to get all matching documents with huge JSON array as payload

Hi Team,
We have a scenario that is taking a long time and we now want to tune it. The issue is as follows:
We are querying MongoDB with a huge number of filters (around 1k) in a single find on one collection. The collection holds just 200 documents, but the query payload will keep increasing. We are mapping real-time events to queues, so we prepare the filters on the fly and call find.

A sample filter contains data like the following:

    {
        "view.k1": kwargs["taaaa"],
        "view.k2": kwargs["jjjj"],
        "view.k3": kwargs["jkjjkjk"],
        "view.k4": True,
        "view.k5": {"$ne": kwargs["tktteam"]},
        "view.k6": True,
        "view.k7": {"$in": checklist},
    }

db.test.find({filter1}, {filter2}, {filter3}, {filter4}, {filter5}, … {filter1000})
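
To make the shape of the query concrete, here is a minimal sketch of how the prepared filter dicts could be combined into a single find() with $or, since find() only accepts one filter document per call. The connection string, database and collection names, and the find_matches helper are placeholders for illustration, not our real code.

    from pymongo import MongoClient

    # Placeholder connection details, for illustration only.
    client = MongoClient("mongodb://localhost:27017")
    coll = client["mydb"]["test"]

    def find_matches(filters):
        """Combine many prepared filter dicts into one $or query so a
        single round trip returns every matching document."""
        if not filters:
            return []
        return list(coll.find({"$or": filters}))

    # Example usage with filters built on the fly from incoming events.
    filters = [
        {
            "view.k1": "some-event-value",
            "view.k4": True,
            "view.k7": {"$in": ["queue-1", "queue-2"]},
        },
        # ... up to ~1000 similar filter documents
    ]
    matching_docs = find_matches(filters)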

Because we are sending such a huge payload over the network, there is also noticeable lag. I am now looking for the best way to handle this scenario. I don't know much about MongoDB, but I think caching could work for me, or maybe breaking the query into smaller pieces and running the finds concurrently, as sketched below.
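
A rough sketch of the "break the query up and run the finds concurrently" idea, assuming the filters are batched into $or chunks. The batch size, worker count, and the run_batch/find_concurrently names are made up for illustration; PyMongo's MongoClient is thread-safe, so sharing one client across threads is fine.

    from concurrent.futures import ThreadPoolExecutor
    from pymongo import MongoClient

    # Placeholder connection details, for illustration only.
    client = MongoClient("mongodb://localhost:27017")
    coll = client["mydb"]["test"]

    BATCH_SIZE = 100  # assumed chunk size; tune against payload size and latency

    def run_batch(batch):
        # One $or query per batch keeps each request payload small.
        return list(coll.find({"$or": batch}))

    def find_concurrently(filters, max_workers=8):
        batches = [filters[i:i + BATCH_SIZE]
                   for i in range(0, len(filters), BATCH_SIZE)]
        results = []
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            for docs in pool.map(run_batch, batches):
                results.extend(docs)
        return results

Alternatively, since the collection only holds around 200 documents, caching them in the application and evaluating the filters locally might avoid the network round trip entirely, but I am not sure that is the right direction.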

We are using PyMongo for now, and this application is written very badly, so we need this mapping to be very fast.

Can you suggest the best approach to handle this?

The question is not clear to me. Do you want to search a collection that holds 200 documents several times? Maybe a sample document structure and sample syntax would help make it clearer.


I wanted to know whether there is a better way to query when we have many filters and each filter contains many fields.