Trigger is firing for each document rather than as a bulk action

As no one is responding in the online chat, I thought I would open this up here…

I have database triggers that fire for each inserted document, but I would like them to fire once per operation. For example, if the initiating call generates 50 new documents, I would like the trigger to run just once after all the insertions. I have tried using a bulk insert (https://www.mongodb.com/docs/manual/reference/method/Bulk.insert/), but I can see in the logs that the trigger is still firing for each document.
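For context, the insert is roughly along these lines (a simplified sketch; the `orders` collection, the `batchId` field, and the batch size are placeholders for my actual data):

```js
// Simplified sketch of the bulk insert (collection and field names are placeholders)
const bulk = db.orders.initializeUnorderedBulkOp();
for (let i = 0; i < 50; i++) {
  bulk.insert({ batchId: "batch-001", item: i, createdAt: new Date() });
}
bulk.execute();
// The database trigger still logs one invocation per inserted document
```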

Does anyone have any advice on how to handle this?

Welcome to the MongoDB Community, @Timothy_Payne!

Atlas Database Triggers are based on change streams, which emit one event per document change rather than one event per command (for example, a bulk insert). Processing per document also helps functions complete within Atlas Function execution limits, including the 120-second runtime limit.
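You can see the same behaviour by watching a change stream directly in mongosh. A minimal sketch (the collection name is illustrative): even when the documents are written with a single `insertMany`, the stream emits a separate `insert` event per document, and a database trigger invokes its function once per event.

```js
// Illustrative: one change event is emitted per inserted document,
// even though the documents were written in a single insertMany call
const changeStream = db.orders.watch();

db.orders.insertMany([
  { item: "a" },
  { item: "b" },
  { item: "c" }
]);

for (let i = 0; i < 3; i++) {
  const event = changeStream.next(); // blocks until the next change event arrives
  print(event.operationType, event.documentKey._id); // prints "insert" three times
}
```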

However, you could invoke processing after a specific batch job finishes by calling a function via a Custom HTTPS Endpoint. I assume you would need some criteria to identify the most recently inserted batch of documents, such as a creation date or batch ID in your documents (or perhaps the most recent _id before the bulk insert).
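A rough sketch of what such an endpoint function could look like, assuming the client tags each document with a `batchId` field and calls the endpoint once its bulk insert completes (the database, collection, and endpoint names below are illustrative placeholders, not something that already exists in your app):

```js
// Atlas Function behind a Custom HTTPS Endpoint (sketch; names are placeholders)
// The client would call this once after its bulk insert completes, e.g.:
//   POST https://<your-endpoint-url>/processBatch?batchId=batch-001
exports = async function ({ query }, response) {
  const batchId = query.batchId;
  if (!batchId) {
    response.setStatusCode(400);
    return { error: "batchId is required" };
  }

  const collection = context.services
    .get("mongodb-atlas")   // default linked data source name; adjust if yours differs
    .db("mydb")             // placeholder database name
    .collection("orders");  // placeholder collection name

  // Fetch and process the whole batch in one pass instead of once per document
  const docs = await collection.find({ batchId }).toArray();

  // ... do the once-per-batch work here ...

  return { processed: docs.length };
};
```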

Regards,
Stennie

Thanks @Stennie_X! This might then be better hooked up within the app, as it would still have access to the inserted documents. It does seem strange that there isn't a single-action trigger after a bulk insert/update, but we're all working towards something. Thanks again!