Triggers for OnPremise MongoDB

I have created an on-demand materialized view in my database, and now I need to update it after every change in the source collection. I found out that triggers are not supported for on-premise MongoDB and are only available in Atlas. Since I do not want to move to Atlas, what options do I have?
I am looking for workarounds to achieve this.
So far I have read that the oplog can be used for this purpose; can you help me understand it?
Also, can I update the materialized view through some event?

Check MongoDB Change Streams… they are native to MongoDB and do not need Atlas.


Hi @MWD_Wajih_N_A,

Atlas Triggers use the MongoDB Change Streams API which is a standard feature of modern MongoDB Server versions (3.6+). Change Streams use the replication oplog in their underlying implementation, but provide a stable API to subscribe and react to changes for a single collection, database, or an entire deployment (replica set or sharded cluster). You should use the Change Streams API rather than directly reading the oplog.

Atlas triggers are provided via Atlas’ cluster management interface. You can implement similar logic in an on-premises deployment by creating a persistent application that uses change streams.

The On-Demand Materialised Views documentation includes examples of creating and updating data using the $merge aggregation functionality in MongoDB 4.2+. The documentation examples use the mongo shell for illustration purposes, but you can translate those approaches into any of the supported drivers for MongoDB 4.2+.

You can find examples of working with change streams in different drivers in the MongoDB manual. For example: Open a Change Stream.



I have been exploring the Change Streams API and workarounds for my use case.
What I have understood is that I can detect any change in my collection via the change stream watch function. Now I am trying to figure out how to read two or more fields from a collection, run an aggregation over it, and merge the result into another collection, all triggered by each change stream event.

hi @Stennie_X

I am new to MongoDB and was looking for ways to trigger the on-demand materialised view, defined as a function against one of my MongoDB databases, from the Go driver.

I tried running the aggregation pipeline in Go with $merge, but it does not seem to do anything.

collection := client.Database("poc").Collection("po_new_audit")

matchStg := bson.D{{"$match", bson.D{{"synced", false}}}}
addFieldStg1 := bson.D{{"$addFields", bson.D{{"audit_type", "purchase_order"}}}}
// NOTE: localField is empty here; $lookup will match nothing until it is
// set, and the $unwind below then drops every document.
lookupStg := bson.D{{"$lookup", bson.D{{"from", "user"}, {"localField", ""}, {"foreignField", "_id"}, {"as", "user_details"}}}}
unwindStg1 := bson.D{{"$unwind", "$user_details"}}
addFieldStg2 := bson.D{{"$addFields", bson.D{{"breadcrumbs", bson.D{{"po_id", "$entity_id"}}}}}}
// Use numeric 1 (not the string "1") to mark fields for inclusion.
projectStg := bson.D{{"$project", bson.D{{"_id", 1}, {"what", 1}, {"when", 1}, {"user_details", 1}, {"breadcrumbs", 1}, {"audit_type", 1}}}}
mergeStg := bson.D{{"$merge", bson.D{{"into", "audit"}}}}

c, err := collection.Aggregate(context.TODO(), mongo.Pipeline{matchStg, addFieldStg1, lookupStg, unwindStg1, addFieldStg2, projectStg, mergeStg})
if err != nil {
	log.Fatal(err)
}

var loaded []bson.M
if err = c.All(context.TODO(), &loaded); err != nil {
	log.Fatal(err)
}
The aggregation works up to projectStg and the cursor returns values. Adding $merge does not give an error, but it returns an empty cursor, and the merge target collection does not get updated.

Any pointers will really help.