
Change Streams & Triggers with Node.js Tutorial

Lauren Schaefer · 17 min read · Published Feb 04, 2022 · Updated Aug 24, 2023
Node.js · MongoDB · Change Streams · JavaScript
Sometimes you need to react immediately to changes in your database. Perhaps you want to place an order with a distributor whenever an item's inventory drops below a given threshold. Or perhaps you want to send an email notification whenever the status of an order changes. Regardless of your particular use case, whenever you want to react immediately to changes in your MongoDB database, change streams and triggers are fantastic options.
If you're just joining us in this Quick Start with MongoDB and Node.js series, welcome! We began by walking through how to connect to MongoDB and perform each of the CRUD (Create, Read, Update, and Delete) operations. Then we jumped into more advanced topics like the aggregation framework and transactions. The code we write today will use the same structure as the code we built in the first post in the series, so, if you have any questions about how to get started or how the code is structured, head back to that post.
And, with that, let's dive into change streams and triggers! Here is a summary of what we'll cover today: what change streams are, three ways to monitor them in Node.js (EventEmitter's on(), ChangeStream's hasNext(), and the Stream API), how to resume a change stream, and how to create and fire a MongoDB Atlas trigger.
Prefer a video over an article? Check out the video below that covers the exact same topics that I discuss in this article.
Get started with an M0 cluster on Atlas today. It's free forever, and it's the easiest way to try out the steps in this blog series.

What are Change Streams?

Change streams allow you to receive notifications about changes made to your MongoDB databases and collections. When you use change streams, you can choose to program actions that will be automatically taken whenever a change event occurs.
Change streams utilize the aggregation framework, so you can choose to filter for specific change events or transform the change event documents.
For example, let's say I want to be notified whenever a new listing in the Sydney, Australia market is added to the listingsAndReviews collection. I could create a change stream that monitors the listingsAndReviews collection and use an aggregation pipeline to match on the listings I'm interested in.
Let's take a look at three different ways to implement this change stream.

Set Up

As with all posts in this MongoDB and Node.js Quick Start series, you'll need to ensure you've completed the prerequisite steps outlined in the Set up section of the first post in this series.
I find it helpful to have a script that will generate sample data when I'm testing change streams. To help you quickly generate sample data, I wrote changeStreamsTestData.js. Download a copy of the file, update the uri constant to reflect your Atlas connection info, and run it by executing node changeStreamsTestData.js. The script will do the following:
  1. Create 3 new listings (Opera House Views, Private room in London, and Beautiful Beach House).
  2. Update 2 of those listings (Opera House Views and Beautiful Beach House).
  3. Create 2 more listings (Italian Villa and Sydney Harbour Home).
  4. Delete a listing (Sydney Harbour Home).
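If you're curious what the script does under the hood, here is a condensed sketch (illustrative only; the field values are assumptions based on the output shown later in this post, and the real script may differ):

    const { MongoClient } = require('mongodb');

    async function main() {
        // NOTE: placeholder; the script's uri constant must be updated with your Atlas connection info.
        const uri = "YOUR_ATLAS_CONNECTION_STRING";
        const client = new MongoClient(uri);
        try {
            await client.connect();
            const collection = client.db("sample_airbnb").collection("listingsAndReviews");

            // 1. Create 3 new listings
            await collection.insertMany([
                { name: "Opera House Views", address: { market: "Sydney", country: "Australia" } },
                { name: "Private room in London" },
                { name: "Beautiful Beach House" }
            ]);

            // 2. Update 2 of those listings
            await collection.updateOne({ name: "Opera House Views" }, { $set: { beds: 2 } });
            await collection.updateOne({ name: "Beautiful Beach House" },
                { $set: { address: { market: "Sydney", country: "Australia" } } });

            // 3. Create 2 more listings
            await collection.insertMany([
                { name: "Italian Villa", address: { market: "Cinque Terre", country: "Italy" } },
                { name: "Sydney Harbour Home", address: { market: "Sydney", country: "Australia" } }
            ]);

            // 4. Delete a listing
            await collection.deleteOne({ name: "Sydney Harbour Home" });
        } finally {
            await client.close();
        }
    }

    main().catch(console.error);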

Create a Change Stream

Now that we're set up, let's explore three different ways to work with a change stream in Node.js.

Get a Copy of the Node.js Template

To make following along with this blog post easier, I've created a starter template for a Node.js script that accesses an Atlas cluster.
  1. Download a copy of template.js.
  2. Open template.js in your favorite code editor.
  3. Update the Connection URI to point to your Atlas cluster. If you're not sure how to do that, refer back to the first post in this series.
  4. Save the file as changeStreams.js.
You can run this file by executing node changeStreams.js in your shell. At this point, the file simply opens and closes a connection to your Atlas cluster, so no output is expected. If you see DeprecationWarnings, you can ignore them for the purposes of this post.
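If you'd like a mental model of what template.js contains before you edit it, the structure is roughly this (a sketch; see the first post in the series for the real template):

    const { MongoClient } = require('mongodb');

    async function main() {
        // NOTE: placeholder; replace with your Atlas connection URI.
        const uri = "mongodb+srv://<username>:<password>@<your-cluster-url>/test?retryWrites=true&w=majority";
        const client = new MongoClient(uri);
        try {
            await client.connect();
            // Make the appropriate DB calls
        } finally {
            await client.close();
        }
    }

    main().catch(console.error);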

Create a Helper Function to Close the Change Stream

Regardless of how we monitor changes in our change stream, we will want to close the change stream after a certain amount of time. Let's create a helper function to do just that.
  1. Paste the following function in changeStreams.js.
    function closeChangeStream(timeInMs = 60000, changeStream) {
        return new Promise((resolve) => {
            setTimeout(() => {
                console.log("Closing the change stream");
                resolve(changeStream.close());
            }, timeInMs);
        });
    }
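Note that ChangeStream's close() itself returns a promise, so resolving with its result means the promise returned by closeChangeStream() won't settle until the stream has actually finished closing.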

Monitor Change Stream using EventEmitter's on()

The MongoDB Node.js Driver's ChangeStream class inherits from Node.js's built-in EventEmitter class. As a result, we can use EventEmitter's on() function to add a listener function that will be called whenever a change occurs in the change stream.

Create the Function

Let's create a function that will monitor changes in the change stream using EventEmitter's on().
  1. Continuing to work in changeStreams.js, create an asynchronous function named monitorListingsUsingEventEmitter. The function should have the following parameters: a connected MongoClient, a time in ms that indicates how long the change stream should be monitored, and an aggregation pipeline that the change stream will use.
    async function monitorListingsUsingEventEmitter(client, timeInMs = 60000, pipeline = []) {

    }
  2. Now we need to access the collection we will monitor for changes. Add the following code to monitorListingsUsingEventEmitter().
    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
  3. Now we are ready to create our change stream. We can do so by using Collection's watch(). Add the following line beneath the existing code in monitorListingsUsingEventEmitter().
    const changeStream = collection.watch(pipeline);
  4. Once we have our change stream, we can add a listener to it. Let's log each change event in the console. Add the following line beneath the existing code in monitorListingsUsingEventEmitter().
    changeStream.on('change', (next) => {
        console.log(next);
    });
  5. We could choose to leave the change stream open indefinitely. Instead, let's call our helper function to set a timer and close the change stream. Add the following line beneath the existing code in monitorListingsUsingEventEmitter().
    await closeChangeStream(timeInMs, changeStream);
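Assembled from the steps above, the complete function looks like this:

    async function monitorListingsUsingEventEmitter(client, timeInMs = 60000, pipeline = []) {
        const collection = client.db("sample_airbnb").collection("listingsAndReviews");
        const changeStream = collection.watch(pipeline);
        changeStream.on('change', (next) => {
            console.log(next);
        });
        await closeChangeStream(timeInMs, changeStream);
    }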

Call the Function

Now that we've implemented our function, let's call it!
  1. Inside of main() beneath the comment that says Make the appropriate DB calls, call your monitorListingsUsingEventEmitter() function:
    await monitorListingsUsingEventEmitter(client);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 60 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Output similar to the following will be displayed in your first shell where you are running changeStreams.js.
    {
      _id: { _data: '825DE67A42000000012B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7640004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 1, high_: 1575385666 },
      fullDocument: {
        _id: 5de67a42113ea7de6472e764,
        name: 'Opera House Views',
        summary: 'Beautiful apartment with views of the iconic Sydney Opera House',
        property_type: 'Apartment',
        bedrooms: 1,
        bathrooms: 1,
        beds: 1,
        address: { market: 'Sydney', country: 'Australia' }
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e764 }
    }
    {
      _id: { _data: '825DE67A42000000022B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7650004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 2, high_: 1575385666 },
      fullDocument: {
        _id: 5de67a42113ea7de6472e765,
        name: 'Private room in London',
        property_type: 'Apartment',
        bedrooms: 1,
        bathroom: 1
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e765 }
    }
    {
      _id: { _data: '825DE67A42000000032B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7660004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 3, high_: 1575385666 },
      fullDocument: {
        _id: 5de67a42113ea7de6472e766,
        name: 'Beautiful Beach House',
        summary: 'Enjoy relaxed beach living in this house with a private beach',
        bedrooms: 4,
        bathrooms: 2.5,
        beds: 7,
        last_review: 2019-12-03T15:07:46.730Z
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e766 }
    }
    {
      _id: { _data: '825DE67A42000000042B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7640004' },
      operationType: 'update',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 4, high_: 1575385666 },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e764 },
      updateDescription: {
        updatedFields: { beds: 2 },
        removedFields: []
      }
    }
    {
      _id: { _data: '825DE67A42000000052B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7660004' },
      operationType: 'update',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 5, high_: 1575385666 },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e766 },
      updateDescription: {
        updatedFields: { address: [Object] },
        removedFields: []
      }
    }
    {
      _id: { _data: '825DE67A42000000062B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7670004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 6, high_: 1575385666 },
      fullDocument: {
        _id: 5de67a42113ea7de6472e767,
        name: 'Italian Villa',
        property_type: 'Entire home/apt',
        bedrooms: 6,
        bathrooms: 4,
        address: { market: 'Cinque Terre', country: 'Italy' }
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e767 }
    }
    {
      _id: { _data: '825DE67A42000000072B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7680004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 7, high_: 1575385666 },
      fullDocument: {
        _id: 5de67a42113ea7de6472e768,
        name: 'Sydney Harbour Home',
        bedrooms: 4,
        bathrooms: 2.5,
        address: { market: 'Sydney', country: 'Australia' }
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e768 }
    }
    {
      _id: { _data: '825DE67A42000000082B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7680004' },
      operationType: 'delete',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 8, high_: 1575385666 },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67a42113ea7de6472e768 }
    }
    If you run node changeStreamsTestData.js again before the 60-second timer has completed, you will see similar output.
    After 60 seconds, the following will be displayed:
    Closing the change stream

Call the Function with an Aggregation Pipeline

In some cases, you will not care about all change events that occur in a collection. Instead, you will want to limit what changes you are monitoring. You can use an aggregation pipeline to filter the changes or transform the change stream event documents.
In our case, we only care about new listings in the Sydney, Australia market. Let's create an aggregation pipeline to filter for only those changes in the listingsAndReviews collection.
To learn more about what aggregation pipeline stages can be used with change streams, see the official change streams documentation.
  1. Inside of main() and above your existing call to monitorListingsUsingEventEmitter(), create an aggregation pipeline:
    const pipeline = [
        {
            '$match': {
                'operationType': 'insert',
                'fullDocument.address.country': 'Australia',
                'fullDocument.address.market': 'Sydney'
            }
        }
    ];
  2. Let's use this pipeline to filter the changes in our change stream. Update your existing call to monitorListingsUsingEventEmitter() to only leave the change stream open for 30 seconds and use the pipeline.
    await monitorListingsUsingEventEmitter(client, 30000, pipeline);
  3. Save your file.
  4. Run your script by executing node changeStreams.js in your shell. The change stream will open for 30 seconds.
  5. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Because the change stream is using the pipeline you just created, only documents inserted into the listingsAndReviews collection that are in the Sydney, Australia market will be in the change stream. Output similar to the following will be displayed in your first shell where you are running changeStreams.js.
    {
      _id: { _data: '825DE67CED000000012B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67CED150EA2DF172344370004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 1, high_: 1575386349 },
      fullDocument: {
        _id: 5de67ced150ea2df17234437,
        name: 'Opera House Views',
        summary: 'Beautiful apartment with views of the iconic Sydney Opera House',
        property_type: 'Apartment',
        bedrooms: 1,
        bathrooms: 1,
        beds: 1,
        address: { market: 'Sydney', country: 'Australia' }
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67ced150ea2df17234437 }
    }
    {
      _id: { _data: '825DE67CEE000000032B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67CEE150EA2DF1723443B0004' },
      operationType: 'insert',
      clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 3, high_: 1575386350 },
      fullDocument: {
        _id: 5de67cee150ea2df1723443b,
        name: 'Sydney Harbour Home',
        bedrooms: 4,
        bathrooms: 2.5,
        address: { market: 'Sydney', country: 'Australia' }
      },
      ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' },
      documentKey: { _id: 5de67cee150ea2df1723443b }
    }
    After 30 seconds, the following will be displayed:
    Closing the change stream

Monitor Change Stream using ChangeStream's hasNext()

In the section above, we used EventEmitter's on() to monitor the change stream. Alternatively, we can create a while loop that waits for the next element in the change stream by using hasNext() from MongoDB Node.js Driver's ChangeStream class.

Create the Function

Let's create a function that will monitor changes in the change stream using ChangeStream's hasNext().
  1. Continuing to work in changeStreams.js, create an asynchronous function named monitorListingsUsingHasNext. The function should have the following parameters: a connected MongoClient, a time in ms that indicates how long the change stream should be monitored, and an aggregation pipeline that the change stream will use.
    async function monitorListingsUsingHasNext(client, timeInMs = 60000, pipeline = []) {

    }
  2. Now we need to access the collection we will monitor for changes. Add the following code to monitorListingsUsingHasNext().
    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
  3. Now we are ready to create our change stream. We can do so by using Collection's watch(). Add the following line beneath the existing code in monitorListingsUsingHasNext().
    const changeStream = collection.watch(pipeline);
  4. We could choose to leave the change stream open indefinitely. Instead, let's call our helper function that will set a timer and close the change stream. Add the following line beneath the existing code in monitorListingsUsingHasNext().
    closeChangeStream(timeInMs, changeStream);
  5. Now let's create a while loop that waits for new changes in the change stream, using ChangeStream's hasNext(). hasNext() will wait to return true until a new change arrives in the change stream, and it will throw an error as soon as the change stream is closed, so we will surround our while loop with a try/catch block. If an error is thrown, we'll check whether the change stream is closed. If it is, we'll log that information; otherwise, something unexpected happened, so we'll rethrow the error. Add the following code beneath the existing code in monitorListingsUsingHasNext().
    try {
        while (await changeStream.hasNext()) {
            console.log(await changeStream.next());
        }
    } catch (error) {
        if (changeStream.isClosed()) {
            console.log("The change stream is closed. Will not wait on any more changes.");
        } else {
            throw error;
        }
    }
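Assembled from the steps above, the complete function looks like this:

    async function monitorListingsUsingHasNext(client, timeInMs = 60000, pipeline = []) {
        const collection = client.db("sample_airbnb").collection("listingsAndReviews");
        const changeStream = collection.watch(pipeline);
        // Note: not awaited, so the timer runs while we wait on changes below.
        closeChangeStream(timeInMs, changeStream);
        try {
            while (await changeStream.hasNext()) {
                console.log(await changeStream.next());
            }
        } catch (error) {
            if (changeStream.isClosed()) {
                console.log("The change stream is closed. Will not wait on any more changes.");
            } else {
                throw error;
            }
        }
    }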

Call the Function

Now that we've implemented our function, let's call it!
  1. Inside of main(), replace your existing call to monitorListingsUsingEventEmitter() with a call to your new monitorListingsUsingHasNext():
    await monitorListingsUsingHasNext(client);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 60 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Output similar to what we saw earlier will be displayed in your first shell where you are running changeStreams.js. If you run node changeStreamsTestData.js again before the 60-second timer has completed, you will see similar output again. After 60 seconds, the following will be displayed:
    Closing the change stream

Call the Function with an Aggregation Pipeline

As we discussed earlier, sometimes you will want to use an aggregation pipeline to filter the changes in your change stream or transform the change stream event documents. Let's pass the aggregation pipeline we created in an earlier section to our new function.
  1. Update your existing call to monitorListingsUsingHasNext() to only leave the change stream open for 30 seconds and use the aggregation pipeline.
    await monitorListingsUsingHasNext(client, 30000, pipeline);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 30 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Because the change stream is using the pipeline you just created, only documents inserted into the listingsAndReviews collection that are in the Sydney, Australia market will be in the change stream. Output similar to what we saw earlier while using a change stream with an aggregation pipeline will be displayed in your first shell where you are running changeStreams.js. After 30 seconds, the following will be displayed:
    Closing the change stream

Monitor Change Stream using the Stream API

In the previous two sections, we used EventEmitter's on() and ChangeStream's hasNext() to monitor changes. Let's examine a third way to monitor a change stream: using Node's Stream API.

Load the Stream Module

In order to use the Stream module, we will need to load it.
  1. Continuing to work in changeStreams.js, load the Stream module at the top of the file.
    const stream = require('stream');

Create the Function

Let's create a function that will monitor changes in the change stream using the Stream API.
  1. Continuing to work in changeStreams.js, create an asynchronous function named monitorListingsUsingStreamAPI. The function should have the following parameters: a connected MongoClient, a time in ms that indicates how long the change stream should be monitored, and an aggregation pipeline that the change stream will use.
    async function monitorListingsUsingStreamAPI(client, timeInMs = 60000, pipeline = []) {

    }
  2. Now we need to access the collection we will monitor for changes. Add the following code to monitorListingsUsingStreamAPI().
    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
  3. Now we are ready to create our change stream. We can do so by using Collection's watch(). Add the following line beneath the existing code in monitorListingsUsingStreamAPI().
    const changeStream = collection.watch(pipeline);
  4. Now we're ready to monitor our change stream. ChangeStream's stream() will return a Node Readable stream. We will call Readable's pipe() to pull the data out of the stream and write it to the console.
    changeStream.stream().pipe(
        new stream.Writable({
            objectMode: true,
            write: function (doc, _, cb) {
                console.log(doc);
                cb();
            }
        })
    );
  5. We could choose to leave the change stream open indefinitely. Instead, let's call our helper function that will set a timer and close the change stream. Add the following line beneath the existing code in monitorListingsUsingStreamAPI().
    await closeChangeStream(timeInMs, changeStream);
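Assembled from the steps above, the complete function looks like this:

    async function monitorListingsUsingStreamAPI(client, timeInMs = 60000, pipeline = []) {
        const collection = client.db("sample_airbnb").collection("listingsAndReviews");
        const changeStream = collection.watch(pipeline);
        changeStream.stream().pipe(
            new stream.Writable({
                objectMode: true,
                write: function (doc, _, cb) {
                    console.log(doc);
                    cb();
                }
            })
        );
        await closeChangeStream(timeInMs, changeStream);
    }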

Call the Function

Now that we've implemented our function, let's call it!
  1. Inside of main(), replace your existing call to monitorListingsUsingHasNext() with a call to your new monitorListingsUsingStreamAPI():
    await monitorListingsUsingStreamAPI(client);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 60 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Output similar to what we saw earlier will be displayed in your first shell where you are running changeStreams.js. If you run node changeStreamsTestData.js again before the 60-second timer has completed, you will see similar output again. After 60 seconds, the following will be displayed:
    Closing the change stream

Call the Function with an Aggregation Pipeline

As we discussed earlier, sometimes you will want to use an aggregation pipeline to filter the changes in your change stream or transform the change stream event documents. Let's pass the aggregation pipeline we created in an earlier section to our new function.
  1. Update your existing call to monitorListingsUsingStreamAPI() to only leave the change stream open for 30 seconds and use the aggregation pipeline.
    await monitorListingsUsingStreamAPI(client, 30000, pipeline);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 30 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Because the change stream is using the pipeline you just created, only documents inserted into the listingsAndReviews collection that are in the Sydney, Australia market will be in the change stream. Output similar to what we saw earlier while using a change stream with an aggregation pipeline will be displayed in your first shell where you are running changeStreams.js. After 30 seconds, the following will be displayed:
    Closing the change stream

Resume a Change Stream

At some point, your application will likely lose the connection to the change stream. Perhaps a network error will occur and a connection between the application and the database will be dropped. Or perhaps your application will crash and need to be restarted (but you're a 10x developer and that would never happen to you, right?).
In those cases, you may want to resume the change stream where you previously left off so you don't lose any of the change events.
Each change stream event document contains a resume token. The Node.js driver automatically stores the resume token in the _id of the change event document.
The application can pass the resume token when creating a new change stream. The change stream will include all events that happened after the event associated with the given resume token.
The MongoDB Node.js driver will automatically attempt to reestablish connections in the event of transient network errors or elections. In those cases, the driver will use its cached copy of the most recent resume token so that no change stream events are lost.
In the event of an application failure or restart, the application will need to pass the resume token when creating the change stream in order to ensure no change stream events are lost. Keep in mind that the driver will lose its cached copy of the most recent resume token when the application restarts, so your application should store the resume token.
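To make that concrete, here is a minimal sketch of resuming from a stored token, assuming you persist it to a local file named resumeToken.json (the file name and the monitorWithResume function are illustrative, not part of this series' code):

    const fs = require('fs');

    async function monitorWithResume(client, pipeline = []) {
        const collection = client.db("sample_airbnb").collection("listingsAndReviews");

        // Load a previously stored resume token, if one exists.
        const options = {};
        if (fs.existsSync('resumeToken.json')) {
            options.resumeAfter = JSON.parse(fs.readFileSync('resumeToken.json', 'utf8'));
        }

        const changeStream = collection.watch(pipeline, options);
        changeStream.on('change', (next) => {
            console.log(next);
            // The resume token lives in the event's _id; store the latest one
            // so a restarted application can pick up where it left off.
            fs.writeFileSync('resumeToken.json', JSON.stringify(next._id));
        });
    }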
For more information and sample code for resuming change streams, see the official documentation.

What are MongoDB Atlas Triggers?

Change streams allow you to react immediately to changes in your database. If you want to monitor changes constantly, keeping the application that watches the change stream running at all times without missing any events is possible... but challenging. This is where MongoDB Atlas triggers come in.
MongoDB supports triggers in Atlas. Atlas triggers allow you to execute functions in real time based on database events (just like change streams) or on scheduled intervals (like a cron job). Atlas triggers have a few big advantages:
  • You don't have to worry about programming the change stream. You simply program the function that will be executed when the database event is fired.
  • You don't have to worry about managing the server where your change stream code is running. Atlas takes care of the server management for you.
  • You get a handy UI to configure your trigger, which means you have less code to write.
Atlas triggers do have a few constraints. The biggest constraint I hit in the past was that functions did not support module imports (i.e. import and require). That has changed, and you can now upload external dependencies that you can use in your functions. See Upload External Dependencies for more information. To learn more about functions and their constraints, see the official Realm Functions documentation.

Create a MongoDB Atlas Trigger

Just as we did in earlier sections, let's look for new listings in the Sydney, Australia market. Instead of working locally in a code editor to create and monitor a change stream, we'll create a trigger in the Atlas web UI.

Create a Trigger

Let's create an Atlas trigger that will monitor the listingsAndReviews collection and call a function whenever a new listing is added in the Sydney, Australia market.
  1. Navigate to your project in Atlas.
  2. In the Data Storage section of the left navigation pane, click Triggers.
  3. Click Add Trigger. The Add Trigger wizard will appear.
  4. In the Link Data Source(s) selection box, select your cluster that contains the sample_airbnb database and click Link. The changes will be deployed. The deployment may take a minute or two. Scroll to the top of the page to see the status.
  5. In the Select a cluster... selection box, select your cluster that contains the sample_airbnb database.
  6. In the Select a database name... selection box, select sample_airbnb.
  7. In the Select a collection name... selection box, select listingsAndReviews.
  8. In the Operation Type section, check the box beside Insert.
  9. In the Function code box, replace the commented code with a call to log the change event. The code should now look like the following:
    exports = function(changeEvent) {
        console.log(JSON.stringify(changeEvent.fullDocument));
    };
  10. We can create a $match statement to filter our change events just as we did earlier with the aggregation pipeline we passed to the change stream in our Node.js script. Expand the ADVANCED (OPTIONAL) section at the bottom of the page and paste the following in the Match Expression code box.
    {
        "fullDocument.address.country": "Australia",
        "fullDocument.address.market": "Sydney"
    }
  11. Click Save. The trigger will be enabled. From that point on, the function to log the change event will be called whenever a new document in the Sydney, Australia market is inserted in the listingsAndReviews collection.

Fire the Trigger

Now that we have the trigger configured, let's create sample data that will fire the trigger.
  1. Return to the shell on your local machine.
  2. Create and update sample data by executing node changeStreamsTestData.js.

View the Trigger Results

When you created the trigger, MongoDB Atlas automatically created a Realm application for you named Triggers_RealmApp.
The function associated with your trigger doesn't currently do much. It simply prints the change event document. Let's view the results in the logs of the Realm app associated with your trigger.
  1. Return to your browser where you are viewing your trigger in Atlas.
  2. In the navigation bar toward the top of the page, click Realm.
  3. In the Applications pane, click Triggers_RealmApp. The Triggers_RealmApp Realm application will open.
  4. In the MANAGE section of the left navigation pane, click Logs. Two entries will be displayed in the Logs pane—one for each of the listings in the Sydney, Australia market that was inserted into the collection.
  5. Click the arrow at the beginning of each row in the Logs pane to expand the log entry. Here you can see the full document that was inserted.
If you insert more listings in the Sydney, Australia market, you can refresh the Logs page to see the change events.

Wrapping Up

Today we explored four different ways to accomplish the same task of reacting immediately to changes in the database. We began by writing a Node.js script that monitored a change stream using Node.js's built-in EventEmitter class. Next we updated the Node.js script to monitor the change stream using the MongoDB Node.js Driver's ChangeStream class. Then we updated the script to monitor the change stream using the Stream API. Finally, we created an Atlas trigger to monitor changes. In all four cases, we were able to use $match to filter the change stream events.
This post included many code snippets that built on code written in the first post of this MongoDB and Node.js Quick Start series. To get a full copy of the code used in today's post, visit the Node.js Quick Start GitHub Repo.
The examples we explored today all did relatively simple things whenever an event was fired: they logged the change events. Change streams and triggers become really powerful when you start doing more in response to change events. For example, you might want to fire alarms, send emails, place orders, update other systems, or do other amazing things.
This is the final post in the Node.js and MongoDB Quick Start Series (at least for now!). I hope you've enjoyed it! If you have ideas for other topics you'd like to see covered, let me know in the MongoDB Community.
