GIANT Stories at MongoDB

Working with MongoDB Stitch Through Existing Drivers – Python & PyMongo

Andrew Morgan

Technical, Cloud

You can now access MongoDB Stitch (and Atlas behind it) using any of the existing MongoDB Drivers – this post shows how to do it using the PyMongo Python Driver.

Working with MongoDB Stitch Through the mongo Shell – MongoDB Wire Protocol Support

Andrew Morgan

Technical, Cloud

The Stitch SDK is the best way to access MongoDB Stitch from your frontend application code – however, you may want to access Stitch from your existing tools (e.g. the mongo shell) or backend code – this post shows you how.

MongoDB Stitch Authentication Triggers

Andrew Morgan

Technical, Cloud

See how Stitch authentication triggers let you use third-party user-authentication services such as Facebook or Google without losing the ability to perform custom actions when users register, sign in, or leave.

Sending Text Messages with MongoDB Stitch & Twilio

Andrew Morgan

Technical, Cloud

How to send text messages from your app using MongoDB Stitch and Twilio.

Testing & Debugging MongoDB Stitch Functions

Andrew Morgan

Technical, Cloud

Testing and debugging serverless functions can be tricky – not so with MongoDB Stitch functions. This post shows how quick and easy it is through the Stitch UI.

Building a REST API with MongoDB Stitch

Andrew Morgan

Technical, Cloud

MongoDB Stitch QueryAnywhere often removes the need to create REST APIs, but for the other times, Stitch webhooks let you create them in minutes.

New to MongoDB Atlas — Data Explorer Now Available for All Cluster Sizes

Ken W. Alger

Release Notes, Cloud

At the recent MongoDB .local Chicago event, MongoDB CTO and Co-Founder Eliot Horowitz made an exciting announcement about the Data Explorer feature of MongoDB Atlas. It is now available for all Atlas cluster sizes, including the free tier.

The easiest way to explore your data

What is the Data Explorer? This powerful feature allows you to query, explore, and take action on your data residing inside MongoDB Atlas (with full CRUD functionality) right from your web browser. Of course, we've thought about security: Data Explorer access, including whether a user can modify documents, is tied to her role within the Atlas Project. Actions performed via the Data Explorer are also logged in the Atlas alerting window.

Bringing this feature to the "shared" Atlas cluster sizes — the free M0s, M2s, and M5s — allows for even faster development. You can now perform actions on your data while developing your application, which is where these shared cluster sizes really shine.

Check out this short video to see the Data Explorer in action.

Atlas is the easiest and fastest way to get started with MongoDB. Deploy a free cluster in minutes.

Recording sensor data with MongoDB Stitch & Electric Imp

Andrew Morgan

Technical, Cloud

Electric Imp devices and cloud services are a great way to get started with IoT. Electric Imp's MongoDB Stitch client library makes it a breeze to integrate with Stitch. This post describes how.

Controlling humidity with a MongoDB Stitch HTTP service and IFTTT

Andrew Morgan

Technical, Cloud

IFTTT is a great cloud service for pairing up cloud and IoT services. This post shows how to invoke an IFTTT webhook from a MongoDB Stitch function, where that webhook controls a dehumidifier via a Smart power plug.

Integrating MongoDB and Amazon Kinesis for Intelligent, Durable Streams

You can build your online, operational workloads atop MongoDB and still respond to events in real time by kicking off Amazon Kinesis stream processing actions, using MongoDB Stitch Triggers.

Let’s look at an example scenario in which a stream of data is being generated as a result of actions users take on a website. We’ll durably store the data and simultaneously feed a Kinesis process to do streaming analytics on something like cart abandonment, product recommendations, or even credit card fraud detection.

We’ll do this by setting up a Stitch Trigger. When relevant data updates are made in MongoDB, the trigger will use a Stitch Function to call out to AWS Kinesis, as you can see in this architecture diagram:

Figure 1. Architecture Diagram

What you’ll need to follow along:

  1. An Atlas instance
    If you don’t already have an application running on Atlas, you can follow our Getting Started with Atlas guide. In this example, we’ll be using a database called streamdata, with a collection called clickdata where we’re writing data from our web-based e-commerce application.
  2. An AWS account and a Kinesis stream
    In this example, we’ll use a Kinesis stream to send data downstream to additional applications such as Kinesis Analytics. This is the stream we want to feed our updates into.
  3. A Stitch application
    If you don’t already have a Stitch application, log into Atlas, and click Stitch Apps from the navigation on the left, then click Create New Application.

Create a Collection

The first step is to create a database and collection from the Stitch application console. Click Rules in the left navigation menu, then click the Add Collection button. Type streamdata for the database name and clickdata for the collection name. Select the template labeled Users can only read and write their own data and provide a field name where we’ll specify the user id.

Figure 2. Create a collection
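Behind the scenes, that template boils down to a rule document that filters reads and writes by the calling user's id. Here's a minimal sketch of the idea, assuming you named the field owner_id – both the field name and the exact rule shape are illustrative, not the literal document the console generates:

{
  "read": { "owner_id": "%%user.id" },    // "%%user.id" expands to the id of the authenticated user
  "write": { "owner_id": "%%user.id" }    // "owner_id" is the hypothetical field name provided above
}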

Configuring Stitch to talk to AWS

Stitch lets you configure Services to interact with external services such as AWS Kinesis. Choose Services from the navigation on the left, click the Add a Service button, select the AWS service, and set the AWS Access Key ID and Secret Access Key.

Figure 3. Service Configuration in Stitch

Services use Rules to specify what aspect of a service Stitch can use, and how. Add a rule which will enable that service to communicate with Kinesis by clicking the button labeled NEW RULE. Name the rule “kinesis” as we’ll be using this specific rule to enable communication with AWS Kinesis. In the section marked Action, select the API labeled Kinesis and select All Actions.

Figure 4. Add a rule to enable integration with Kinesis
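Expressed as a document, the rule above amounts to something like the following sketch (the exact format Stitch stores is an assumption here; in practice, narrowing the actions is safer than All Actions):

{
  "name": "kinesis",
  "actions": ["kinesis:*"]    // "All Actions"; could be restricted to ["kinesis:PutRecord"]
}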

Write a function that uses the service to stream documents into Kinesis

Now that we have a working AWS service, we can use it to put records into a Kinesis stream. The way we do that in Stitch is with Functions. Let’s set up a putKinesisRecord function.

Select Functions from the left-hand menu, and click Create New Function. Provide a name for the function and paste the following in the body of the function.

exports = function(event) {
  // 'aws' is the name we gave the AWS service configured above
  const awsService = context.services.get('aws');
  try {
    // Write the full document from the trigger event to the Kinesis stream;
    // returning the promise surfaces the result of the call to the caller
    return awsService.kinesis().PutRecord({
      Data: JSON.stringify(event.fullDocument),
      StreamName: "stitchStream",
      PartitionKey: "1"
    }).then(function(response) {
      return response;
    });
  } catch (error) {
    console.log(error);
  }
};
Figure 5. Example Function - putKinesisRecord

Test out the function

Let’s make sure everything is working by calling that function manually. From the Function Editor, click Console to view Stitch’s interactive JavaScript console.

Functions called from Triggers require an event. To test execution of our function, we’ll need to pass a dummy event to it. Creating variables from the console in Stitch is simple: just set the value of the variable to a JSON document. For our example, use the following:

event = {
   "operationType": "replace",
   "fullDocument": {
       "color": "black",
       "inventory": {
           "$numberInt": "1"
       },
       "overview": "test document",
       "price": {
           "$numberDecimal": "123"
       },
       "type": "backpack"
   },
   "ns": {
       "db": "streamdata",
       "coll": "clickdata"
   }
}
exports(event);

Paste the above into the console and click the button labeled Run Function As. Select a user and the function will execute.

Ta-da!

Putting it together with Stitch Triggers

We’ve got our MongoDB collection living in Atlas, receiving events from our web app. We’ve got our Kinesis stream ready for data. We’ve got a Stitch Function that can put data into a Kinesis stream.

Configuring Stitch Triggers is so simple it’s almost anticlimactic. Click Triggers from the left navigation, name your trigger, provide the database and collection context, and select the database events that Stitch will react to by executing a function.

For the database and collection, use the names from step one. Now we’ll set the operations we want to watch with our trigger. (Some triggers need to react to all of them – inserts, updates, deletes, and replacements – while others can be more efficient by watching only the operations that logically matter to them.) In our case, we’re going to watch for insert, update, and replace operations.

Now we specify our putKinesisRecord function as the linked function, and we’re done.

Figure 6. Trigger Configuration in Stitch
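For reference, the trigger we just clicked together corresponds roughly to a configuration like this sketch (the trigger name and exact key names are illustrative, not the literal format Stitch stores):

{
  "name": "streamTrigger",                  // hypothetical trigger name
  "type": "DATABASE",
  "config": {
    "database": "streamdata",
    "collection": "clickdata",
    "operation_types": ["INSERT", "UPDATE", "REPLACE"],
    "full_document": true                   // important – putKinesisRecord reads event.fullDocument
  },
  "function_name": "putKinesisRecord"
}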

As part of trigger execution, Stitch will forward details associated with the trigger event, including the full document involved (i.e., the newly inserted, updated, or deleted document from the collection). This is where we can evaluate some condition or attribute of the incoming document and decide whether or not to put the record onto the stream.
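For example, a variant of putKinesisRecord could filter before forwarding. Here's a minimal sketch that only streams backpack documents – the type check and partition key choice are purely illustrative:

exports = function(event) {
  const doc = event.fullDocument;
  // Illustrative filter: ignore events for anything other than backpacks
  if (!doc || doc.type !== "backpack") {
    return;
  }
  return context.services.get('aws').kinesis().PutRecord({
    Data: JSON.stringify(doc),
    StreamName: "stitchStream",
    PartitionKey: doc.type    // spread records across shards by product type
  });
};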

Test the trigger!

Amazon provides a dashboard that lets you view details about the data coming into your stream.

Figure 7. Kinesis Stream Monitoring

As you execute the function from within Stitch, you’ll begin to see the data entering the Kinesis stream.
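You can also pull records straight off the stream with the AWS CLI to inspect the payloads – a quick sketch, assuming a single-shard stream (shard ids and iterator types may differ in your setup):

# Get an iterator for the stream's first shard
aws kinesis get-shard-iterator \
    --stream-name stitchStream \
    --shard-id shardId-000000000000 \
    --shard-iterator-type TRIM_HORIZON

# Fetch records with the iterator returned above (the Data field is base64-encoded)
aws kinesis get-records --shard-iterator <iterator-from-previous-command>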

Building some more functionality

So far our trigger is pretty basic – it watches a collection and when any updates or inserts happen, it feeds the entire document to our Kinesis stream. From here we can build out some more intelligent functionality. To wrap up this post, let’s look at what we can do with the data once it’s been durably stored in MongoDB and placed into a stream.

Once the record is in the Kinesis stream, you can configure additional services downstream to act on the data. A common use case incorporates Amazon Kinesis Data Analytics to perform analytics on the streaming data. Amazon Kinesis Data Analytics offers pre-configured templates to accomplish things like anomaly detection, simple alerts, aggregations, and more.

For example, our stream of data will contain orders resulting from purchases. These orders may originate from point-of-sale systems, as well as from our web-based e-commerce application. Kinesis Analytics can be leveraged to create applications that process the incoming stream of data. For our example, we could build a machine learning algorithm to detect anomalies in the data or create a product performance leaderboard from a sliding or tumbling window of data from our stream.
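To make that concrete, a leaderboard over a tumbling window in Kinesis Data Analytics looks something like this SQL sketch, which counts records per product type every minute (the stream and column names are illustrative and assume the documents our function puts on the stream):

CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" ("type" VARCHAR(32), "product_count" INTEGER);

CREATE OR REPLACE PUMP "STREAM_PUMP" AS
  INSERT INTO "DESTINATION_SQL_STREAM"
  -- STEP(... BY INTERVAL '60' SECOND) defines a one-minute tumbling window
  SELECT STREAM "type", COUNT(*) AS "product_count"
  FROM "SOURCE_SQL_STREAM_001"
  GROUP BY "type", STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '60' SECOND);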

Figure 8. Amazon Data Analytics - Anomaly Detection Example

Wrapping up

Now you can connect MongoDB to Kinesis. From here, you’re able to leverage any one of the many services offered from Amazon Web Services to build on your application. In our next article in the series, we’ll focus on getting the data back from Kinesis into MongoDB. In the meantime, let us know what you’re building with Atlas, Stitch, and Kinesis!

Resources

MongoDB Atlas

MongoDB Stitch

Amazon Kinesis