Nick Parsons

Part 2: Introducing Mongoose to Your Node.js and Restify API

This post is a sequel to Getting Started with MongoDB, Node.js and Restify. We’ll now guide you through the steps needed to modify your API by introducing Mongoose. If you have not yet created the base application, please head back and read the original tutorial.

In this post, we’ll do a deep dive into how to integrate Mongoose, a popular ODM (Object-Document Mapper) for MongoDB, into a simple Restify API. Mongoose is similar to an ORM (Object-Relational Mapper) you would use with a relational database. Both ODMs and ORMs can make your life easier with built-in structure and methods. The structure of an ODM or ORM contains business logic that helps you organize data, and its built-in methods automate common tasks when communicating with the native drivers, helping you work more quickly and efficiently. All of that said, the beauty of a tool like MongoDB is that ODMs are more of a convenience, whereas ORMs are essential for relational databases. MongoDB has many built-in features for helping you organize, analyze, and keep track of your data. In order to harness the added structure and logic that an ODM like Mongoose offers, we are going to show you how to incorporate it into your API.

Mongoose is an ODM that provides a straightforward, schema-based solution to model your application data on top of MongoDB’s native drivers. It includes built-in type casting, validation (which enhances MongoDB’s native document validation), query building, hooks, and more.

Note: If you’d like to jump ahead without following the detailed steps below, the complete git repo for this tutorial can be found on GitHub.

Prerequisites

In order to get up to speed, let’s make sure that you are all set with the following prerequisites:

- An understanding of the original API
- The latest version of Node.js (currently at v8.1.4)
- A Mac (OSX, macOS, etc.
as this tutorial does not cover Windows or Linux)
- git (installed by default on macOS)

Getting Started

This post assumes that you have the original codebase from the previous blog post. Please follow the instructions below to get up and running. I’ve included commands to pull in the example directory from the first post:

```shell
$ git clone <repo-url>   # the repo URL for part one, linked in the original post
$ cp -R mongodb-node-restify-api-part-1 mongodb-node-restify-api-part-2
$ cd mongodb-node-restify-api-part-2 && npm install
```

With the third command above, you have successfully copied the initial codebase into its own directory and installed its dependencies, which enables us to start the migration. To view the directories on your system, use the following command:

```shell
$ ls
```

You should see the following output:

```
mongodb-node-restify-api-part-1
mongodb-node-restify-api-part-2
```

Move into the new directory with the cd command and let’s begin the migration from the raw MongoDB driver to Mongoose:

```shell
$ cd mongodb-node-restify-api-part-2
```

New Dependencies

We’ll need to install additional dependencies in order to add the necessary functionality. Specifically, we’ll be adding mongoose and the mongoose-timestamp plugin to generate/store createdAt and updatedAt timestamps (we’ll touch more on Mongoose plugins later in the post).
```shell
$ npm install --save mongoose mongoose-timestamp
```

Since we’re moving away from the native MongoDB driver over to Mongoose, let’s go ahead and remove the dependency on the MongoDB driver using the following npm command:

```shell
$ npm uninstall mongodb
```

Now, if you view your package.json file, you will see the following JSON:

```json
{
  "name": "rest-api",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node index.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "Nick Parsons",
  "license": "ISC",
  "dependencies": {
    "mongoose": "^4.11.1",
    "mongoose-timestamp": "^0.6.0",
    "restify": "^4.3.1"
  }
}
```

Mongoose Schemas & Models

When you’re developing an application backend using Mongoose, your document design starts with what is called a schema. Each schema in Mongoose maps to a specific MongoDB collection. From each schema, Mongoose compiles a model: a constructor whose instances represent MongoDB documents that can be saved to and retrieved from your database. All document creation and retrieval from MongoDB is handled by a specific model.

It’s important to know that schemas are extremely flexible and allow for the same nested structure as the native MongoDB driver would support. Furthermore, schemas support business logic such as validation, pre/post hooks, plugins, and more – all of which is outlined in the official Mongoose guide.

In the following steps, we’ll be adding two schema definitions to our codebase and, in turn, we will import them into our API routes for querying and document creation. The first model will be used to store all user data and the second will be used to store all associated todo items. This will create a functional and flexible structure for our API.
Schema/Model Creation

Assuming you’re inside of the root directory, create a new directory called models with a user.js and todo.js file:

```shell
$ mkdir models && cd models && touch user.js todo.js
```

Next, let’s go ahead and modify our models/user.js and models/todo.js models. The model files should have the following contents:

models/user.js

```javascript
const mongoose = require('mongoose'),
  timestamps = require('mongoose-timestamp')

const UserSchema = new mongoose.Schema({
  email: {
    type: String,
    trim: true,
    lowercase: true,
    unique: true,
    required: true,
  },
  name: {
    first: {
      type: String,
      trim: true,
      required: true,
    },
    last: {
      type: String,
      trim: true,
      required: true,
    },
  },
}, { collection: 'users' })

UserSchema.plugin(timestamps)

module.exports = exports = mongoose.model('User', UserSchema)
```

models/todo.js

```javascript
const mongoose = require('mongoose'),
  timestamps = require('mongoose-timestamp')

const TodoSchema = new mongoose.Schema({
  userId: {
    type: mongoose.Schema.Types.ObjectId,
    ref: 'User',
    index: true,
    required: true,
  },
  todo: {
    type: String,
    trim: true,
    required: true,
  },
  status: {
    type: String,
    enum: [
      'pending',
      'in progress',
      'complete',
    ],
    default: 'pending',
  },
}, { collection: 'todos' })

TodoSchema.plugin(timestamps)

module.exports = exports = mongoose.model('Todo', TodoSchema)
```

Note: We’re using the mongoose-timestamp plugin by calling SchemaName.plugin(timestamps). This allows us to automatically generate createdAt and updatedAt timestamps and indexes without having to add additional code to our schema files. A full breakdown of schema plugins can be found in the Mongoose docs.

Route Creation

The /routes directory will hold our user.js and todo.js files. For the sake of simplicity, you can copy and paste the following file contents into your todo.js file and overwrite the previous code. If you compare the two files, you’ll notice there is a slight change in the way that we call MongoDB using Mongoose.
Specifically, Mongoose acts as an abstraction layer over our database model, piping operations through to the native MongoDB driver with validation in between. Lastly, we’ll need to create a new file called user.js:

```shell
$ cd ../routes
$ touch user.js
```

routes/user.js

```javascript
const User = require('../models/user'),
  Todo = require('../models/todo')

module.exports = function(server) {
  /**
   * Create
   */'/users', (req, res, next) => {
    let data = req.body || {}

    User.create(data)
      .then(user => {
        res.send(200, user)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * List
   */
  server.get('/users', (req, res, next) => {
    let limit = parseInt(req.query.limit, 10) || 10, // default limit to 10 docs
      skip = parseInt(req.query.skip, 10) || 0, // default skip to 0 docs
      query = req.query || {}

    // remove skip and limit from query to avoid false querying
    delete query.skip
    delete query.limit

    User.find(query).skip(skip).limit(limit)
      .then(users => {
        res.send(200, users)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * Read
   */
  server.get('/users/:userId', (req, res, next) => {
    User.findById(req.params.userId)
      .then(user => {
        res.send(200, user)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * Update
   */
  server.put('/users/:userId', (req, res, next) => {
    let data = req.body || {},
      opts = { new: true }

    User.findByIdAndUpdate({ _id: req.params.userId }, data, opts)
      .then(user => {
        res.send(200, user)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * Delete
   */
  server.del('/users/:userId', (req, res, next) => {
    const userId = req.params.userId

    User.findOneAndRemove({ _id: userId })
      .then(() => {
        // remove associated todos (matched by their userId field) to avoid orphaned data
        Todo.deleteMany({ userId: userId })
          .then(() => {
            res.send(204)
            next()
          })
          .catch(err => {
            res.send(500, err)
          })
      })
      .catch(err => {
        res.send(500, err)
      })
  })
}
```

routes/todo.js

```javascript
const Todo = require('../models/todo')

module.exports = function(server) {
  /**
   * Create
   */'/users/:userId/todos', (req, res, next) => {
    let data = Object.assign({}, { userId: req.params.userId }, req.body)

    Todo.create(data)
      .then(task => {
        res.send(200, task)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * List
   */
  server.get('/users/:userId/todos', (req, res, next) => {
    let limit = parseInt(req.query.limit, 10) || 10, // default limit to 10 docs
      skip = parseInt(req.query.skip, 10) || 0, // default skip to 0 docs
      query = req.params || {}

    // remove skip and limit from query to avoid false querying
    delete query.skip
    delete query.limit

    Todo.find(query).skip(skip).limit(limit)
      .then(tasks => {
        res.send(200, tasks)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * Read
   */
  server.get('/users/:userId/todos/:todoId', (req, res, next) => {
    Todo.findOne({ userId: req.params.userId, _id: req.params.todoId })
      .then(todo => {
        res.send(200, todo)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * Update
   */
  server.put('/users/:userId/todos/:todoId', (req, res, next) => {
    let data = req.body || {},
      opts = { new: true }

    // findOneAndUpdate (rather than update) so { new: true } returns the updated doc
    Todo.findOneAndUpdate({ userId: req.params.userId, _id: req.params.todoId }, data, opts)
      .then(todo => {
        res.send(200, todo)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })

  /**
   * Delete
   */
  server.del('/users/:userId/todos/:todoId', (req, res, next) => {
    Todo.findOneAndRemove({ userId: req.params.userId, _id: req.params.todoId })
      .then(() => {
        res.send(204)
        next()
      })
      .catch(err => {
        res.send(500, err)
      })
  })
}
```

Entry Point

Our updated entry point for this API is in /index.js.
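One subtlety in the todo Create handler is worth calling out: Object.assign copies left to right, so later sources win. Because req.body is the last argument, a client that sends its own userId in the body would override the one taken from the URL; if that matters for your API, put the trusted value last. A quick illustration in plain JavaScript:

```javascript
// Object.assign merges left to right: later sources overwrite earlier ones.
const fromUrl = { userId: 'user-from-url' }
const body = { todo: 'Make a pizza!', userId: 'user-from-body' }

// As written in the route: the body can override the URL's userId.
const asWritten = Object.assign({}, fromUrl, body)
console.log(asWritten.userId) // 'user-from-body'

// Trusted value last: the URL parameter always wins.
const trusted = Object.assign({}, body, fromUrl)
console.log(trusted.userId) // 'user-from-url'
```

For a tutorial API either ordering works; for anything production-facing, the second form is the safer default.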
Your index.js file should mirror the following: /** * Module Dependencies */ const restify = require('restify'), mongoose = require('mongoose') /** Config */ const config = require('./config') /** Initialize Server */ const server = restify.createServer({ name :, version : config.version }) /** Bundled Plugins ( */ server.use(restify.jsonBodyParser({ mapParams: true })) server.use(restify.acceptParser(server.acceptable)) server.use(restify.queryParser({ mapParams: true })) server.use(restify.fullResponse()) /** Start Server, Connect to DB & Require Route Files */ server.listen(config.port, () => { /** Connect to MongoDB via Mongoose */ const opts = { promiseLibrary: global.Promise, server: { auto_reconnect: true, reconnectTries: Number.MAX_VALUE, reconnectInterval: 1000, }, config: { autoIndex: true, }, } mongoose.Promise = opts.promiseLibrary mongoose.connect(config.db.uri, opts) const db = mongoose.connection db.on('error', (err) => { if (err.message.code === 'ETIMEDOUT') { console.log(err) mongoose.connect(config.db.uri, opts) } }) db.once('open', () => { require('./routes/user')(server) require('./routes/todo')(server) console.log(`Server is listening on port ${config.port}`) }) }) Starting the Server Now that we’ve modified the code to use Mongoose, let’s go ahead and run the npm start command from your terminal: $ npm start Assuming all went well, you should see the following output: Server is listening on port 3000 Using the API The API is almost identical to the API written in the "getting started" post , however, in this version we have introduced the concept of “users” who are owners of “todo” items. I encourage you to experiment with the new API endpoints using Postman to better understand the API endpoint structure. 
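The connection options in index.js retry at a fixed 1000 ms interval. If you ever roll your own reconnect loop (as the ETIMEDOUT handler hints at), a common refinement is capped exponential backoff, where the delay doubles per attempt up to a maximum. A small sketch of the delay schedule (the helper name is ours, not a driver API):

```javascript
// Capped exponential backoff: delay doubles per attempt up to a maximum.
function backoffDelay(attempt, baseMs, capMs) {
  return Math.min(capMs, baseMs * Math.pow(2, attempt))
}

// Delay schedule for the first five attempts with a 1s base and 10s cap.
const delays = [0, 1, 2, 3, 4].map(a => backoffDelay(a, 1000, 10000))
console.log(delays) // [1000, 2000, 4000, 8000, 10000]
```

For this tutorial the driver's fixed interval is perfectly fine; the sketch just shows what you would reach for if reconnect storms ever became a concern.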
For your convenience, below are the available cURL calls for your updated API endpoints:

User Endpoints

```shell
# CREATE
curl -i -X POST http://localhost:3000/users -H 'content-type: application/json' -d '{ "email": "", "name": { "first": "Nick", "last": "Parsons" }}'

# LIST
curl -i -X GET http://localhost:3000/users -H 'content-type: application/json'

# READ
curl -i -X GET http://localhost:3000/users/$USER_ID -H 'content-type: application/json'

# UPDATE
curl -i -X PUT http://localhost:3000/users/$USER_ID -H 'content-type: application/json' -d '{ "email": "" }'

# DELETE
curl -i -X DELETE http://localhost:3000/users/$USER_ID -H 'content-type: application/json'
```

Todo Endpoints

```shell
# CREATE
curl -i -X POST http://localhost:3000/users/$USER_ID/todos -H 'content-type: application/json' -d '{ "todo": "Make a pizza!" }'

# LIST
curl -i -X GET http://localhost:3000/users/$USER_ID/todos -H 'content-type: application/json'

# READ
curl -i -X GET http://localhost:3000/users/$USER_ID/todos/$TODO_ID -H 'content-type: application/json'

# UPDATE
curl -i -X PUT http://localhost:3000/users/$USER_ID/todos/$TODO_ID -H 'content-type: application/json' -d '{ "status": "in progress" }'

# DELETE
curl -i -X DELETE http://localhost:3000/users/$USER_ID/todos/$TODO_ID -H 'content-type: application/json'
```

Note: The $USER_ID and $TODO_ID placeholders in the URLs denote that the URL parameter should be replaced with a value. In our case, it will likely be a MongoDB ObjectId.

Final Thoughts

I hope this short tutorial on adding Mongoose to your API was helpful for future development. Hopefully, you noticed how using a tool like Mongoose can simplify writing MongoDB functionality as a layer on top of your API. Mongoose is only a single addition to keep in mind as you develop and hone your API development skills; we’ll continue to release more posts with other examples and look forward to hearing your feedback. If you have any questions or run into issues, please comment below.
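Since $USER_ID and $TODO_ID are expected to be MongoDB ObjectIds, a malformed value will make Mongoose's cast fail and the routes above will answer with a 500. A cheap guard is to check the parameter's shape before querying: an ObjectId string is exactly 24 hexadecimal characters. This little validation helper is our addition, not part of the tutorial's code:

```javascript
// An ObjectId string is exactly 24 hexadecimal characters.
function looksLikeObjectId(value) {
  return typeof value === 'string' && /^[0-9a-fA-F]{24}$/.test(value)
}

console.log(looksLikeObjectId('596f2a41c8a5f34e08a3b2f1')) // true
console.log(looksLikeObjectId('not-an-id'))                // false
```

Dropping a check like this at the top of a route handler lets you return a 400 for garbage input instead of surfacing a cast error as a 500.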
In my next post, I’ll show you how to create a similar application from start to finish using MongoDB Stitch, our new Backend as a Service. You'll get to see how abstracting away this API in favor of using Stitch will make it easier to add additional functionality such as database communication, authentication, and authorization, so you can focus on what matters – the user experience on top of your API.

July 21, 2017

Getting Started with MongoDB, Node.js and Restify

When building a REST API, it is important to choose a framework that will help you work quickly and easily through the process. This can be impacted by the actual speed of the framework, but also by the amount of knowledge and documentation that exists, so that you can spend less time working through roadblocks and more time working on critical components. In this post, we’ll be covering how to set up a simple API with full CRUD operations using the ever-popular Restify framework backed by a MongoDB Atlas database. For the sake of simplicity, we’ll create a basic “to-do” style API.

Installing Node.js

Let’s get started by installing Node.js (if you already have Node.js installed, feel free to skip to the next section of this post). If you are on a Mac, the easiest way to install Node.js is via Homebrew, a package manager for macOS. To do so, simply run the following command in your terminal and you’ll be ready to go:

```shell
$ brew install node
```

If you already have Node.js installed, make sure you’re running the latest version by running:

```shell
$ brew upgrade node
```

Once installed, you can run the following command to verify that node has been installed (please note that I’m currently running v7.10.0):

```shell
$ node --version
```

The output will display the latest version installed, which will be v7.10.0 or above:

```
v7.10.0
```

Note: If you’re on Windows or Linux, please read the installation instructions on the official Node.js website.

Getting Started with MongoDB Atlas

The easiest way to get started using MongoDB is with MongoDB Atlas, a managed service from MongoDB that facilitates seamless scaling and operations of your deployments in the cloud. It is super simple to set up and operate, and, best of all, there’s a free tier for small deployments – no credit card required. To get started, head on over to the MongoDB Atlas site and click the “Start Free” button. From there, we’ll need to create a “group” to hold our free cluster. Feel free to name your group whatever you’d like.
In this example, we'll work with a group called young-sea-3503. Once your group is configured, you’ll be prompted to set up your first cluster. Simply follow the steps and choose the “M0” instance size for the free tier. Once you deploy your instance, it should be available within about five minutes.

Create Our Application Directory

Let’s create a directory for our application. We will call it “rest-api”. To do so, simply run the following commands in your terminal:

```shell
$ cd ~
$ mkdir rest-api && cd rest-api
```

Pro Tip: The application directory location does not technically matter. I chose your root simply because it’s easy to find. Feel free to specify a different directory path if you’d like.

NPM Package Management

Now that Node.js is set up and we have a database to connect to, we’ll need to install the npm packages restify and mongodb (the official MongoDB driver for Node.js). When you installed Node.js, you also installed npm, which makes it super simple to install the necessary dependencies. The best way to manage locally installed npm packages is to create a package.json file.

A package.json file affords you a number of capabilities:

- It documents which packages your project depends on.
- It allows you to specify the versions of a package that your project can use, using semantic versioning rules.
- It makes your build reproducible, which makes it way easier to share with other developers.
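On that second point: the version ranges npm records in package.json (such as ^4.3.1) follow semantic versioning, and a caret range accepts any version that doesn't change the leftmost non-zero part. A rough sketch of that rule in plain JavaScript, for versions with a non-zero major (the helper name is ours, and this deliberately ignores npm's special-casing of 0.x versions):

```javascript
// Rough check of npm's caret (^) range rule for versions with a
// non-zero major: same major version, and at least the base version.
function caretSatisfies(base, candidate) {
  const [bMaj, bMin, bPat] = base.split('.').map(Number)
  const [cMaj, cMin, cPat] = candidate.split('.').map(Number)
  if (cMaj !== bMaj) return false // a major bump breaks the range
  if (cMin !== bMin) return cMin > bMin
  return cPat >= bPat
}

console.log(caretSatisfies('4.3.1', '4.4.0')) // newer minor: allowed
console.log(caretSatisfies('4.3.1', '5.0.0')) // major bump: rejected
```

In practice you never write this yourself — npm applies the rule for you on install — but knowing it explains why a `^` dependency can drift to a newer minor version between installs.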
Installing the Necessary NPM Packages

Now that you understand the need for a package.json file, let’s run an npm command in your terminal to generate the file:

```shell
$ npm init --yes
```

You should see the following output (the contents of your package.json file):

```json
{
  "name": "rest-api",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
```

Pro Tip: Running the init command attempts to make reasonable guesses about how you want your options set, and then writes a package.json file with the properties you select when prompted. If you invoke it with -f, --force, -y, or --yes, it will use only defaults and not prompt you for any options.

Now that we have the package.json file created, open up a terminal, navigate to the directory that contains your Node.js application, and run the following command:

```shell
$ npm install mongodb restify --save
```

The command above will automatically install the mongodb and restify node modules from the npm registry.

Create a Configuration File

To keep things clean, I generally like to abstract important and/or reusable information into a configuration file. With that said, let’s create a file called config.js with the following command:

```shell
$ touch config.js
```

The contents of the file should look something like this:

```javascript
'use strict'

module.exports = {
  name: 'rest-api',
  version: '0.0.1',
  env: process.env.NODE_ENV || 'development',
  port: process.env.PORT || 3000,
  db: {
    uri: 'YOUR_MONGODB_CONNECTION_STRING',
  },
}
```

With our config.js file in place, grab the connection string so we can add it to your configuration file. To do so, head back over to the MongoDB Atlas console and click on the “connect” button.
Once clicked, you will be presented with a screen containing the connection string for your MongoDB Atlas cluster. Click “copy” and replace “YOUR_MONGODB_CONNECTION_STRING” in your config.js file with your provided connection string. Here's an example of our config.js file with a MongoDB Atlas connection string specifying the api database using our username and password (the $-prefixed tokens are placeholders for your own cluster's values):

```javascript
module.exports = {
  name: 'rest-api',
  version: '0.0.1',
  env: process.env.NODE_ENV || 'development',
  port: process.env.PORT || 3000,
  db: {
    uri: 'mongodb://$USERNAME:$PASSWORD@$HOST1,$HOST2,$HOST3/api?ssl=true&replicaSet=$REPLICA_SET&authSource=admin',
  },
}
```

Note: You will need to manually drop in the password you created when you initially built your cluster. For security purposes the password is not visible in the console. Additionally, it’s best to specify a user with read, write, or read and write access (depending on your needs). For the purpose of this tutorial, we have named the database “api”.

Whitelist your IP Address

Because security is top of mind with MongoDB Atlas, we will need to whitelist our IP address in order to connect to the cluster. MongoDB Atlas makes this super simple by providing a button that says “Add Current IP Address” just above where you previously copied your connection string from.

Create the index.js File

The index.js file will serve as the main entrypoint to your Node.js API.
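One gotcha worth noting when dropping your password into the connection string (our observation, not something Atlas calls out on that screen): characters that are special in URLs, such as @ or :, must be percent-encoded or they will break the URI. A small sketch, using a hypothetical host name:

```javascript
// Build a MongoDB connection URI, percent-encoding the credentials so
// characters like '@' or ':' in a password can't break the URL.
function buildMongoUri(user, password, host, dbName) {
  const u = encodeURIComponent(user)
  const p = encodeURIComponent(password)
  return `mongodb://${u}:${p}@${host}/${dbName}`
}

console.log(buildMongoUri('nick', 'p@ss:word', 'cluster0-example.mongodb.net', 'api'))
// → mongodb://nick:p%40ss%3Aword@cluster0-example.mongodb.net/api
```

If your password is plain alphanumerics this is a no-op, but it saves a confusing connection failure when it isn't.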
To get started, let’s go ahead and create the file using the following command:

```shell
$ touch index.js
```

Next, let’s drop in some boilerplate code so that your application will kick off the Restify server process:

```javascript
'use strict'

/** Module Dependencies */
const config = require('./config'),
  restify = require('restify'),
  mongodb = require('mongodb').MongoClient

/** Initialize Server */
const server = restify.createServer({
  name:,
  version: config.version,

/** Bundled Plugins */
server.use(restify.jsonBodyParser({ mapParams: true }))
server.use(restify.acceptParser(server.acceptable))
server.use(restify.queryParser({ mapParams: true }))
server.use(restify.fullResponse())

/** Lift Server, Connect to DB & Require Route File */
server.listen(config.port, () => {
  // establish connection to mongodb atlas
  mongodb.connect(config.db.uri, (err, db) => {
    if (err) {
      console.log('An error occurred while attempting to connect to MongoDB', err)
      process.exit(1)
    }

    console.log(
      '%s v%s ready to accept connections on port %s in %s environment.',,
      config.version,
      config.port,
      config.env
    )

    require('./routes')({ db, server })
  })
})
```

Setting Up Routes

Next, let’s create our routes for CRUD operations. With a fully fledged application, you would generally have dozens of route files; however, for our case, we’ll keep it simple and only have one route file called routes.js. Let’s create the file by running the following command from your terminal:

```shell
$ touch routes.js
```

Now that you’ve successfully created your routes file, let’s start populating the file with your actual endpoints!
Below is boilerplate code to drop into your routes.js file (all of which is commented and should be self-explanatory):

```javascript
'use strict'

const ObjectID = require('mongodb').ObjectID

module.exports = function(ctx) {
  // extract context from passed in object
  const db = ctx.db,
    server = ctx.server

  // assign collection to variable for further use
  const collection = db.collection('todos')

  /**
   * Create
   */'/todos', (req, res, next) => {
    // extract data from body and add timestamps
    const data = Object.assign({}, req.body, {
      created: new Date(),
      updated: new Date(),

    // insert one object into todos collection
    collection.insertOne(data)
      .then(doc => res.send(200, doc.ops[0]))
      .catch(err => res.send(500, err))

    next()
  })

  /**
   * Read
   */
  server.get('/todos', (req, res, next) => {
    let limit = parseInt(req.query.limit, 10) || 10, // default limit to 10 docs
      skip = parseInt(req.query.skip, 10) || 0, // default skip to 0 docs
      query = req.query || {}

    // remove skip and limit from query to avoid false querying
    delete query.skip
    delete query.limit

    // find todos and convert to array (with optional query, skip and limit)
    collection.find(query).skip(skip).limit(limit).toArray()
      .then(docs => res.send(200, docs))
      .catch(err => res.send(500, err))

    next()
  })

  /**
   * Update
   */
  server.put('/todos/:id', (req, res, next) => {
    // extract data from body and add timestamps
    const data = Object.assign({}, req.body, {
      updated: new Date(),

    // build out findOneAndUpdate variables to keep things organized
    // (the id from the URL is a string, so wrap it in an ObjectID to match)
    let query = { _id: new ObjectID( },
      body = { $set: data },
      opts = {
        returnOriginal: false,
        upsert: true,

    // find and update document based on passed in id (via route)
    collection.findOneAndUpdate(query, body, opts)
      .then(doc => res.send(204))
      .catch(err => res.send(500, err))

    next()
  })

  /**
   * Delete
   */
  server.del('/todos/:id', (req, res, next) => {
    // remove one document based on passed in id (via route)
    collection.findOneAndDelete({ _id: new ObjectID( })
      .then(doc => res.send(204))
      .catch(err => res.send(500, err))

    next()
  })
}
```

Verifying our API Functionality with Postman

At its core, Postman is a powerful GUI platform built to make your API development faster and easier – from building API requests to testing, documenting, and sharing with teammates. It’s one of my favorite tools (and should be yours, too).

Start the server. In the root directory where your index.js file lives, simply type the following command:

```shell
$ node index.js
```

This will kick off the Node.js process and start your server on port 3000 (as defined in our config.js file) on localhost (http://localhost:3000). If all goes well, you should see the following output in your console:

```
rest-api v0.0.1 ready to accept connections on port 3000 in development environment.
```

To ensure that our API properly accepts the request, we’ll need to confirm that the Content-Type header is set to application/json. Now that we have that set, let’s go ahead and create a document in our MongoDB “todos” collection in the “api” database by hitting the /todos endpoint with a POST request and a JSON body from Postman. Choose the “raw” body type and “JSON (application/json)” from the dropdown menu. By doing so, you’ll be able to construct your JSON much more easily within Postman. Copy the following snippet and paste it into the “raw” section to be passed along as your JSON payload:

```json
{
  "name": "Nick",
  "task": "Make Moves",
  "status": "pending"
}
```

If you inspect the response payload, you’ll notice that the MongoDB driver automatically inserted an ObjectID (_id). More on the MongoDB ObjectID can be found in the MongoDB documentation. Additionally, our created and updated dates were automatically added to the document by the code we wrote. Next, we can GET all of the documents via the /todos endpoint. If you looked at the GET function, you probably noticed logic for querying and limiting results. This works well because Restify automatically converts query parameters to JSON, allowing us to simply pass in the object as our MongoDB query.
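The querying-plus-pagination logic in that GET handler is plain JavaScript and can be sketched in isolation. Here a plain object stands in for Restify's parsed req.query, and the helper name is ours:

```javascript
// Split a parsed query string into pagination values and a MongoDB filter.
// skip/limit are pulled out (with defaults) so they aren't treated as
// document fields when the remaining object is passed to find().
function parseListQuery(query) {
  const limit = parseInt(query.limit, 10) || 10 // default limit to 10 docs
  const skip = parseInt(query.skip, 10) || 0    // default skip to 0 docs
  const filter = Object.assign({}, query)
  delete filter.skip
  delete filter.limit
  return { limit, skip, filter }
}

const { limit, skip, filter } = parseListQuery({ status: 'pending', limit: '5' })
console.log(limit, skip, filter) // 5 0 { status: 'pending' }
```

So GET /todos?status=pending&limit=5 would page through only the pending documents, while any skip or limit values stay out of the Mongo filter itself.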
The payload response will contain the matching documents. Let’s go ahead and update our task from “pending” to “complete” using the PUT HTTP method. This is similar to our HTTP POST; however, this time we’ll be using a different method and a modified payload. We will also be passing the MongoDB ObjectID as a URL parameter.

Note: Your body should consist of the following to ensure that the status is modified:

```json
{
  "status": "complete"
}
```

Finally, let’s delete our document with the DELETE HTTP method (you’ll need to pass the auto-generated ObjectId from your initial call in the URL). You now have four functional API endpoints to power a todo list in which you can run full CRUD operations. While we covered just the tip of the iceberg, you are well along your way to adding additional functionality using the API/structural patterns provided in this tutorial.

Pro Tip: You can kill your Node.js process at any time by pressing control-c.

Next Steps & Further Reading

Building a todo app is a wonderful way to practice working with APIs and combining technologies. It allows you to try out new technologies in order to see how much information around working with them exists and whether they fit your development style. Please let us know how you found your experience in the comments section or by shooting us an email! Below are a few resources that I found helpful for increasing my knowledge around APIs and MongoDB:

- Getting started with MongoDB Atlas
- Node.js MongoDB Driver API
- NodeSchool
- MongoDB University
- Restify

May 19, 2017