Building Modern Applications with Microservices: Part 1
This is the first post in a two-part series about microservices. It covers the background behind microservices, the new technologies that have enabled them, and the benefits they provide.
Introduction
As enterprises work to replicate the development agility of internet companies and innovate in highly competitive markets, application development has grown increasingly complex. The large, monolithic codebases that traditionally power enterprise applications make it difficult to quickly launch new services. Siloed and potentially distributed development and operations teams present organizational alignment problems. On top of this, users are more demanding than ever – enterprises need to scale effectively and monitor deployments to ensure customers are provided with high performance and a consistent experience. Of course, all this needs to be done while providing always-on service availability.
Due to these trends, there is demand for a software architecture pattern that can handle the requirements of the modern age. Monolithic architectures have been the traditional approach, but limitations in scaling, difficulties in maintaining a large codebase, high-risk upgrades, and large upfront setup costs have compelled enterprises to explore different approaches.
In the last few years, microservices have come to the forefront of the conversation. They have been rapidly adopted due to their ability to provide modularity, scalability, and high availability, as well as to facilitate organizational alignment.
The Monolith
Before microservices, a common approach to application design was the monolithic architecture. In this mode of development, the application is developed, tested, packaged, and deployed as a single unit. Codebases are compiled together, and the application is deployed as one entity. Scaling requires copying instances of the application binaries and the required libraries to different servers, and the application code typically runs as a single process. Continuous delivery — an approach that involves fast, iterative software development and safe updates to the deployed application — is challenging, since the full monolithic application stack needs to be recompiled, relinked, and tested for even the smallest incremental release.
What are Microservices?
Microservices is a software architecture where applications are broken down into small autonomous services. Services are typically focused on a specific, discrete objective or function and decoupled along business boundaries. Separating services by business boundaries allows teams to focus on the right goals and also ensures autonomy between services. Each service is developed, tested, and deployed independently, and services are usually separated as independent processes that communicate over a network via agreed APIs, although in some cases that network may be local to the machine.
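As a minimal sketch of this idea, the snippet below implements a hypothetical `inventory` service in Python using only the standard library: one narrow business function, owned by one service, exposed over HTTP as JSON. The service name, port, and data are illustrative assumptions, not part of any specific product.

```python
# inventory_service.py -- a minimal, self-contained "microservice" sketch.
# It owns one discrete business function (inventory lookups) and exposes it
# over HTTP as JSON, so other services interact with it only through its API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory stand-in for the service's own datastore (illustrative only).
STOCK = {"sku-123": 42, "sku-456": 7}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expected path: /stock/<sku>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "stock" and parts[1] in STOCK:
            body = json.dumps({"sku": parts[1], "quantity": STOCK[parts[1]]}).encode()
            self.send_response(200)
        else:
            body = json.dumps({"error": "not found"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Each microservice runs as its own process; callers reach it over the network.
    HTTPServer(("0.0.0.0", 8080), InventoryHandler).serve_forever()
```

Another service, or any client, would call it with a plain HTTP request (for example, `GET /stock/sku-123`); that agreed API is the only coupling between this service and the rest of the system.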
Microservices grew from Service Oriented Architecture (SOA), which gained popularity in the early 2000s and emerged as a way to combat large monolithic applications. Key differences between SOA and microservices are:
- SOAs are stateful, while microservices are stateless
- SOAs tend to use an enterprise service bus for communication, while microservices use a simpler, more lightweight messaging system
- An SOA service may contain hundreds or thousands of lines of code, while a microservice could have fewer than one hundred
- SOAs put a greater emphasis on reusability (e.g., runtime code, databases), whereas microservices focus on decoupling as much as possible
- A systemic change in an SOA requires modifying the monolith, whereas a systemic change in a microservices architecture typically means creating a new service
- SOAs use traditional relational databases more often, while microservices gravitate towards modern, non-relational databases. Further sections will cover the advantages of non-relational databases over relational databases in a microservices architecture
Many architects found that SOAs suffered from problems with communication protocols and lacked sufficient guidelines on effectively separating services, which laid the foundation for microservices to emerge as a best-practice method for implementing a truly service-oriented architecture.
New Technologies Enable Microservices
In the past, the downsides of deploying and provisioning hundreds, and potentially thousands, of services often outweighed the benefits gained with a microservices architecture (faster development, scalability).
The emergence of technologies such as containers (Docker, LXC) and orchestration frameworks (Kubernetes, Mesos) mitigates many of the problems that previously prevented the use of microservices architectures.
Containers are lightweight run-time environments that provide isolation and scalability with minimal impact on performance and capacity. Packaging is simplified, as the same environment can simultaneously host development, support, test, and production versions of the application, so that going from dev to test to QA to production is easier. Containers work very well in a microservices environment, as they isolate each service in its own container. Updating a service becomes a simple process to automate and manage, and changing one service will not impact other services, provided that APIs are maintained.

Figure 1: Containers in a microservices architecture
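To make the point about isolated, independently updatable services concrete, here is a small sketch using the Docker SDK for Python (`pip install docker`). The image name, service name, and port are hypothetical, and in practice this kind of update is often delegated to an orchestration framework rather than scripted by hand.

```python
# update_service.py -- sketch: replace one containerized service with a newer
# image version without touching any other service. Assumes a local Docker
# daemon and the Docker SDK for Python; image/tag names are hypothetical.
import docker

client = docker.from_env()  # connect to the local Docker daemon

def roll_service(name: str, image: str, port: int) -> None:
    # Stop and remove the currently running container for this service, if any.
    for container in client.containers.list(filters={"name": name}):
        container.stop()
        container.remove()
    # Start the new version; other services are unaffected as long as this
    # service keeps honoring the same API contract.
    client.containers.run(
        image,
        name=name,
        detach=True,
        ports={f"{port}/tcp": port},
    )

if __name__ == "__main__":
    roll_service("inventory", "registry.example.com/inventory:1.1.0", 8080)
```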
When organizations start running containers at scale, many look to orchestration frameworks to help manage the increased complexity. Orchestration frameworks help deploy and manage containers: they provision hosts, instantiate containers, handle failures, and provide automated scaling. Kubernetes and Mesos are popular orchestration frameworks that make it easier to deploy containers at massive scale in a microservices environment.
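As one hedged example of what automated scaling can look like, the sketch below uses the official Kubernetes Python client (`pip install kubernetes`) to change the replica count of a single hypothetical `inventory` Deployment; only that one service grows or shrinks. The Deployment name and namespace are assumptions for illustration.

```python
# scale_service.py -- sketch: scale one microservice's Deployment with the
# Kubernetes Python client. Deployment name and namespace are hypothetical.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    config.load_kube_config()   # use the local kubeconfig credentials
    apps = client.AppsV1Api()
    # Patch only the scale subresource: just this service changes replica
    # count, while every other service keeps its current capacity.
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

if __name__ == "__main__":
    scale_deployment("inventory", "default", replicas=5)
```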
To learn more about building microservices architectures with containers and MongoDB, download our guide: Enabling Microservices: Containers and Orchestration Explained
Benefits of Microservices
Many organizations can better meet the needs of modern application development by implementing microservices. The benefits include:
Faster Time To Market: In a monolithic application, any small change in the application will require redeploying the entire application stack, which carries higher risk and complexity. This results in longer release cycles, as changes may be batched together and not released until reaching a minimum threshold. With microservices, a small change to a service can be committed, tested, and deployed immediately since changes are isolated from the rest of the system.
Continuous integration — a software practice of integrating and testing developer changes against the main code branch multiple times a day — is much simpler and faster, as there are fewer functions to test. This results in a more iterative release cadence, as less code needs to be compiled and retested. Orchestration tools such as Kubernetes facilitate faster time to market by automating the online, rolling upgrade of containers and providing the ability to roll back any changes should they be necessary.
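To make the rolling-upgrade idea concrete, the hedged sketch below patches a Deployment's container image with the Kubernetes Python client; Kubernetes then replaces pods incrementally, and the change can be rolled back (for example, with `kubectl rollout undo`) if something goes wrong. The Deployment, container, and image names are illustrative assumptions.

```python
# rolling_update.py -- sketch: trigger a rolling upgrade of one service by
# changing its container image (Kubernetes Python client; names hypothetical).
from kubernetes import client, config

def set_image(deployment: str, namespace: str, container: str, image: str) -> None:
    config.load_kube_config()
    apps = client.AppsV1Api()
    # Patching the pod template makes Kubernetes roll new pods in gradually,
    # while the old ones keep serving traffic until replacements are ready.
    patch = {
        "spec": {
            "template": {
                "spec": {"containers": [{"name": container, "image": image}]}
            }
        }
    }
    apps.patch_namespaced_deployment(name=deployment, namespace=namespace, body=patch)

if __name__ == "__main__":
    set_image("inventory", "default", "inventory", "registry.example.com/inventory:1.2.0")
```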
Flexibility and Scalability: Monolithic applications require all components of the system to scale together. If one service requires extra performance, the only option is to scale all the services rather than the individual service that needs additional capacity. With microservices, only the services that require extra performance need to be scaled. Scaling is achieved by deploying more containers, enabling more effective capacity planning, lower software licensing costs, and lower TCO, as the service and hardware can be matched more appropriately.

Figure 2: Scaling containers
Resiliency: A major issue with monolithic applications is that if a service fails, the whole application may be compromised. In microservices, service boundaries serve as natural isolation barriers to prevent cascading failures from bringing down the whole system. If using containers, orchestration frameworks can provide added resiliency: when one container fails, a new one is started, restoring full redundancy and capacity.
Alignment With Organization: Microservices enable better alignment of the architecture to the organization, as team sizes can be optimally defined to match the required tasks. Teams can be broken down into smaller groups that focus on a single component of the application. This is especially useful for distributed teams. For example, if a team in Singapore handles three services while a team in San Francisco handles five services, each team can release and deploy features and functionality independently. This helps break down silos between teams and fosters better collaboration, as cross-discipline teams (Ops, Dev, QA) collectively own the services. This also ensures that the communication between teams matches the communication through the services' APIs. Essentially, the APIs between services define a contract between development teams on what each service should provide to others.
Reduction in Cost: By using containers, applications and environments (design, test, production, support) can better share the same infrastructure, resulting in increased hardware utilization and reduced costs through administrative simplification. Microservices also help reduce technical debt. With a monolithic application, there are costs (time, resources) associated with refactoring code for a large application. By breaking the application into API-accessible microservices, code refactoring can be done service by service, resulting in less time spent maintaining and updating code.
In the second part of this blog post series, we will discuss how MongoDB enables microservices.
Learn more about MongoDB and microservices. Read the white paper.
About the Author - Jason Ma
Jason is a Principal Product Marketing Manager based in Palo Alto and has extensive experience in technology hardware and software. He previously worked for SanDisk in Corporate Strategy, doing M&A and investments, and as a Product Manager on the Infiniflash All-Flash JBOF. Before SanDisk, he worked as a hardware engineer at Intel and Boeing. Jason has a BSEE from UC San Diego, an MSEE from the University of Southern California, and an MBA from UC Berkeley.