Serverless architecture is a way of building applications without needing to think about the underlying infrastructure that supports them. In a serverless model, instead of provisioning servers upfront to meet your needs, all you need to do is write code and push it to a serverless platform in the cloud. You don’t have to think about traditional server management (or virtual machines, containers, or any other traditional unit of infrastructure).
This may seem concerning to the traditionalist, but it’s not that radical; developers haven’t had to think about logic gates in processors for a long time. Serverless computing is simply an evolution that moves the abstraction up a layer. Instead of worrying about the underlying infrastructure, developers can focus on solving business problems with code.
When you deploy code with serverless computing, there’s no need to stand up additional servers, virtual machines, or containers when your application takes off - instead, a serverless architecture scales seamlessly with usage.
With serverless computing, costs also scale with usage. This is in stark contrast to traditional server management (IaaS - Infrastructure as a Service), where you buy a server upfront at a high price - and pay that price whether or not the capacity is actually used.
With a serverless model, the costs move in lockstep with usage. When nothing is being executed, you don’t pay for anything. As actual usage goes up, the costs go up along with it. Different serverless platforms have different units of cost - they might call them request units, or compute units, or something else - but they’re typically based on compute usage (memory and/or time), number of requests, data transferred, or some other combination.
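The pay-per-use pricing described above can be sketched in a few lines. This is a toy model only: the unit names, rates, and the request-plus-GB-seconds structure below are hypothetical illustrations, not any vendor's actual pricing.

```python
# Toy illustration of serverless pay-per-use pricing.
# Rates and the pricing structure are hypothetical, not any vendor's actual numbers.

def serverless_cost(requests: int, avg_duration_ms: float, memory_gb: float,
                    price_per_million_requests: float = 0.20,
                    price_per_gb_second: float = 0.0000167) -> float:
    """Cost = request charge + compute charge (measured in GB-seconds)."""
    request_charge = requests / 1_000_000 * price_per_million_requests
    gb_seconds = requests * (avg_duration_ms / 1000) * memory_gb
    compute_charge = gb_seconds * price_per_gb_second
    return request_charge + compute_charge

# Zero usage costs nothing; cost grows with usage.
print(serverless_cost(0, 100, 0.128))          # 0.0
print(serverless_cost(1_000_000, 100, 0.128))  # small, usage-proportional charge
```

The key property is in the first line of output: when nothing executes, the bill is zero, unlike a pre-provisioned instance that costs the same whether idle or busy.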
The serverless model is an efficient way of using and paying for services, while focusing on the important parts of development. Instead of spending your time worrying about capacity or planning an instance tier upgrade in advance, you can just write your code and let it run at the necessary scale.
So we have a general definition of serverless architecture: one where you don’t need to think about infrastructure. But what does that mean in practice?
There are several different major categories that fall under the umbrella of serverless technology. The divisions between them aren’t hard and fast, but it’s useful to look at a few different areas.
FaaS is the serverless model applied to application logic, where each logical function is an individual unit. Your code is deployed to the FaaS platform. When a function is called, the service instantiates it in a container, where it runs as required and produces the appropriate output. When there’s no more demand, the container is torn down. Since FaaS is pure compute, it’s sometimes referred to as serverless compute or serverless computing, though those terms are also used more broadly.
Because instances aren’t always running, FaaS can have what’s called the cold start problem. If a function hasn’t been called for a while, a response can have an extra wait time while the instance is set up. As long as there’s continued usage, the infrastructure stays up and the latency is shorter. However, if a long enough time goes without the function being used, the instance is removed and the next call will have a cold start.
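The cold start behavior can be modeled as a warm window after each call. This is a toy simulation; the latencies and idle timeout below are made-up numbers, not measurements of any real platform.

```python
# Toy model of the FaaS cold start: an instance stays warm for a while after
# each call, and a call that misses the warm window pays a startup penalty.
# All timings here are hypothetical, chosen only to illustrate the behavior.

COLD_START_MS = 400   # extra latency to spin up a container (hypothetical)
WARM_LATENCY_MS = 20  # handler latency once the instance is warm (hypothetical)
IDLE_TIMEOUT_S = 300  # how long the platform keeps an idle instance around

class FunctionInstance:
    def __init__(self):
        self.last_invoked_at = None  # None means no instance exists yet

    def invoke(self, now: float) -> int:
        """Return the simulated latency (ms) of a call at time `now` (seconds)."""
        warm = (self.last_invoked_at is not None
                and now - self.last_invoked_at < IDLE_TIMEOUT_S)
        self.last_invoked_at = now
        return WARM_LATENCY_MS if warm else WARM_LATENCY_MS + COLD_START_MS

fn = FunctionInstance()
print(fn.invoke(0))     # 420 -> first call is cold
print(fn.invoke(10))    # 20  -> within the idle window, so warm
print(fn.invoke(1000))  # 420 -> idle too long, cold again
```

The three calls show the pattern described above: the first call and any call after a long idle gap pay the startup penalty, while calls under sustained usage stay fast.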
Functions are also stateless since their infrastructure is ephemeral. If a function needs to retrieve or store state it needs to do so elsewhere, usually in a database.
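Statelessness means a function cannot keep a counter or cache in a local variable between invocations. A minimal sketch, where a plain dict stands in for the external database a real function would call:

```python
# Sketch of a stateless function handler: because the function's container is
# ephemeral, state can't live in the function between calls - it must be read
# from and written back to external storage. The `store` dict is a stand-in
# for a database; in a real FaaS it would be a database client.

def handle_visit(store: dict, user_id: str) -> int:
    """Increment and return this user's visit count, persisting it externally."""
    count = store.get(user_id, 0) + 1
    store[user_id] = count  # write state back before the instance is torn down
    return count

db = {}  # stand-in for a real database
assert handle_visit(db, "ada") == 1
assert handle_visit(db, "ada") == 2  # the count survived because it lives in
                                     # `db`, not in the function instance
```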
The most common FaaS is probably AWS Lambda (named after lambda functions). The other major cloud vendors have comparable FaaS platforms: Google Cloud Functions and Microsoft Azure Functions.
MongoDB Atlas Functions are also a FaaS offering. Atlas Functions, however, are optimized for low-latency application requests, avoiding the cold start problem by running functions in pre-provisioned containers.
MongoDB Atlas Triggers can be used in conjunction with MongoDB Atlas Functions to execute serverless, event-based logic. Database triggers let you execute serverless function logic whenever a document is added, updated, or removed in a linked cluster, while scheduled triggers let you execute function logic on a time-based schedule.
Atlas App Services Webhooks can be used when Realm functions need to be executed via an API call from an external source. Developers can write serverless functions and call an API endpoint that executes this logic.
Where FaaS lets you run pieces of code in a serverless fashion, a Backend as a Service runs the full backend of an application. BaaS platforms provide simplified architectures for applications, usually web applications or mobile applications. The service hosts your entire backend application removing the need for an application server. Clients connect to the BaaS, and the BaaS connects to a database (or may have one built in). If the application backend needs to scale up or down, the BaaS handles it automatically.
Some Backends as a Service are fully serverless, scaling infrastructure and cost precisely with demand. Others don’t have a completely serverless approach. For example, there may still be the concept of instance size or there may be an hourly charge even if there is no usage.
Development approaches differ between Backends as a Service. Some use a FaaS approach of multiple code snippets, while others have the full backend application code uploaded together. Many recommend using built-in services where available instead of writing your own - e.g., an authentication service, a notification service, or an image recognition service.
While not a BaaS, Atlas App Services offers a set of fully managed, built-in application development services like Authentication, Triggers, Functions, and an instant GraphQL API that solve many of the same problems.
Serverless databases are relatively new. They’re an expansion of the serverless paradigm beyond the application layer where serverless architecture is more common.
With a traditional Database as a Service (DBaaS), the database is fully managed for you and deployed as a specific pre-provisioned instance size with a specific price. A serverless architecture for a cloud database means that you don’t choose an instance size at all - instead you simply set up a database and it grows as necessary with data size and throughput - and the cost scales accordingly. MongoDB Atlas allows you to deploy a serverless database via serverless instances, now available in preview. Simply choose your cloud region to get started and the database will dynamically scale to meet the demand of your workload.
Many cloud services could be thought of as having a serverless architecture even if they’re not in one of the usual categories of “serverless technology”. For example, Twilio is a popular text message service where if you want an application to send texts, all you need to do is call the API. There’s no need to build your own messaging service - and no need to scale it. Whether your application sends a few texts or a million, Twilio handles it and charges you accordingly.
There are also serverless application development platforms, like Vercel and Netlify, that offer developers everything they need to build front-end applications and use serverless functions to provide the backend. These development platforms focus on ease of use by turning serverless functions into specific REST or GraphQL API endpoints, removing the need to deploy and manage a FaaS yourself.
You should consider a serverless architecture in any of these scenarios:
In fact, these scenarios apply to most modern applications. So if you start by considering serverless options, when does it not make sense? If you’re not already on board with the cloud, or if you need to control your own infrastructure for regulatory reasons, serverless architecture isn’t a good choice.
As always, you should investigate a serverless offering carefully to confirm that it meets your requirements for latency (i.e. minimal cold start issues), security, and so on. And generally, most serverless technologies have specific patterns of development built in, so make sure you like this way of working before you commit.
Serverless computing - especially the FaaS approach - and microservices have some similarities (in some cases, one may even have a microservices architecture that uses serverless functions).
Both are basically architectures that break application code into smaller components that communicate with one another, with databases, and with other services. There are a few key differences between serverless computing and microservices:
You can think of microservices and serverless computing in terms of an evolution upwards in abstraction. A long time ago, servers were physical hardware managed on-premises, and applications were monolithic code bases.
As the cloud landscape has grown, many organizations stopped managing hardware and started managing instances instead, while monoliths are increasingly being replaced by microservices, moving the development abstraction from “application” to “service”.
Serverless computing is an evolution in both infrastructure management and software development, removing infrastructure entirely and moving to a development mindset focused on data and functions.
In a serverless architecture, what triggers a function? More often than not, it’s what we would call an event: an atomic occurrence in the application (or in the real world). For example, every time a user signs in to our application, we want to do a handful of things: log the sign-in, retrieve the user’s profile information, check if there are any new offers, etc. We might design this with serverless functions and events: the sign-in emits an event, and each of these tasks is a serverless function that responds to it.
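The sign-in example above can be sketched as a tiny event bus that fans one event out to several independent functions. The event name, handler names, and payload shape are all hypothetical, chosen only to illustrate the pattern.

```python
# Minimal sketch of event-driven serverless logic: several independent
# functions subscribe to a "user.signed_in" event, and an event bus fans
# each emitted event out to all of them. Names here are hypothetical.

from collections import defaultdict
from typing import Callable

subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def on(event_type: str):
    """Register a function as a handler for one event type."""
    def register(fn):
        subscribers[event_type].append(fn)
        return fn
    return register

def emit(event_type: str, payload: dict):
    """Fan the event out to every subscribed function."""
    for fn in subscribers[event_type]:
        fn(payload)

audit_log = []  # stand-in for the side effects the functions would perform

@on("user.signed_in")
def log_sign_in(event):
    audit_log.append(f"sign-in: {event['user_id']}")

@on("user.signed_in")
def check_new_offers(event):
    audit_log.append(f"checked offers for: {event['user_id']}")

emit("user.signed_in", {"user_id": "ada"})
print(audit_log)  # both handlers ran for the single sign-in event
```

The point of the pattern is decoupling: adding a new reaction to a sign-in (say, sending a welcome notification) means subscribing one more function, with no change to the code that emits the event.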
An event-driven architecture can be used to coordinate either microservices or serverless functions.
MongoDB offers several services that abstract away the provisioning of servers to provide the benefits of a serverless architecture, whether you’re already using serverless technologies as a part of your stack or just getting started.