5 Reasons Not to Use Serverless Computing

Despite its popularity, serverless computing has downsides and is not always the best approach for every part of every workload.

Christopher Tozzi, Technology analyst

December 8, 2023


There's a lot to love about serverless computing. It's scalable. It's cost-efficient. It minimizes effort required from engineers to set up and deploy software.

But serverless comes with some distinct downsides, too — and they're easy to overlook during discussions of all of the benefits that serverless has the potential to offer.

With that reality in mind, let's take a sober look at the reasons why serverless computing isn't always the right way to deploy software. As you'll learn below, although serverless functions certainly have their benefits, understanding their limitations is critical for making informed decisions about whether serverless is actually the right way to go.

What Is Serverless Computing?

Serverless computing is an approach to application deployment where engineers can run apps on-demand without having to configure or manage host servers themselves. It's called "serverless" because from the user's perspective, there are no servers to deal with; instead, the servers are provisioned and managed by the serverless computing provider.
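To make the model concrete, here is a minimal sketch of what "just upload your code" looks like in practice, following AWS Lambda's Python handler convention (the event payload shown is a hypothetical example):

```python
import json

def handler(event, context):
    """Entry point the serverless platform invokes on each trigger.

    `event` carries the trigger payload (e.g., an HTTP request body);
    `context` carries runtime metadata supplied by the platform.
    There is no server to provision -- you write only this function.
    """
    name = event.get("name", "world")  # hypothetical payload field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The platform handles everything outside this function: provisioning the host, routing the trigger to the code, and tearing the environment down when it is idle.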

Since Amazon released its Lambda serverless computing service in 2014, more and more businesses have turned to serverless to meet their application deployment needs. Today, all of the major cloud providers offer a serverless computing solution, and it's also possible to configure serverless environments on on-premises servers or in private data centers.


As of 2023, serverless computing adoption rates were as high as 70%, at least for organizations using AWS. (Serverless usage on other clouds is lower.)

The widespread adoption of serverless computing reflects the ability of serverless architectures to deliver benefits such as the following:

  • Simplified workload deployment: Engineers can simply upload the code they want to run and configure when it should run, without having to manage the host server environment.

  • Scalability: In most cases, there is no practical limit on the number of serverless functions you can deploy at once, which means serverless can scale up and down easily based on workload requirements.

  • Cost-efficiency: Typically, you only pay for serverless functions when they are actually running. This makes serverless a cost-effective model compared with running workloads on VMs and paying continuously to operate them, even if the workloads are not continuously active.

The Disadvantages of Serverless Computing

The benefits that serverless computing stands to offer are counterbalanced in some cases by the downsides of the serverless approach.


Higher costs

As noted above, serverless computing is cost-efficient in the sense that you only pay for the time your workloads are active. However, the per-minute cost of serverless is almost always higher than the cost of running an equivalent workload on a VM.

For this reason, serverless may result in greater total costs than other types of cloud services, especially for workloads that are active most of the time.
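A rough break-even calculation shows why. The sketch below uses hypothetical placeholder prices (not actual cloud list prices) to compare an always-on VM against per-invocation serverless billing:

```python
# Illustrative break-even sketch. All prices below are hypothetical
# placeholders, not actual cloud provider list prices.
VM_COST_PER_HOUR = 0.05                     # always-on VM (assumed)
SERVERLESS_COST_PER_GB_SECOND = 0.0000167   # metered rate (assumed)
MEMORY_GB = 0.5                             # memory allocated per function
AVG_DURATION_S = 0.2                        # average seconds per invocation

def monthly_vm_cost(hours: float = 730) -> float:
    """A VM bills for every hour it runs, active or idle."""
    return VM_COST_PER_HOUR * hours

def monthly_serverless_cost(invocations: int) -> float:
    """Serverless bills only for compute actually consumed."""
    gb_seconds = invocations * AVG_DURATION_S * MEMORY_GB
    return gb_seconds * SERVERLESS_COST_PER_GB_SECOND
```

Under these assumed rates, a mostly idle workload (say, a million invocations a month) is far cheaper on serverless, but a busy one (a hundred million invocations) costs several times the always-on VM. The crossover point, not the billing model alone, determines which is cheaper.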

Lock-in risks

Each serverless computing platform works in a different way. That makes it challenging to migrate workloads from one serverless environment (like AWS Lambda) to another (such as Azure Functions).

By comparison, the differences between other types of cloud services (such as AWS EC2 and Azure Virtual Machines) are less pronounced, leading to lower levels of lock-in when you use those services.

Slow startup time

Although serverless workloads theoretically run on-demand, in practice there is typically a delay between when serverless code is triggered and when it actually runs. This is especially true in the case of "cold starts," which happen when serverless code hasn't run recently. Sometimes, the delays in startup time can be a second or longer.

Granted, a second's delay may not be a big deal for many workloads, such as generating content for a web page. But if you need real-time performance — as you might if, for example, your workload is processing streaming payment data — serverless functions may not work well. You'd be better served by a traditional hosting model where your workload is continuously and instantly available.
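The mechanics behind a cold start can be sketched in a few lines. In this hypothetical illustration, work at module scope runs once, when the platform loads the function into a fresh container; warm invocations skip it and reuse the result:

```python
import time

# Hypothetical illustration of cold vs. warm starts. Module-scope work
# runs once per fresh container (the "cold start"); every warm
# invocation afterward reuses its results.
_t0 = time.perf_counter()
time.sleep(0.5)  # stand-in for importing heavy libraries, opening connections
CONFIG = {"db": "connected"}  # state that survives across warm invocations
_INIT_SECONDS = time.perf_counter() - _t0

def handler(event, context):
    """Per-invocation work only; the expensive setup above is amortized."""
    t0 = time.perf_counter()
    body = {"config": CONFIG, "echo": event}
    return {
        "init_s": round(_INIT_SECONDS, 3),       # paid once, on cold start
        "handler_s": time.perf_counter() - t0,   # paid on every invocation
        "body": body,
    }
```

The first request to a fresh container pays for everything above the handler; that is the delay a latency-sensitive workload cannot tolerate.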

Limited control

The tradeoff for not having to manage server environments for serverless workloads is that you get little control over how those environments are configured. Apart from being able to configure the conditions that trigger serverless functions to execute, you have to settle for whichever operating system and runtime settings your serverless provider chooses to support.

Management overhead

In many cases, you wouldn't deploy a complete workload using serverless. You'd deploy only specific functions or microservices that benefit from the serverless architecture, while hosting other parts of your workload on a VM.

This approach allows you to balance the costs and benefits of serverless more efficiently. But it also means that you end up having to manage and orchestrate multiple types of cloud services for the same workload — so you end up with greater management complexity.

Conclusion: Serverless Is Great, but Only Sometimes

To be clear, I'm not here to tell you that you should never use serverless computing services. But I am telling you to be strategic about when and how you take advantage of serverless. Despite the hype surrounding serverless over the past several years, serverless computing can have real downsides, and it's not the right approach for every part of every workload.


About the Author

Christopher Tozzi

Technology analyst, Fixate.IO

Christopher Tozzi is a technology analyst with subject matter expertise in cloud computing, application development, open source software, virtualization, containers and more. He also lectures at a major university in the Albany, New York, area. His book, “For Fun and Profit: A History of the Free and Open Source Software Revolution,” was published by MIT Press.
