Microsoft Azure, Google Cloud Push Ahead on Serverless

News from Azure and Google Cloud highlight the growing capabilities around functions as a service.

Jeffrey Burt

April 16, 2019


Major cloud providers continue to build out the services they offer customers around serverless computing workloads, giving organizations more options and tools for creating and running modern applications in the cloud without having to worry about the underlying server infrastructure.

Microsoft Azure this month announced a new plan for its Functions hosting model that includes what officials call “pre-warmed instances,” which let applications run without delay and scale quickly after sitting inactive. It also offers access to more powerful instances for serverless workloads.

Also this month, as part of its onslaught of announcements at the Google Cloud Next 2019 show, Google Cloud Platform unveiled an array of advancements around serverless computing led by Cloud Run, a serverless platform for containerized applications, as well as Cloud Run on GKE (Google Kubernetes Engine) and Knative, an open API and runtime for serverless environments.

The news from Azure and Google Cloud is part of a larger trend within the cloud space toward functions as a service (FaaS) – or serverless computing – as a more efficient way to run modern workloads in the cloud. That’s a boon for organizations, and one more area in which channel partners can help their customers as they continue their move into the cloud.

According to Paul Teich, principal analyst for Liftr Cloud Insights, “serverless is the final frontier for increasing server utilization and efficiency.”

In the public cloud, FaaS is more efficient than bare-metal servers, infrastructure as a service (IaaS) or platform as a service (PaaS), though not as efficient as software as a service (SaaS), Teich told Channel Futures.


Liftr Cloud Insights’ Paul Teich

“PaaS is more efficient than IaaS and bare metal because it takes developers less time to write, debug and validate apps on PaaS vs. IaaS,” he said. “Likewise for FaaS being more efficient than PaaS to write, debug and validate apps. SaaS is the ultimate in efficiency because someone else spent time doing all that writing, debugging and validating.”

Given that, FaaS is “theoretically … about the best an enterprise IT developer can do for both programming efficiency in developing an app and for runtime efficiencies in operationally delivering an app, but they have to start from scratch,” Teich added.

The debate over FaaS vs. PaaS vs. IaaS is less about the applications running in the environments and more about developer productivity, accelerating time to market and the operational efficiencies that come with scale, the analyst said.

Those also are many of the arguments for the public cloud in general, so all the top cloud providers are busy expanding their portfolios of services around serverless computing as well as the other as-a-service models. According to Amazon Web Services (AWS), the top benefits of serverless computing are eliminating the need to manage servers, flexible and automated scaling, paying for value rather than by server, and built-in high availability and fault tolerance.

Azure’s new Functions Premium plan delivers “a suite of long requested scaling and connectivity options without compromising on event-based scale,” Alex Karcher, program manager for Azure Functions, wrote in a blog. With the premium plan, customers get access to instances with up to four cores and 14GB of memory (up from one core and 1.5GB in the current consumption plan), as well as VNet integration and the pre-warmed instances that address the problem of cold starts, which can cause a delay when calling an application that has sat idle.

“Keeping a pool of pre-warmed instances to scale into is one of the core advantages beyond existing workarounds,” Karcher wrote, noting that in the consumption plan there’s a workaround called a pinger that constantly pings the application to keep it warm, though cold starts are still a problem as the app scales.


Microsoft’s Alex Karcher

With the pre-warmed instances, customers avoid a cold start not only on the first start, but at each point of scaling.
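The pinger workaround Karcher describes can be sketched in a few lines: a scheduled job that periodically sends an HTTP request to the function’s endpoint so the platform never reclaims its instance. The endpoint URL and the five-minute interval below are illustrative assumptions, not details from the article.

```python
import time
import urllib.request


def ping(url: str, timeout: float = 5.0) -> int:
    """Send one keep-warm request and return the HTTP status code."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status


def keep_warm(url: str, interval_seconds: int = 300) -> None:
    """Ping the function endpoint on a fixed interval (a timer trigger or
    cron job would normally do this) so it never idles long enough to go
    cold. The interval is an illustrative assumption."""
    while True:
        try:
            ping(url)
        except OSError:
            pass  # a failed ping just means the next one re-warms the app
        time.sleep(interval_seconds)
```

The limitation Karcher points to is visible in the sketch: the pinger only keeps the already-running instance warm, so any new instance added as the app scales out still starts cold, which is the gap a pool of pre-warmed instances is meant to close.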

Google Cloud’s Cloud Run is designed to address the choice organizations often have to make between serverless and containers, Eyal Manor, vice president of engineering for Google Cloud, and Oren Teich, product management director, wrote in a blog. Cloud Run “lets you run stateless HTTP-driven containers, without worrying about the infrastructure,” they wrote. “Cloud Run is a fully serverless offering: It takes care of all infrastructure management including provisioning, configuring, scaling and managing servers.”
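The “stateless HTTP-driven container” Manor and Teich describe can be any web server that listens on the port the platform injects through the PORT environment variable and keeps no state between requests. A minimal stdlib-only sketch (a real service would typically use a web framework, and the handler logic here is illustrative):

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Stateless: the response depends only on the request, so the
        # platform can add and remove instances (including scaling to
        # zero) without losing anything.
        body = f"Hello from {self.path}".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def serve() -> None:
    # Cloud Run tells the container which port to listen on via PORT.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()


if __name__ == "__main__":
    serve()
```

Packaged in a container image, a server like this is all Cloud Run needs; the provisioning, scaling and server management the blog mentions happen outside the application code.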

Along with Cloud Run on GKE and Knative, Google Cloud announced other enhancements to Cloud Functions, including support for runtimes such as Node.js 8, Python 3.7 and Go 1.11, the open-source Functions Framework for Node.js 10, serverless VPC access and scaling controls.
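In the Python 3.7 runtime mentioned above, an HTTP-triggered function is just a function that receives a request and returns a response body. The sketch below is illustrative: in the real runtime the `request` argument is a Flask request object, while this version only assumes it exposes an `args` mapping of query-string parameters.

```python
def hello_http(request):
    """HTTP-triggered function: greet the caller by the optional `name`
    query parameter. In Google's Python 3.7 runtime `request` is a Flask
    request; here it is only assumed to have an `args` mapping."""
    name = request.args.get("name", "World")
    return f"Hello, {name}!"
```

The platform handles routing, scaling and instance lifecycle, so the function body above is the entire deployable unit.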

All of this fuels further competition among the top cloud players. Liftr Cloud Insights’ Teich said there is some hype around serverless right now, though the trend is toward adoption by developers.

“If you are an early adopter or a younger developer, you’re probably already using FaaS, but you are designing your service or app to take advantage of it,” he said. “In other words, you are cloud-native. Everyone else has to refactor their existing apps yet again. It’s kind of like the ‘Men in Black’ comment: ‘I guess I’m going to have to buy the White Album again.’ Maybe you already ported an app from a VM to a container running in IaaS, and you are in the middle of porting the same app to use your favorite PaaS. Eventually you are going to think about porting it to FaaS.”
