If you want to understand how enterprise IT workloads are being redefined, the on-demand computing trend is a good place to look. Here's how on-demand computing is reshaping the way companies approach IT resources.
On-demand computing means that organizations consume compute, storage or other resources only when they need them. In the on-demand model, resources are made available the moment an application requests them, and users don't pay for resources they aren't using.
That's much more efficient than conventional computing. Traditionally, if you set up a server -- whether on-premises or in the cloud -- you had to pay to run and maintain that server twenty-four hours a day, seven days a week. Yet in most cases, the server was not in active use 100 percent of the time, because the apps or data it hosted were not in constant demand.
With on-demand computing, in contrast, you can pay only for what you need, when you need it. You get a drastic reduction in overhead.
On-demand computing solutions have been around for years. A service called Zimki began offering what it called "utility computing" in 2006 -- though it lasted barely a year before shuttering.
The introduction in 2014 of AWS Lambda, the first on-demand service from a major tech company, made on-demand computing more popular. Lambda allows programmers to write "serverless" functions that are executed on demand in the AWS cloud. Users pay only for the compute time their functions actually consume.
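To make the model concrete, here is a minimal sketch of a Lambda-style handler. The event payload and the local invocation at the bottom are illustrative assumptions -- in AWS, the platform itself calls the handler in response to an event (an HTTP request, a file upload, a queue message), and you are billed only while it runs.

```python
# A minimal serverless-style handler: the function exists as code only,
# with no server running in between invocations.

def handler(event, context):
    # The platform passes the triggering event as a dict; here we
    # assume an illustrative payload like {"name": "on-demand"}.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Invoked locally for illustration only -- in production, the cloud
# provider dispatches events to handler() and tears the runtime down
# when the work is done.
if __name__ == "__main__":
    print(handler({"name": "on-demand"}, None))
```

The key design point is that the function holds no state between calls, which is what lets the provider spin it up on demand and charge for nothing in between.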
More recently, Microsoft Azure, IBM Bluemix and other major public cloud providers have introduced serverless compute offerings of their own. You can also now set up your own on-demand framework using open source tools like Fission.io. These days, on-demand computing has become broadly available.
How On-Demand Computing is Changing the IT World
At first, on-demand services like Lambda were used mostly by programmers who needed to run resource-intensive operations, such as resizing an image, quickly. But as the on-demand trend has grown and serverless computing has converged with other technologies, companies are beginning to treat on-demand computing as the norm. Paying for overhead they don't use is no longer acceptable.
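The cost argument is easy to see with some back-of-the-envelope arithmetic. The sketch below compares an always-on server with a function billed per invocation; all of the rates are illustrative assumptions, not real provider prices.

```python
# Hypothetical monthly cost comparison: an always-on server vs. an
# on-demand function billed per invocation and per millisecond of
# runtime. All rates below are assumed for illustration.

HOURS_PER_MONTH = 730
SERVER_HOURLY_RATE = 0.10          # assumed always-on server price, $/hour
PER_INVOCATION_COST = 0.0000002    # assumed charge per function call
PER_MS_COST = 0.0000000167         # assumed charge per ms of runtime

def always_on_cost():
    # The server bills around the clock, used or not.
    return HOURS_PER_MONTH * SERVER_HOURLY_RATE

def on_demand_cost(invocations, avg_ms):
    # The function bills only for the work actually done.
    return invocations * (PER_INVOCATION_COST + avg_ms * PER_MS_COST)

# A bursty workload: one million 200 ms tasks per month.
print(f"always-on: ${always_on_cost():.2f}")
print(f"on-demand: ${on_demand_cost(1_000_000, 200):.2f}")
```

Under these assumed rates, the always-on server costs about $73 a month while the on-demand version of the same workload costs a few dollars; the gap is exactly the idle time the server spends waiting for work.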
After all, Docker containers have become massively popular in a few short years because they help organizations achieve the same goal as on-demand services: minimizing overhead while maximizing cost-efficiency and performance. Docker containers let you build scalable, agile infrastructure for hosting apps without the overhead that comes with virtual machines -- just as on-demand services let you run resource-heavy operations quickly while paying only for the resources it takes to run them, not a penny more.
In other words, on-demand computing has become part of a larger trend that is reshaping the entire approach companies take to organizing their IT resources. On-demand computing is about much more than just serverless functions on AWS Lambda or Azure.
On-Demand Computing and the On-Demand Economy
It's worth noting, too, the parallels between the on-demand computing phenomenon and the on-demand economy (as well as the closely related "sharing" economy).
In the on-demand economy, you request services like a car ride or house cleaning when you need them, and receive them virtually instantaneously. You pay only for the services you actually receive. There is no overhead from ongoing membership fees or from maintaining dedicated employees to provide a given on-demand service to you.
I don't think the programmers making use of AWS Lambda or other serverless computing services have been embracing them while consciously thinking about how they mirror on-demand economic solutions. But there is clearly a powerful trend here that extends beyond the world of computing itself.
The future -- of your server as well as the way you work -- will be on-demand.