From distributed computing to on-demand services and modular software environments, certain long-term trends are shaping the way we compute today and will compute tomorrow. Here's a look.
These trends are broad patterns that have existed in the world of computing for decades. They have gradually manifested in recent years to shape today's technology landscape, and it's a safe bet that they will continue to influence mainstream technology practices going forward.
The trends include distributed computing, on-demand services, modular software, scalable infrastructure and software-defined everything.
In the earliest days of computers, before the predecessors of the Internet existed, computing happened locally. Distributed environments did not exist because there was no way to connect different nodes together.
We live in a very different world now. Thanks to reliable, high-bandwidth networks, compute, storage and other resources are routinely shared over long distances. Increasingly, desktop computers, phones and other end-user devices are merely thin clients that connect to environments hosted in the cloud in order to do anything useful. This is a distributed computing architecture.
The server side is highly distributed, too. Applications and storage are often hosted in environments that are spread across many different physical machines. This is part of the reason why containers, for example, have become so popular.
There's little doubt that computing will become yet more distributed going forward as networks grow even faster and technologies for building distributed environments become more sophisticated.
In the old days, you had two choices. One was to keep your desktops and servers running 24/7 so that applications would be instantly available whenever you needed them. The downside was that you paid to power and maintain machines constantly, even when they were used only occasionally.
The other option was to shut machines down when they weren't in use. That simplified management and saved money, but it meant that your applications were not instantly available when you needed them.
Today, on-demand computing services, like AWS Lambda, have finally provided a solution to this old, inefficient way of doing things. Rather than having to choose between a heavy management burden and instant access to computing resources, users can now run code instantly without having to pay for or maintain environments when they're not necessary.
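To make the idea concrete, here is a minimal sketch of what an on-demand function looks like in Python. The `(event, context)` signature is AWS Lambda's standard Python handler interface; the greeting logic itself is a hypothetical example, and packaging/deployment details are omitted.

```python
# Minimal AWS Lambda-style handler sketch. The (event, context)
# signature is Lambda's standard Python interface; the greeting
# payload is a hypothetical example.
import json

def lambda_handler(event, context):
    # Lambda passes the triggering payload in `event`; runtime
    # metadata (request ID, time remaining) arrives in `context`.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

There is no server to keep running: the platform spins up an execution environment when a request arrives and bills only for the time the function actually runs.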
On-demand computing is consistent with a trend toward other types of on-demand services. Expect on-demand to be the way of the future in the computing world and beyond as consumers look for more efficient ways of accessing services.
Software applications were once unwieldy and difficult to move to new environments because they were highly monolithic.
This began to change with the advent of object-oriented programming. Object-oriented source code is divided into modular chunks, making it easier for programmers to modify one part of an application without affecting other parts.
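A small, hypothetical Python sketch illustrates the point: when each class encapsulates one concern, one part of the program can change without disturbing the others. The class and method names here are invented for illustration.

```python
# Hypothetical example: each class owns one concern, so the storage
# backend can be swapped without touching the reporting logic.
class CsvStore:
    """Persists records as comma-separated lines."""
    def __init__(self):
        self.lines = []

    def save(self, record):
        self.lines.append(",".join(str(v) for v in record.values()))

class ReportGenerator:
    """Summarizes records; knows nothing about how they are stored."""
    def summarize(self, records):
        return f"{len(records)} record(s) processed"

# Replacing CsvStore with, say, a JsonStore would require no change
# to ReportGenerator -- that isolation is the modularity payoff.
```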
Then, with the advent of the Service-Oriented Architecture (SOA) trend in the 2000s, not just source code but running applications were broken into distinct pieces, too. When an application was composed of separate services, each one was easier to update and secure independently.
The current containers and microservices craze brings the modularity trend to the next level. Today, programmers break applications into many small services. They can use technologies like containers to deploy each service in a flexible, scalable way.
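As a rough sketch of what one such small service looks like, here is a self-contained Python example using only the standard library's `http.server`. The endpoints and inventory data are hypothetical, and the routing logic is kept in a plain function so it stays easy to test, independent of the HTTP plumbing.

```python
# Minimal single-purpose service sketch using only the standard
# library. The /health and /inventory endpoints are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

INVENTORY = {"widgets": 42}  # stand-in for a real data store

def handle_path(path):
    """Pure routing logic: returns (status_code, response_body)."""
    if path == "/health":
        return 200, {"status": "ok"}
    if path == "/inventory":
        return 200, INVENTORY
    return 404, {"error": "not found"}

class InventoryService(BaseHTTPRequestHandler):
    def do_GET(self):
        status, body = handle_path(self.path)
        payload = json.dumps(body).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To run: HTTPServer(("", 8080), InventoryService).serve_forever()
```

Each such service can then be packaged in its own container image and deployed, scaled and updated independently of its neighbors.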
Modularity is the foundation for major agility advantages: teams can develop, test and deploy each piece independently, without coordinating releases of the entire application.
Another reason containers are popular is that they support highly scalable infrastructure.
Containers are not the only technology that allows applications to scale. They're just the latest innovation in the long march toward building environments and applications that are incredibly scalable.
Before containers, there were virtual machines. Virtual machines were easier to scale than bare-metal servers because new instances could be spun up in minutes, without provisioning physical hardware.
Going forward, expect programmers to develop even better solutions for scaling up -- as well as for scaling down by reducing the footprint of an application when demand decreases, which is crucial for cost-efficient software deployments.
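The scale-up/scale-down decision itself can be sketched in a few lines. The following Python function is a hypothetical, simplified version of the kind of rule an autoscaler might apply; the parameter names and thresholds are invented for illustration, not taken from any particular product.

```python
# Hypothetical autoscaling rule: choose a replica count from demand,
# scaling up under load and back down when demand drops, within
# configured bounds.
import math

def desired_replicas(total_requests_per_sec, capacity_per_replica,
                     min_replicas=1, max_replicas=10):
    """Return enough replicas that each stays within its capacity."""
    ideal = math.ceil(total_requests_per_sec / capacity_per_replica)
    # Clamp so we never scale to zero or beyond the budgeted maximum.
    return max(min_replicas, min(max_replicas, ideal))
```

The scale-down path (clamping toward `min_replicas` as demand falls) is what keeps deployments cost-efficient: idle capacity is released rather than billed around the clock.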
Once upon a time, your applications ran on actual hardware.
Then along came virtual machines. They allowed applications to run in emulated environments that were more agile than those that ran directly on hardware.
Today, however, it's not just servers that are virtualized. Everything has become software-defined, from storage to networking. Software-defined computing is the key to the future, in which infrastructure of every kind is provisioned, configured and managed through code rather than physical hardware.