Most cataclysms are signaled long before the actual event by rumblings in the ground. Such is the current case with Docker containers, a lighter-weight approach to building and deploying applications that has taken the application development community by storm.
For the most part, usage of Docker containers is still confined to application development and testing environments. But it's only a matter of time before solution providers begin to see larger numbers of Docker containers deployed in production environments.
At its core, a Docker container wraps a piece of software in a file system that contains everything it needs to run—including code, runtime, system tools and system libraries—on any operating system, virtual machine or platform-as-a-service (PaaS) environment. To better explain exactly how Docker containers are built and then deployed, Shawn Powers, in collaboration with CBT Nuggets, a provider of IT certification training, has created a video series that explains how Docker containers are used to drive what is now commonly referred to as a microservices architecture. Given how few people truly understand what Docker containers are, Powers said there's a clear need for some fundamental education on how Docker containers work and what's involved in building a Docker application.
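That "everything it needs to run" is declared in a Dockerfile. As a minimal sketch (the application file, port and dependency names here are illustrative, not drawn from the article), packaging a small Python service might look like this:

```dockerfile
# Start from a slim base image that supplies the OS layer and runtime.
FROM python:3.12-slim
WORKDIR /app

# Bake the application's library dependencies into the image.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Add the application code itself.
COPY app.py .

# Document the port the service listens on and define the startup command.
EXPOSE 8000
CMD ["python", "app.py"]
```

Built with `docker build -t myservice .` and started with `docker run -d -p 8000:8000 myservice`, the resulting image carries its code, runtime and libraries with it, which is what lets the same container run unchanged on bare metal, a virtual machine or a PaaS.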
As is often the case with most major innovations, there's a lot of debate concerning where Docker containers actually will run. Most of the early adopters of Docker have leaned toward deploying them on bare metal servers as an alternative to virtual machines. Others argue that, because of security and manageability concerns, it makes more sense in production environments to deploy Docker containers on top of virtual machines. A third camp is making a case for deploying Docker containers on a PaaS environment that has all the tooling required to deploy and manage those containers.
Where things get really interesting from a solution provider perspective is the level of IT infrastructure density that can be achieved by running Docker containers on a bare metal server. Where once a server might have hosted 10 to 25 virtual machines on average, it could now easily run more than 100 containers, all of which will be contending for memory, storage I/O and network bandwidth. IT organizations may wind up needing less IT infrastructure overall thanks to Docker containers, but it's also safe to say that much of what they have in place today won't be well suited to container workloads. As such, solution providers should expect to see Docker container sprawl driving a wave of IT infrastructure upgrades in the not-too-distant future.
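Because 100-plus containers on one host contend for the same memory and I/O, Docker lets each container be capped at launch. A minimal sketch using standard `docker run` flags (the image name and limit values are illustrative, not from the article):

```shell
# Cap this container at half a CPU core and 256 MB of RAM,
# and lower its relative block-I/O priority (the default weight is 500).
docker run -d --name web1 \
  --cpus=0.5 \
  --memory=256m \
  --blkio-weight=300 \
  nginx
```

Limits like these are what keep a densely packed bare metal server from letting one runaway container starve the other hundred.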
Of course, most IT operations teams don’t really appreciate just yet what all this rumbling under their feet means either. In terms of disruptive IT forces, Docker containers may be the most disruptive thing to come along in a decade or more. The opportunity for solution providers is to be the trusted source that not only identifies the size and scope of that disruption, but also explains just how soon its impact will actually be felt.