How IBM Set the Stage for Containers and What the Future Holds
From IT Pro Today
Although computer technology isn’t as young as it once was, any way you slice it, the field is still pretty young compared with other technologies. But however young or old it might be, many of today’s latest and greatest innovations aren’t as new as they seem; they’re based on older innovations that have been repurposed, repositioned or tweaked for modern times.
Take container technology, for instance. By all appearances, containers came over the horizon around 2013, about the time Red Hat announced it would be integrating Docker with its OpenShift platform-as-a-service offering.
But as Jim Ford, founder of cloud startup Pareidolia and a 23-year veteran who served as chief architect at ADM, pointed out in the opening keynote at last week’s Container World, the genesis of what would become container technology goes back to early virtualization efforts.
“Containers really started with IBM and the LPAR on the mainframe in 1972,” he said. “That was the birth of the hypervisor and that was the birth of the idea that you could run multiple full execution environments on the same piece of hardware.”
LPAR stands for “logical partition,” and each LPAR creates what is effectively a separate mainframe with its own separate operating system — which is sort of like virtualization … which is sort of like containers.
“In 1979 our friends at Bell Labs came up with chroot, and we suddenly had the ability to run two Unixes on one machine, but it was very naïve and very trusting,” he said. “That’s really the key here; it’s about trust and naivety. That’s really the whole game that we’re playing.”
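The chroot call Ford mentions still ships with every Unix-like system today. As a rough illustration of why it was “very naïve and very trusting,” here is a minimal Python sketch (the function name `enter_jail` is our own, and the call itself requires root privileges, so this is illustrative rather than something to run casually):

```python
import os

def enter_jail(new_root: str) -> None:
    """Confine the current process's filesystem view to new_root.

    This is the 1979 chroot idea: after the call, '/' refers to
    new_root, so the process can no longer name files outside it.
    It confines only the filesystem -- processes, networking, and
    memory are untouched, and a root user inside the jail can still
    break out, which is why chroot alone is a "trusting" boundary.
    """
    os.chroot(new_root)  # re-root this process's filesystem namespace
    os.chdir("/")        # move into the new root so no open cwd leaks out
```

Modern containers layer namespaces, cgroups, and capability restrictions on top of this same re-rooting idea to close the gaps chroot leaves open.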
Ford’s keynote presentation was partly a history lesson, partly a look at the issues we face now, and partly a cautionary tale as we move boldly into the always connected future.
“In 2005, Sun came out with Solaris zones,” he said. “Strangely, those were really the first containers, and in some respects they were greater than the ones we have today because they actually contained. They didn’t have leaky holes; they didn’t have strange methods of allowing things to happen through flaws in design or flaws in delivery.”
Ford’s purpose was to offer insight into how to look ahead and anticipate the future. As he sees it, the issues we’ll face in the future will often be tied to the issues we’ve faced in the past.
“If you think about it, the guy who wrote Pac-Man and put it into that coin-op cabinet 40 years ago was living at the extreme edge of what the hardware could do,” he said. “He was pushing very hard to make this stuff run. Now you run on your iPhone. Admittedly, he was running a five megahertz processor.
“Everything we’ve ever thought was hard – streaming video, streaming audio, streaming whatever – has come to pass and become quite commonplace and quite performant. So, I want you to try to break away from the paradigm of, ‘Oh, we’re trapped today because,’ and think about, ‘Well, we’re trapped today, but let’s keep an eye out because tomorrow it may be simpler.'”
Ford’s optimism was tempered by the observation that many of the pressing issues being faced by the tech industry, especially on the security front, are embedded in …