May 23, 2017
Hewlett Packard Enterprise (HPE) has unveiled the first full-scale version of the mega computer it calls The Machine. It runs on a new concept of “memory-driven computing” that has the potential to be a game changer for IT.
In place of hard drives, The Machine holds data on arrays of memristors, which provide 160 terabytes of RAM. The memory-driven computing concept allows data to be stored in memory rather than on disk, and HPE says it can drastically reduce processing times, allowing users to derive real-time insights from truly massive, Mars-mission-scale data loads.
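The difference between in-memory and on-disk access can be illustrated with a toy sketch in ordinary Python (this is a hypothetical illustration of the general concept, not HPE's technology): a disk-backed lookup pays for file I/O and parsing on every access, while an in-memory lookup is a direct hit on a data structure already in RAM.

```python
import json
import os
import tempfile

# Toy dataset: a few "sensor" records keyed by id.
records = {i: {"id": i, "reading": i * 0.5} for i in range(1000)}

# Disk-backed store: the data lives in a file.
path = os.path.join(tempfile.mkdtemp(), "records.json")
with open(path, "w") as f:
    json.dump({str(k): v for k, v in records.items()}, f)

def lookup_on_disk(key):
    # Every access re-reads and re-parses the file from disk.
    with open(path) as f:
        data = json.load(f)
    return data[str(key)]["reading"]

def lookup_in_memory(key):
    # The same record is a direct dictionary hit in RAM.
    return records[key]["reading"]

# Both paths return the same answer; only the cost differs.
assert lookup_on_disk(42) == lookup_in_memory(42) == 21.0
```

The gap widens as data grows: the on-disk path scales with file size, while the in-memory path stays a constant-time lookup, which is the effect memory-driven computing aims to deliver at datacenter scale.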
HPE Chief Architect Kirk Bresniker says that memory-driven computing allows for a continued increase in processing power, which was reaching its maximum using transistors. “If you’re familiar with Moore’s Law, you know that up until now we could count on chips to get better year after year,” he said, “but that era is over.”
While HPE’s marketing team emphasized The Machine’s ability to run a NASA space mission, the company’s CEO Meg Whitman pointed out that the new architecture’s real impact is in real-world data processing applications.
“The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day,” Whitman said in a statement. “To realize this promise, we can’t rely on the technologies of the past. We need a computer built for the Big Data era.”
In the channel, there’s rarely a need to crunch the amount of data that NASA holds. And though some in the industry are skeptical that technology like The Machine will reach the market anytime soon, it signals a potential seismic shift away from on-disk storage, which is a big deal for partners.
The Machine’s memory uses laser-light connections to form a network from many CPUs and other processors, such as graphics chips. HPE likens it to the human brain, where information is stored and computed throughout instead of in one central processor. Bresniker says that by eliminating the need to copy data and move it through a series of interfaces, the architecture can move data more than 1,000 times faster.
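A loose analogy for this shared-pool design can be sketched with Python's standard-library shared memory: two independent handles address the same pool of bytes, so a write by one "processor" is immediately visible to the other with no copy step. This is only an illustration of the shared-memory idea, not HPE's photonic fabric.

```python
from multiprocessing import shared_memory

def shared_pool_demo():
    # One memory pool that multiple handles attach to by name,
    # standing in for many processors addressing one memory fabric.
    pool = shared_memory.SharedMemory(create=True, size=8)
    try:
        view_a = shared_memory.SharedMemory(name=pool.name)  # "processor A"
        view_b = shared_memory.SharedMemory(name=pool.name)  # "processor B"

        view_a.buf[:5] = b"hello"        # A writes into the pool...
        seen = bytes(view_b.buf[:5])     # ...B reads it without any copy step

        view_a.close()
        view_b.close()
        return seen
    finally:
        pool.close()
        pool.unlink()

assert shared_pool_demo() == b"hello"
```

In a conventional architecture, moving that data between processors would mean serializing it and copying it through storage or network interfaces; in the shared pool, both parties simply address the same bytes.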
The Machine’s 160 terabytes of shared memory is just the start. Bresniker says the architecture is on track to scale to the exabyte level, a million terabytes of memory, within another five years.