The far-off promises of future computing may arrive in hardware that IT pros can buy today.
The Machine will serve as a waterfall of sorts for Hewlett Packard Enterprise (HPE) over the next few years, slowly cascading technology advancements into hardware that can go into today's data centers, long before the supercomputer itself -- touted by HPE as a way to rearchitect servers, with memristors and photonics at its core -- comes to market.
"I hope HPE will be able to put some of this technology in lesser-generation servers," said Martin Cativa Gronda, vice president of enterprise architecture at FirstBank Puerto Rico in San Juan, Puerto Rico.
After hearing details about The Machine at a booth on the show floor at the HPE Discover event in Las Vegas, he said he hopes to see some of the benefits in the near future, and expects the most significant ones to come from photonics.
HPE views The Machine as similar to a concept car, a vision of pulling together forward-looking technologies, such as photonics, memristors and persistent memory, said Paul Miller, vice president of marketing for HPE converged data center infrastructure. Such concepts often find their way to market individually as tie-ins to existing products, rather than the complete envisioned package.
"They take the elements of the concept car that are really cool and build it into your current products," he said. "You'll have the turbocharged engine, with the cuts and lines of a concept car in existing products."
For example, HPE Synergy is preplumbed for photonics when the technology and switches become available, he said -- the copper switches can be removed and replaced with photon-enabled switches.
Other elements of research and development that have gone into The Machine project will show up in Apollo and Superdome servers as well, he said.
CEO Meg Whitman stands behind the original promise to deliver a prototype this year, even after Martin Fink, HPE's CTO and director of HP Labs, said he will retire at the end of this year. Fink, who has led development of The Machine since its vision was first unveiled in 2014, took a much lower profile at this year's HPE Discover customer and partner event compared to recent years, speaking about The Machine only via a prerecorded video.
The Machine should help IT pros better keep up with the growth of data, as well as increased speed and density, said Craig Carlson, associate director of IT systems at Texas Christian University (TCU) in Fort Worth, Texas.
"To me, The Machine is like going from the horse and buggy to the automobile," Carlson said.
The best use for The Machine will be to help increase how quickly organizations can use data, he said, noting TCU's data grows by 10% annually.
Expectations for The Machine have shifted since 2014. Last year, Fink said it would initially be delivered using DRAM, not memristors. This year, what was to be a new Linux-based operating system developed by HPE was opened up to the open source community via GitHub.
Current operating systems are all based on the x86 processor, and HPE needs a new operating system for something such as The Machine, Carlson said.
"If I was Microsoft, I wouldn't be happy," he said. "But to put a saddle on a Model T won't get you there -- you have to have a motor in there."
It remains to be seen which applications would migrate to The Machine first, but they will likely involve large amounts of data, such as SAP HANA, suggested Scott Spevacek, senior Unix systems administrator at Telephone and Data Systems Inc., headquartered in Chicago.
Like Gronda, Spevacek is most excited about the possible advancements brought about by photonics, which he called a defining factor for The Machine.
From chassis to chassis, IT pros can add individual pieces of compute or RAM without a CPU embedded with them, which would provide more flexibility, he said. For example, his VMware virtual desktop infrastructure environment is always memory-bound; to get more memory, he has to add RAM along with more CPUs, which he doesn't need. "Hopefully, this will solve it," he said.
Eventually, The Machine may change the way data centers work by putting everything in memory and eliminating hard disks, said Tomas Hernandez, CEO at CP Corp., an integrator in Caguas, Puerto Rico.
"The revolution will be around the huge amounts of data that have to be handled and the physical limitations we have today with storage preventing us from doing more things," he said.
Robert Gates covers data centers, data center strategies, server technologies, converged and hyper-converged infrastructure and open source operating systems for SearchDataCenter. Follow him on Twitter @RBGatesTT or email him at firstname.lastname@example.org.