For Hewlett Packard Enterprise to survive as a key supplier to the enterprise data center, it must evolve to emphasize multicloud computing and the rise of smaller edge locations in internet of things environments, according to experts.
Hewlett Packard Enterprise (HPE) has had a busy year. The spin-merger of the services business with Computer Sciences Corp. is complete, and a similar spin-merger of its software business with Micro Focus is expected to be complete this fall.
In the meantime, the company has purchased hyper-converged infrastructure vendor SimpliVity Corp. and Nimble Storage to broaden its portfolio of hyper-converged and composable infrastructure. At its annual IT event, HPE Discover, in Las Vegas next week, HPE is also expected to unveil its Gen10 servers, which incorporate Intel Corp.'s next-generation chips with scalable architecture, according to social media posts by HPE in advance of the event and sessions scheduled at Discover.
How all these changes will affect the future of HPE customers is uncertain -- but above all, the company needs to craft a unified message, experts said.
"I like the different parts, but [HPE] has to assemble those parts into something that is simple and stable," said Patrick Moorhead, analyst and president at Moor Insights & Strategy in Austin, Texas.
The company should also focus on the role of IT infrastructure in a post-cloud IT world, which means focusing on specific areas that won't be assimilated by public cloud providers such as Amazon Web Services (AWS) and Microsoft Azure, said Dana Gardner, an analyst at Interarbor Solutions in Gilford, N.H. For HPE, that means it must forge partnerships with public cloud providers and provide enterprises with infrastructure they need for local internet of things (IoT) and machine learning projects.
Help IT navigate the clouds, out to the edge
Like many major legacy vendors, HPE has had no public cloud offering since it shut down its Helion public cloud in 2015. But many customers don't care, and in fact have changed their expectations for the future of HPE.
Large companies already use more than one public cloud, will increasingly do so, and will need help managing those clouds alongside their on-premises infrastructure and private clouds.
"HPE has a card to play that they haven't played yet -- in a multicloud world, how can they manage that?" Moorhead said. "It is so obvious to me that they could be the company that manages the multicloud environment."
Cardinal Health Inc. in Dublin, Ohio, uses AWS, but wants to connect its hyper-converged infrastructure into a future multicloud environment to enjoy the advantages of the public cloud, such as flexibility and operational expense billing, without straining its resources and staff.
"What I want is to see how they can still manage my [on premises], but then use the public cloud without creating a whole new set of tools," said Keith Templin, systems engineer at Cardinal Health.
St. Joseph's Health Care is not yet in the public cloud and recently upgraded its storage and compute from HPE, to increase capacity at a lower cost than similar, previous purchases, said David Schned, integrated director of IT infrastructure, at the London, Ont., hospital.
"We don't need cutting-edge stuff; what we need is functionality and efficiency," he said. "I want stable infrastructure."
Schned said he will attend HPE Discover next week to research security technologies and endpoint integration.
Also next week, Gardner and Moorhead expect to learn about the future of HPE as it relates to artificial intelligence and machine learning, plus its support for IoT projects at the edge.
"I want to see them tell a holistic IoT story," Moorhead said.
They also expect to hear how the company will provide customers with infrastructure to support other needs, from hyper-converged to composable infrastructure.
Research becomes real -- when?
HPE's long-awaited research project, code-named The Machine, should be on display for its third year in a row. So far, it has resulted in a 160 TB operating prototype of what HPE calls memory-driven computing, which shifts away from a decades-old, processor-centric architecture to one that is memory-centered.
The prototype is real and not vaporware, but comparative benchmarks to x86 architecture have not yet been well documented, said Richard Fichera, an analyst at Forrester.
"It is not fairy dust like the first version of The Machine they talked about," he said. "But this is still not a machine for your standard IT organization for a while."
Its first use likely will be to wrangle big data and applications such as Spark, he said, with financial services and research organizations among the first to want it. What remains unclear, however, is when the prototype will evolve into something that can be used in production.
Robert Gates covers data centers, data center strategies, server technologies, converged and hyper-converged infrastructure and open source operating systems for SearchDataCenter. Follow him on Twitter @RBGatesTT or email him at email@example.com.