I have something of a checkered past with HPE, the enterprise business that was created when Hewlett-Packard split itself into two units a year or so ago. I first engaged with the business formerly known as HP half a dozen years ago. Back then the company was very much still in the business of selling hardware -- be it servers, printers or laptops -- and had not yet focused on its transformation into a business that could compete in the modern world.
I spent a few years in the HP orbit, attending a number of its global user conferences and generally calling things as I saw them. Unfortunately, I didn't see things in a particularly good light -- HP introduced (and reintroduced a handful of times) a public cloud service that, we were told with a straight face no less, would be competitive with Amazon Web Services (AWS), the king of the public cloud.
Eventually, and after much angst, HPE dropped its public cloud initiative altogether. Then there was the debacle that was the Autonomy acquisition, as well as the acquisition of Eucalyptus and the appointment of its CEO, Marten Mickos, to a critical cloud role, only for him to be pushed aside soon after. It all seemed a mess to me. And in my own inimitable (and not overly diplomatic) way, I told HP exactly what I thought.
Which, unfortunately, resulted in me spending a few years out in the wilderness with pretty much no contact with the company. And then something happened. A couple of years ago, HP hired Microsoft veteran Bill Hilf to run much of its cloud division. In turn, Hilf brought over a large number of experienced and effective Microsoft executives and went about rebuilding the cloud business. He made acquisitions, in particular that of Stackato, a Cloud Foundry-based Platform as a Service (PaaS). With a renewed purpose, HPE re-engaged with a number of the more "problematic" analysts and influencers in an effort to create dialog and better explain its strategy. [Disclosure: I was, by way of a previous acquisition by the parent company of Stackato, an adviser to the company.]
So it was always going to be interesting for me, after a multi-year hiatus, to attend HPE's global customer and partner conference in Las Vegas recently. [Disclosure: HPE paid for most of my travel and expenses to attend the event.]
I attended as part of HPE's influencer summit, which in itself is indicative. In contrast to some more traditional vendors that keep a very strong separation between analyst relations and the less formal blogger/influencer ranks, HPE lumped us all into the same program. From my perspective, it was a good move, and it was interesting to note that the most vocal and thoughtful attendees at the summit seemed to come not from the traditional analyst ranks, but from the independents. I get the impression that the traditional analysts were, at least to an extent, a little miffed to be lumped in with the riffraff, but from my perspective, it was the right move for HPE, which is in a battle to win hearts and minds.
Before giving my thoughts on where HPE is in its journey, it is worth reflecting on some recent news, and also the announcements at the show.
Splitting off services
A few weeks ago HPE announced that it was entering into a complicated transaction that would see its services business (itself the result of the acquisition of EDS a few years ago) split off and merged with CSC to create a standalone global services play. In a small influencer Q&A at the event, Meg Whitman, HPE's CEO, explained how this would work and made a fairly pointed comment contrasting HPE's strategy of divesting non-core parts of the business to create a leaner, more agile company with that of Dell, which is, of course, in the middle of its monster acquisition of EMC. Whitman strongly feels that lean and mean is the right strategy for the modern age and made no bones about the fact that she is dubious about the synergies that will be realized by the Dell/EMC deal.
In a follow-up session with the HPE executive who will actually be the number two at the new organization, we dived into the synergies the companies believe exist in the merger -- cost savings, greater scale, different areas of focus -- and it all makes sense.
Partnerships aplenty
The Discover conference was taken up with a number of partnership announcements. We had Drew Houston, CEO and founder of file-sharing service Dropbox, on stage telling the world that his company's recent high-profile move away from AWS and onto its own private cloud was an HPE deal. Details were a little sketchy and it's hard to know exactly how the deal worked, but the fact that Houston was on stage was interesting.
According to some publications, Dropbox wrote a check for HPE ProLiant and Cloudline servers in a deal financed by HPE Financial Services. Apparently the servers were modified to suit Dropbox's requirements.
At the same time Whitman announced that HPE is using Dropbox internally -- also an interesting announcement from a company that is so vehemently "enterprise focused" (Dropbox is, after all, struggling to prove to the world that it has real enterprise chops).
Also on stage was Ben Golub, CEO of container company Docker. Whitman announced that HPE is going "all in" on Docker containers and that it plans to ship Docker Engine preinstalled on all of its servers, as well as to provide Docker support. In a cringeworthy moment, which few people picked up on, Whitman and Golub discussed how much easier it is to hire millennials once a company embraces the concept of containers. It was a rare flashback to the HPE of a few years ago, trying to appear like one of the cool kids when it most definitely is not (but more on that later).
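To be fair, "Docker Engine preinstalled" does mean something concrete for customers: a server arrives ready to run containers the moment it is racked. As a rough illustration (and only that -- nothing here is specific to HPE hardware or its Docker support offering), this is what driving a preinstalled Docker Engine looks like from Python using the Docker SDK:

```python
# Minimal sketch: talk to an already-running Docker Engine from Python.
# Assumes the Docker SDK for Python ("pip install docker") and a local
# Docker daemon; not specific to HPE servers.
import docker

client = docker.from_env()  # connect to the local Docker Engine

# Run a throwaway container and capture its output
output = client.containers.run("alpine:latest", ["echo", "hello from a container"])
print(output.decode().strip())

# List whatever is currently running on the box
for container in client.containers.list():
    print(container.name, container.image.tags)
```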
Finally, HPE announced a partnership with everyone's favorite Internet of Things (IoT) player, GE. Under the partnership, HPE will integrate its own IoT technologies with GE's Predix platform. The partnership will take advantage of HPE's Edgeline servers, hardware that, as the name implies, is designed to work in a decentralized environment. Edgeline integrates compute, storage, data capture, control, and systems and device management. In addition, HPE announced that the HPE Vertica Analytics Platform runs on the Edgeline EL4000, delivering historical and predictive insights from in-database machine learning algorithms across a broad range of IoT analytics use cases.
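To make the Vertica-on-Edgeline pitch a little more concrete: the idea is to run the analysis inside the database sitting next to the sensor data, rather than hauling raw telemetry back to a central site, and Vertica's in-database machine learning functions are invoked through SQL in the same way. Here is a rough sketch of that pattern from Python using the vertica-python client -- the host, table and column names are my own illustrative assumptions, not details from the announcement:

```python
# Illustrative only: query sensor data in place on a Vertica instance.
# Host, credentials, table and columns are assumptions for the example.
import vertica_python

conn_info = {
    "host": "edgeline-el4000.example.com",  # hypothetical edge host
    "port": 5433,
    "user": "dbadmin",
    "password": "example",
    "database": "iot",
}

conn = vertica_python.connect(**conn_info)
cur = conn.cursor()

# Hourly average and peak vibration per device over the last day --
# the aggregation runs inside the database, next to the data.
cur.execute("""
    SELECT device_id,
           DATE_TRUNC('hour', reading_ts) AS hour,
           AVG(vibration) AS avg_vibration,
           MAX(vibration) AS max_vibration
    FROM sensor_readings
    WHERE reading_ts > NOW() - INTERVAL '1 day'
    GROUP BY 1, 2
    ORDER BY 1, 2
""")

for device_id, hour, avg_vib, max_vib in cur.fetchall():
    print(device_id, hour, round(avg_vib, 2), max_vib)

conn.close()
```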
Analytics to the fore
In the Q&A, Whitman spent time pitching HPE's smarts when it comes to analytics. In a combative mood, she stated that she was ready for HPE's analytics to go one-on-one with IBM Watson, Big Blue's much-touted and Jeopardy-winning analytics platform.
I have to say that the perception in the room was that neither HPE nor IBM is seeing much in the way of customer adoption of their analytics platforms, but HPE's foray into IoT and its delivery of intelligence-driven IT operations software at least provide some proof points for the quality of its analytics offerings.
Helion gets more coherent
Helion was originally the name for HPE's public cloud offering. As discussed previously, HPE's foray into the public cloud was a serious misstep, and some might have thought that the Helion name would be tainted forever.
That is not the case, however, and HPE is now using Helion broadly as the branding for a number of horizontally applicable cloud products. At the show HPE announced HPE Helion Cloud Suite 1.0, Helion Stackato 4.0 and Helion CloudSystem 10. These products all aim to offer enterprises a consistent approach to their infrastructure and to make the developers within their organizations more agile. Some brief detail on the three products:
- HPE Helion Cloud Suite 1.0: HPE's cloud software offering. The suite combines assets from HPE's IT operations management portfolio with its cloud platforms (OpenStack and Cloud Foundry). Cloud Suite is built for hybrid environments, letting customers manage IT whether it runs on-premises or in other datacenters, automate infrastructure management across bare metal, virtual machines and containers, and use a full DevOps toolchain for both traditional and cloud-native applications
- HPE Helion Stackato 4.0: the latest version of HPE's cloud-native application platform. It includes HPE's Cloud Foundry distribution as well as a full CI/CD toolchain, and it runs on whichever cloud customers choose
- HPE Helion CloudSystem 10: CloudSystem is the converged software-plus-hardware proposition. It takes Helion Cloud Suite (including Helion OpenStack and Helion Stackato) and integrates it with HPE's infrastructure -- blades, rack-mounted hardware and Synergy composable infrastructure
(Rage of) The Machine
A few years ago HPE announced The Machine, an entirely new look at the way computing is done. The Machine puts a vast pool of memory, rather than processors, at the center of the architecture and is built around a technology that was purely theoretical until a few years ago: memristors. The Machine also uses photonics (i.e. light), instead of electricity, to connect the different parts of the computer. It is fundamentally a completely new way of performing computing, and it is still years away, but HPE is pressing ahead and getting people interested in working with it. The company announced four separate developer tools aimed at building software to run on The Machine.
One is a new database engine that aims to speed up applications by taking advantage of a large number of CPU cores and non-volatile memory. Another offers a fault-tolerant programming model for that non-volatile memory, while a third gives developers a way to explore The Machine's new architectural approach. The last is a DRAM-based performance emulation platform that approximates what applications running on The Machine will experience.
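The non-volatile memory programming model is the piece that would change day-to-day development the most: data structures live in a persistent address space and survive restarts, so persistence stops being a separate save step. HPE's actual tooling targets fabric-attached memory and is not what I'm showing here, but a plain memory-mapped file gives a feel for the idea -- this is a generic sketch, not The Machine's API:

```python
# Generic illustration of load/store persistence using an ordinary
# memory-mapped file -- not The Machine's programming model, just the
# flavor of it: program state lives in a persistent address space.
import mmap
import os
import struct

PATH = "/tmp/persistent_region.bin"  # stand-in for a non-volatile region
SIZE = 4096

# Create the backing region on first run
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    region = mmap.mmap(f.fileno(), SIZE)

    # Read a counter persisted at offset 0, increment it, write it back
    (count,) = struct.unpack_from("<Q", region, 0)
    struct.pack_into("<Q", region, 0, count + 1)

    # Flush so the update is durable before we treat it as committed --
    # this ordering/durability discipline is where fault tolerance lives
    region.flush()
    region.close()

print("times this program has run:", count + 1)
```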
We often hear of so-called "disruptive technologies" in the industry, but if and when The Machine sees the light of day, it will certainly herald a new paradigm for computing.
What it all means: MyPOV
A few years ago HP was, to be honest, a bit of a mess. It was a hardware company trying to spin its way into a software story, and the mismatch meant the story didn't resonate. Since then HPE has worked hard to deliver a consistent and coherent story about its future.
But it's not easy. The fact of the matter is that no matter what the executive ranks say, it is at the sales force level that the rubber hits the road. HPE's sales force is traditionally compensated in ways that encourage a continuation of the hardware-centric view, and this is the internal cultural challenge HPE has been facing.
In a one-on-one session with the head of all of HPE's software businesses, Robert Youngjohns, I got an honest appraisal of where HPE is on this journey. Youngjohns admitted that it is early days and that much work still needs to be done in this area.
It was interesting to hear Whitman bare all in the influencer Q&A and state very clearly that where HPE fails is when it tries to do things that aren't within its core DNA and experience. I reflected upon that during the keynote when both Docker and Mesosphere had speakers on stage. While both of these technologies are important, and have a role to play in HPE's and its customers' future, it felt a little bit like Whitman, and HPE more generally, was jumping on the "cool new thing" bandwagon and chasing fads. I'm not suggesting that either containers or cloud-native approaches are fads, but HPE's pushing of these companies seemed more marketing than substance. (It should be noted that HPE was a participant in the recent capital raise by Mesosphere, the company commercializing Apache Mesos.)
Youngjohns' perspective on this is that HPE is achieving two things. First, it is showing its customer base that it is looking deeply into emergent technologies. Second, it is experimenting with technologies that it uses to build its own core solutions -- HPE has used both Docker and Mesos, for example, in recent product buildouts.
This "trying to appear cool" issue is one I keep coming back to. Indeed, one journalist said to me via private message that sometimes watching HPE is like watching your parents listening to rap and pretending to "relate" to it. It's something that HPE needs to work on and a trap they need to try to avoid falling into.
This tendency to chase the shiny new thing was a topic I discussed with a number of HPE executives at the event, and it is something they are aware of. Indeed, in a briefing at the show, Jay Jamison, HPE's product marketing czar, stated a focus on meeting customers where they are today and helping them make a planned, well-thought-out move into newer technology areas:
HPE is committed to meeting customers where they are. As customers look at Mesosphere and Docker as infrastructures that are important to them, HPE is committed to being a strong partner and supporter of those technologies, both in our software, as well as with our hardware infrastructure capabilities. [Ours is] a very broad strategy that we think enables customers to have a strong partner that’s focused on meeting them where they are, and providing them solutions to the problems they most commonly face.