Arthur C. Clarke once said that “Any sufficiently advanced technology is indistinguishable from magic.” You see something happen, but how it happens is a total mystery. New data storage technologies have brought a little of this magic into the datacenter and have the potential to completely change how we think about and manage storage.
Today’s business computing environment is all about application enablement. No matter the industry or vertical you are in, applications are key to business operations and value. The challenge for IT managers is in designing infrastructure around the diverse needs of these apps, particularly when it comes to data storage.
Some applications demand very high speed but only handle relatively small data sets; others have less stringent speed requirements, but the order in which data is written to storage is very important. Other types of apps generate large volumes of data but rarely need to access it. The needs for each of these types of applications are totally different, and planning for those needs practically requires a crystal ball: What hardware will meet current needs? Will that hardware support future needs? What will those needs be? What protocol should you use? Will the data be structured or unstructured? How can you possibly plan infrastructure investments responsibly when the landscape is constantly changing?
The unfortunate answer is that you can’t. Good IT managers do their best to keep up with the state of the industry and make the most of their budgets, but infrastructure sprawl is an inevitable result of unpredictable workloads. Disparate hardware systems get added to datacenters in a scramble to meet varied application (and budget) requirements, and the result is complexity. Managing that complexity requires human middleware: technicians manually mapping app characteristics to the correct protocols and tuning hardware to fit. The process is unwieldy, inefficient, and ultimately unsustainable. More to the point, it relegates IT to a reactive role, always scrambling to respond to demands from the business instead of acting as a strategic partner in generating value.
Why are there so many decisions to make?
We have lots of choices to make when it comes to storage infrastructure, but choice isn’t always a good thing. Sometimes it is just a distraction. Do you honestly care whether apps store data in structured or unstructured formats, what storage protocol they use, or what the hardware looks like? No, you don’t. You just want it to work. The truth is that many of these concerns are more about vendors trying to differentiate their offerings than about serving your needs as a consumer. The distinctions are a red herring and largely a waste of your time.
So how do you design a storage infrastructure that supports all of the apps you need to run and doesn’t bog you down in complex manual interventions? Software-defined storage (SDS) offers a partial solution by disaggregating storage intelligence from hardware, but this only scratches the surface of what is possible. A complete solution would make the minutiae of storage operations completely invisible. By combining SDS with hardware technologies like high-density server platforms, persistent memory, multi-core processors, and high-bandwidth Ethernet networks, it is now possible to build an intelligent datacenter that provisions storage with exactly that flexibility. Developing a common approach to storage across the business will allow you to build a data fabric that simplifies management and enables applications regardless of their particular needs.
Data fabric is an industry term. Several companies are using it, although in slightly different ways.
HPE’s point of view is that a true data fabric must be independent of the underlying hardware form factor and software-defined storage deployment model (including bare metal, hypervisor embedded, or containerized).
What we can learn from the smartphone
Think about the smartphone in your pocket for a moment. What is “smart” about it is that so much of its function is completely transparent to you as a user. You don’t need to worry about how your phone stores data or in what formats. Often you don’t even know where the data gets stored, be it internal memory, an SD card, or the cloud. You just know that it works, and you are probably relieved not to have to think about it any more than that.
Applying that notion to your datacenter, today you are involved every step of the way. Using conventional technologies and methods, you do have to worry about how your apps store data, where, and in what formats. It is both complex and time consuming. But imagine if all of that could be abstracted away, just like on your smartphone. While some vendors have promised it, no one has ever combined best-in-class server, storage, and networking in a way that can meet the needs of any application workload, on any form factor, using any media. Until now. That’s exactly what we are doing with data fabric.
The result is a faster, more responsive datacenter with no need to fear what the future might bring. It’s about real-time and late binding of the right storage with all of the right attributes to deliver on business requirements. At HPE, we are not in the app business, but we are in the business of making apps better.
Look for more information about data fabric from HPE, this week, during the HPE Discover technology tradeshow in Las Vegas, June 7-9.