Flash breakthrough promises faster storage, terabytes of memory
Diablo Technologies plans to put flash in the memory channel for a fast route to server processors
IDG News Service - In the ongoing quest for faster access to data, Diablo Technologies has taken what could be a significant next step.
Diablo's Memory Channel Storage (MCS) architecture, expected to show up in servers shipping later this year, allows flash storage components to plug into the super-fast channel now used to connect CPUs with memory. That will slash data-access delays even more than current flash caching products that use the PCI Express bus, according to Kevin Wagner, Diablo's vice president of marketing.
The speed gains could be dramatic, according to Diablo, helping to give applications such as databases, big data analytics and virtual desktops much faster access to the data they need most. Diablo estimates that MCS can reduce latencies by more than 85 percent compared with PCI Express SSDs (solid-state drives). Alternatively, the flash components could be used as memory, making it affordable to equip servers with terabytes of memory, Wagner said.
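To put the claimed reduction in concrete terms, here is a back-of-the-envelope sketch. The baseline latency figure is a hypothetical round number chosen for illustration, not a measurement from Diablo or any vendor; only the ">85 percent" factor comes from the article.

```python
# Illustrative arithmetic only: the PCIe SSD latency below is an assumed
# round number, not a vendor benchmark.
pcie_ssd_latency_us = 50.0   # assumed access latency for a PCIe flash SSD
reduction = 0.85             # Diablo's claimed "more than 85 percent" cut

# An 85 percent reduction leaves 15 percent of the original latency.
mcs_latency_us = pcie_ssd_latency_us * (1 - reduction)
print(round(mcs_latency_us, 1))  # 7.5 microseconds under these assumptions
```

Under these assumed numbers, a 50-microsecond PCIe access would drop to roughly 7.5 microseconds, which is the kind of gap that matters for latency-sensitive workloads like databases.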
Other than on-chip cache, the memory channel is the fastest route to a CPU, Wagner said. Not only do bits fly faster over this link, but there are also no bottlenecks under heavy use. The connection is designed to be used by many DIMMs (dual in-line memory modules) in parallel, so each component doesn't have to relinquish the bus for another one to use it. That saves time, as well as CPU cycles that would otherwise be spent managing the bus, Wagner said.
The parallel design of the memory bus also lets system makers scale up the amount of flash in a server without worrying about diminishing returns, he said. A second MCS flash card will truly double performance, whereas an added PCIe SSD would not, Wagner said.
Diablo, which has been selling memory controllers for about 10 years, has figured out a way to use the standard DDR3 interface and protocols to connect flash instead of RAM to a server's CPU. Flash is far less expensive than RAM and far more compact. The MCS components, which come in 200GB and 400GB sizes, will fit into standard DIMM slots that typically accommodate just 32GB or so of memory. The only adaptation manufacturers will need to make is adding a few lines of code to the BIOS, Wagner said.
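The capacity arithmetic behind the "terabytes of memory" claim can be sketched as follows. The 16-slot server and the half-and-half slot split are hypothetical choices for illustration; the 400GB module size and the roughly 32GB-per-DIMM RAM figure come from the article.

```python
# Capacity sketch for a hypothetical 16-slot server.
dimm_slots = 16        # assumed slot count, not from the article
mcs_module_gb = 400    # larger MCS flash module size (from the article)
ram_dimm_gb = 32       # typical RAM DIMM capacity cited in the article

# Fill half the slots with conventional RAM, half with MCS flash.
ram_gb = (dimm_slots // 2) * ram_dimm_gb      # 8 slots of RAM
flash_gb = (dimm_slots // 2) * mcs_module_gb  # 8 slots of MCS flash
print(ram_gb, flash_gb)  # 256 GB of RAM alongside 3200 GB of flash
```

Even leaving half the slots for ordinary RAM, this assumed configuration yields over 3TB of flash addressable over the memory channel, more than ten times what the same slots could hold as DRAM.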
Enterprises are more likely to use MCS as high-capacity memory than as low-latency storage, said analyst Jim Handy of Objective Analysis.
"Having more RAM is something that a lot of people are going to get very excited about," Handy said. His user surveys show most IT departments automatically get as much RAM as they can for their servers, because memory is where they can get the fastest access to data, Handy said.