The promise of storage-area networks (SANs) is great. What information technology department wouldn't want a dedicated, high-speed network for a shared pool of storage devices? It offers centralized management and the scalability to handle terabyte databases. Plus, it off-loads backup traffic from the LAN.
The industry hype about SANs has been great, too. Vendors have spewed out at least 1,460 press releases about SANs in the past 18 months - surpassing even the "wireless Web" on the hype meter (with 1,056 releases) during the same period.
But some corporate IT managers don't buy it. They look at SANs and see a risky, immature, expensive technology with interoperability problems, according to three studies and interviews conducted with a dozen users.
Buyers may sympathize with the IT manager who complained that "there's so much smoke and baloney around SANs, it's impossible to know what's real."
Compatibility Problems Remain
In a recent survey of 100 storage-savvy IT professionals, Enterprise Management Associates Inc. (EMA) in Boulder, Colo., found that 80% had a major gripe about SANs that is holding back their purchase or deployment of the technology.
So while users are eager to gain the benefits of SANs, "the act of implementing the technology at this stage is daunting enough to give IT professionals serious pause," the EMA study said.
So What Exactly Is a SAN?

As if storage-area network (SAN) technology weren't complex enough, it's impossible to find even two networking cognoscenti who agree on what a SAN is.

A SAN can be as bare-bones as a server connected via copper to a LAN with a dedicated backup connection, one industry analyst said. Another scoffed at that, saying direct-attached storage not only isn't a SAN, but is a technology on a short path to extinction. For other analysts, it's not a SAN unless it runs on Fibre Channel and has fail-over and virtual storage capabilities.

Confused? Forget the technology and start with the basic concept, said Stanley Worth, a product manager at Fibre Channel switch maker Crossroads Systems Inc. in Austin, Texas. A SAN is about time and space, he said. "Time is what keeps everything from happening all at once. Space is what keeps it all from happening in the same place. Networking allows the spreading of data across space, and storage lets you move the data across time," Worth said. In other words, a SAN is the confluence of storage networking and data resources around space and time, he said.

The Storage Networking Industry Association in Mountain View, Calif., defines a SAN as "a network whose primary purpose is the transfer of data between computer systems and storage elements, and among storage elements."

But "the definition will be different for every SAN," just as it's different for every network, said Jerry Lynch, operations director at the Online Computer Library Center in Dublin, Ohio. Lynch oversees all backup to a SAN. "For me, it's sharing new tape drives across multiple platforms and across Fibre Channel switches, not redundant hardware for each platform," he said.

"It's not a crisp definition, nor should it be," Lynch said. SAN technology "is still new and still evolving."
"Daunting" is an understatement for Shawn Tu, a systems administrator at Assurant Group, a credit-related insurance company in Atlanta. After hiring a systems integrator, investing months of work and spending hundreds of thousands of dollars to build a SAN, he flipped the switch - and it failed.
The SAN's Fibre Channel switches and Solaris servers just didn't work together, Tu said. Numerous interoperability glitches required custom fixes. As the frustrated Tu put it, "The customer site shouldn't be the interoperability [test] site."
Tu isn't alone. In a recent Computerworld survey of IT managers, 49 of the 160 respondents cited interoperability as their biggest concern about SAN implementations.
All in all, Assurant paid $250,000 for its initial 1 terabyte (TB) SAN and finally got it working after almost two months, "but I could not quantify how much of my sanity was lost in dollar terms," Tu said.
Lack of interoperability isn't a new problem for SANs, but it's a lingering one.
"The SAN interoperability issue is a double-edged sword [for vendors]," said Steve Duplessie, an analyst at Enterprise Storage Group Inc. in Milford, Mass. "They all agree they need to propel the interoperability issue forward." But with their allegiances divided between two prestandards bodies - the Storage Networking Industry Association (SNIA) and the EMC Corp.-led FibreAlliance - the vendors may never agree on standards proposals, he said.
The finger-pointing is intense. An IBM official, for example, said EMC isn't interested in standards because it's the market leader. Hopkinton, Mass.-based EMC said that's untrue and that it's the server vendors that have suppressed standards development.
It's possible that users could goad vendors into cooperating. And there are glimmers of hope. In what one analyst called "a bloodless coup," the industry adopted protocols developed by San Jose-based Brocade Communications Systems Inc. to ensure that different Fibre Channel switches can talk to each other.
Work in Progress
The industry is keenly aware of the interoperability problem and says it's working on it.
For example, EMC offers storage networking for 35 server environments, including mainframes and open-system servers, said Don Swatik, EMC's vice president of strategic planning. And the FibreAlliance has proposed a specification for managing devices across a heterogeneous network, he said.
Speeding release of SAN standards is the primary goal of the SNIA, said Gary Phillips, the consortium's board secretary and systems technology manager at Compaq Computer Corp. SNIA members are developing definitions for a range of standards to submit to a standards body, Phillips said.
Furthermore, many vendors - including EMC, IBM, Hitachi Data Systems in Santa Clara, Calif., and EMC spin-off McData Corp. in Broomfield, Colo. - have built interoperability labs to test their products with those of other vendors. EMC said it has spent more than $1 billion on interoperability testing during the past six years.
But these labs don't go far enough, Tu said. "They test [for interoperability of] each other's hardware, but they don't do very well testing software on different operating systems," he said. "Ours worked almost flawlessly on NT but had lots of problems on Unix."
In a study conducted by International Data Corp. (IDC) in Framingham, Mass., more than 80% of 301 IT professionals rated "open standards" for SANs as very important.
Underlying those concerns is the fact that, in the absence of standards, the simplest way to build a SAN is to buy everything from a single vendor. But then the user is locked into that one vendor and its technology choices, which could become obsolete.
"The reality is that SANs are risky, with lots of gotchas," said Matt Rock, director of engineering services at systems integrator Intelligent Solutions Inc. in Medford, Mass. "It's a valid fear" that a single-vendor SAN could be superceded by new standards, he said, and then the user would have to rip it out and replace it.
Although users complain of immature or nonexistent SAN standards, some analysts call it a nonissue. "Considering how new SANs are, I think the standards are maturing fairly quickly," said William Hurley, an analyst at The Yankee Group in Boston. "It's just that the need - the insatiable consumption of spindles within the enterprise - is growing faster," he said.
"It's a red herring," said John Webster, an analyst at Illuminata Inc. in Nashua, N.H. "SANs are about cost vs. benefit."
The High Cost of SANs
Ah, but those costs. An extensive, enterprise-level SAN could cost $4 million or more.
Richard Boyle, vice president of technology deployment at The Chase Manhattan Corp.'s global private banking unit in New York, got quotes ranging from $210,000 to EMC's bid of $1.2 million for a 3TB to 10TB SAN. "If I went to my boss and quoted him $1.2 million for a SAN, he would throw me out of his office and then open the door," Boyle said. "I asked for $210,000, and the CIO still questions why the amount is so high."
SANs are more expensive if they require an enterprise subsystem, such as EMC's Symmetrix, IBM's Shark or Hewlett-Packard Co.'s XP256. "You can cut your total costs by as much as half by using distributed storage" such as Compaq's MA 8000, said Ron Johnson, an analyst at Evaluator Group Inc. in Englewood, Colo.
That's true, said Jill Kaplin, IBM's director of SAN marketing strategy. "If you're not doing tens of terabytes of storage, a big subsystem like the Shark may not be the most efficient way to go," she said. But as storage approaches 10TB, economies of scale and performance come into play, she said. "If you've got an environment with a big Oracle or DB2 database or SAP application, you're going to need the performance" an enterprise subsystem provides, she said.
Rich Ward, associate vice president of support at Keystone Mercy Health Plan in Philadelphia, said he's concerned about costs. He's planning a pilot SAN for the health plan's Medicaid division, but the price tag may hit $200,000 to $400,000.
He's also worried about taking the SAN plunge too soon. If he deploys a SAN now, there's a danger that "a year from now we'll have to rip it out in order to add functionality," he said, "or we buy it, put it in and it becomes obsolete."
Shifting Technologies
One of the nagging worries about SANs is that there's no complete agreement on exactly which technologies to use.
The prevailing view is that Fibre Channel is the data transport protocol of choice. But some vendors advocate other protocols, such as Enterprise Systems Connection, SCSI over IP or InfiniBand, an I/O architecture being developed by the InfiniBand Trade Association in Portland, Ore. And one company runs a SAN over High Performance Parallel Interface, which is commonly used for supercomputers.
For large companies, the problem with Fibre Channel is that the distance between connected devices is limited to a range of 30 meters to 10 kilometers, depending on the connector type and wavelength.
At Chase Manhattan, Boyle said, the bank has as many as 10 SANs (depending on the SAN definition used), but they're in different buildings, and no SAN island connects to another. Because SAN technology lacks wide-area network support, there's no effective way to connect them, he said.
Despite the problems, there are SAN success stories in which the technology has made it easier to manage the burgeoning growth of data from enterprise applications and data warehouses. With administrative costs for storage rising 100% to 300% annually, companies will be forced to consolidate storage to get those costs under control, analysts said.
Backup of that data is the "killer app for SANs," said IDC analyst John McArthur. "LAN-free backup isn't the kind of thing that gets CEOs excited, but it makes data center managers' lives easier," he said.
Ultimately, Boyle said, he believes Chase Manhattan's SAN investment will pay off. The 30 servers in his department have been consolidated into four, and he expects the SAN to save about $340,000 annually - not including labor-cost savings. "It's worth every buck that was invested," Boyle said.
Nevertheless, a healthy dose of skepticism is prudent, said one IT manager who plans a pilot SAN this year. "It'll be like having a [navigation system] in my car," he said. "It's really cool, but I just don't need it."
At McKessonHBOC Inc., a San Francisco-based drug distributor and medical software vendor, technologist Stephen Zander also said he has no illusions about his planned SAN pilot. "It's not a magic bullet," he said. "There's a lot of talk, and to take it at face value is not wise. It's maturing technology."
Tu added this cautionary note: "Make sure your system integrator knows all the products you want to work with and has a lot of patience. There are no blueprints for a SAN."