As one computer company ramps up its “virtualization” software for storage area networks (SANs), industry analysts advise caution for enterprises that are considering the technology.
Hewlett-Packard Co. (HP) in January described its plans for VersaStor, software that turns disparate storage devices into a single, easy-to-manage entity.
“You’re pooling all of your storage resources,” explained Paul Patterson, product marketing manager with Hewlett-Packard (Canada) Co. in Mississauga, Ont. “In the past, if you had a device operating on HP-UX, another device operating on NT and another operating on Linux, they had to be dedicated.”
Virtualization adds a layer of abstraction to the system, so that users can access storage as if it were a utility, rather than a collection of boxes, each dedicated to one or another application, Patterson said.
“It’s not unlike the power coming into your home. You don’t care if it’s coal burning, if it’s water-based or nuclear. All you know is, when you flip the switch, the light goes on.”
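The pooling idea Patterson describes can be sketched in a few lines of code. This is purely an illustrative model of the concept, not HP’s VersaStor software or any real API: heterogeneous devices are aggregated into one logical pool, and a caller requests capacity without knowing or caring which physical box ends up serving it. All class and method names here are hypothetical.

```python
class Device:
    """One physical array, historically dedicated to a single platform."""
    def __init__(self, name, platform, capacity_gb):
        self.name = name            # e.g. an HP-UX-, NT-, or Linux-attached array
        self.platform = platform
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def free_gb(self):
        return self.capacity_gb - self.used_gb


class StoragePool:
    """The virtualization layer: one logical entity fronting many devices."""
    def __init__(self, devices):
        self.devices = devices

    def total_free_gb(self):
        # The consumer sees one pooled figure, not per-box capacities.
        return sum(d.free_gb() for d in self.devices)

    def allocate(self, size_gb):
        # Place the request on any device with room; the caller never
        # learns which physical device backs the allocation.
        for d in self.devices:
            if d.free_gb() >= size_gb:
                d.used_gb += size_gb
                return True
        return False


pool = StoragePool([
    Device("array-a", "HP-UX", 500),
    Device("array-b", "NT", 300),
    Device("array-c", "Linux", 200),
])
pool.allocate(400)              # lands somewhere in the pool
print(pool.total_free_gb())     # capacity left across all three boxes
```

Like the light switch in Patterson’s analogy, the `allocate` call hides the “generation source”: the consumer flips the switch and gets capacity, regardless of which array delivers it.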
VersaStor is the next iteration of HP’s Continuous Access Storage Appliance (CASA). It sits between the SAN’s servers and its storage media, turning an unwieldy collection of devices into a single, more manageable resource. HP plans to release VersaStor products by the end of 2003, Patterson said, adding that the company leads the virtualization market.
“We have a living, breathing product today called CASA that does what we’re promising on virtualization. … I feel that a lot of our competitors are talking about the ability to do it, but we’re actually doing it today.”
HP may well be out front with virtualization, but analysts say the technology faces certain hurdles before attaining maturity. They suggest corporate Canada should step carefully when considering it for the SAN.
“Our recommendation to customers – and it has been for the last 12 months – is to hold off on decisions about virtualization as a strategic part of your storage management infrastructure,” said Anders Lofgren, senior industry analyst with Giga Information Group Inc. in Cambridge, Mass. “It’s an immature market with immature technologies.”
“Confusion” best describes the virtualization market, Lofgren said, pointing to the industry’s current immaturity.
“There are a lot of different definitions. You could have virtualization that virtualizes within a single array, a single box. You could have virtualization at a network level, which allows you to virtualize across multiple boxes.”
Virtualization within individual boxes is a reality, Lofgren said. For example, HP has its StorageWorks Virtual Array, a device that affords easy logical unit number (LUN) and file creation. But vendors are making a play for SAN-wide virtualization, and that’s where the trouble starts.
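Array-level virtualization of the kind Lofgren calls a reality can also be sketched simply: carving logical unit numbers (LUNs) out of one array’s raw capacity without the administrator mapping physical disks by hand. The sketch below is an illustration of that general idea under assumed names; it is not the StorageWorks Virtual Array interface.

```python
class VirtualArray:
    """Illustrative single-box virtualization: LUNs carved from raw capacity."""
    def __init__(self, raw_gb):
        self.raw_gb = raw_gb
        self.luns = {}              # LUN id -> size in GB
        self.next_lun = 0

    def create_lun(self, size_gb):
        # Reject requests that exceed the array's remaining capacity.
        if size_gb > self.raw_gb - sum(self.luns.values()):
            raise ValueError("insufficient capacity")
        lun_id = self.next_lun
        self.luns[lun_id] = size_gb
        self.next_lun += 1
        return lun_id


array = VirtualArray(raw_gb=1000)
lun0 = array.create_lun(250)
lun1 = array.create_lun(500)
remaining = array.raw_gb - sum(array.luns.values())
print(lun0, lun1, remaining)
```

Extending this from one box to many, across a whole SAN, is precisely the network-level step Lofgren says vendors are still working out.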
Sure, virtualization has its benefits. Utilization rates improve with easy-to-share storage resources. But the technology also raises questions, Lofgren said.
“There are discussions about how virtualization integrates with current, existing storage management functionalities. How does it integrate with my remote mirroring capabilities, my point-in-time copy? Do those things still work when I have virtualization? How does it integrate with my storage management tool?”
And where should the enterprise turn for virtualization needs? That’s another point of confusion, Lofgren said. He pointed out that most of the usual suspects – the companies normally associated with SAN-sized projects – are still working out their virtualization plans.
Sun Microsystems Inc. has its N1 initiative, but the company offers few details about this virtualized vision. IBM Corp. has its virtualization engine in the oven, but it’s not ready for consumption. As for HP, it’s talking about VersaStor today but “it’s still 12 months away,” Lofgren said.
He added, “I think it’s a big deal for HP. That technology has been around for quite some time. They’d better deliver on it now, because it’s been years in the making and it’s been talked about for several years. They need to deliver on this product.”
Stan Peck, a director at IDC Canada Ltd. in Toronto, said he doesn’t put too much stock in first-to-market claims such as those made by HP.
“Some might say they’re there today, but I think the more fair statement is it’s like a journey and vendors are making good progress on it. They’re partnering with others and trying to come up with standards to get there faster.”
Peck also suggested that enterprises take a go-slow approach to virtualization: make sure the vendor’s platform disturbs nothing in the existing operating environment before moving ahead.
“I would challenge the vendor to prove that they could do it, test it out,” Peck said. “Most of them have lab environments where they test things out.”
Lofgren from Giga said virtualization as a concept is far from fruition, but as a buzzword it’s over-ripe.
“The word has become so overused and abused that it doesn’t have much meaning anymore.”
But judging by Patterson’s words, that’s just fine for HP. The company envisions a day when enterprises access storage as a utility, caring less about high-tech phrases and more about how the technology makes SAN management that much easier.
Virtualization will turn storage devices into invisible enablers, just as the light switch masks the workings of the power grid, Patterson said. After all, “no one turns on a light and says, ‘Oh, I hope it’s nuclear.’”