A new survey finds many IT departments of medium to large enterprises underestimate the impact of storage when virtualizing their IT environments, to the point that implementation gets slowed down or even stopped altogether.
Specifically, one-third of survey respondents underestimated the effect of desktop and server virtualization on their storage budgets.
“You get excited about saving all this money with consolidating many servers to a few and completely forget that part of that has to be used to offset getting additional capabilities on the storage side,” said George Teixeira, president & CEO of DataCore Software Corp.
The survey, which polled 450 medium and large enterprises across Canada, the U.S. and Europe, was commissioned by Fort Lauderdale, Fla.-based DataCore Software.
Fifty-six per cent of respondents were also surprised by the performance bottlenecks that occur when multiple workloads are consolidated. They were likewise met with higher availability requirements to guard against failure of that single consolidated platform.
“Between those two requirements for faster performance and better business continuity, it really drives costs up because now you need more enterprise-class storage than what you thought you needed,” said Teixeira.
When it comes to the cloud, IT departments will likewise first move less-critical operations, such as print and Web services, before tackling core operations, such as e-mail and databases. But even those early workloads, Teixeira said, will need “pretty heavy capabilities behind it” on the storage side.
The survey also found that 41 per cent of IT administrators have to deal with two or more different storage systems from the same vendor, while more than 60 per cent cannot even control multiple storage resources as a single pool.
Teixeira said the problem is one of inertia, because new systems are often deployed with their own private storage. “You just keep going that way. And when you say, ‘I can virtualize all of this,’ then they’re forced to think about, ‘How do I share it? And if I share it, what’s the impact?’” said Teixeira.
The storage question persists when deploying apps to a private cloud, Teixeira added, given the elevated requirements for performance and high availability. “Either way, if you try to get higher requirements for performance, there’s a cost associated with it,” he said.
According to Dave Pearson, senior analyst for storage at Toronto-based IDC Canada Ltd., storage virtualization is a relatively new, albeit key, component of virtualization. But he said the storage component is still a “support” technology led in most cases by server virtualization.
Pearson said storage virtualization forms part of what he calls virtualization 3.0, where it’s no longer just about getting rid of physical machines and improving processes. “It’s now part of optimizing the entire solution, making sure you’re really getting your money’s worth out of the hardware that you’re committing to these virtualization projects,” he said.