TORONTO – VMware Inc. envisions the data centre of the future as a flexible and dynamic infrastructure, or platform, atop which services and applications are provisioned regardless of underlying systems.
Applications can be deployed regardless of the operating system, and can be used whenever required and paid for based upon usage, said Srinivas Krishnamurti, director of developer products and market development with VMware.
SLAs can be set at the application level and then enforced throughout the underlying levels of the data centre, said Krishnamurti.
“The platform is key,” said Krishnamurti, who spoke at the Virtualization Forum 2009 in Toronto on Tuesday.
A platform approach, continued Krishnamurti, aligns with how applications are being written today: in a systems- and device-independent fashion.
The concept of application freedom is much like device freedom: Krishnamurti said that in the future, users will be able to access data regardless of the device.
Bringing a service model, like paying for usage, to customers’ IT infrastructure in the form of private clouds will eliminate the complexity of managing today’s data centres, said Krishnamurti.
The Palo Alto, Calif.-based virtualization vendor wants to help customers build and run internal clouds given the excessive money and time spent managing the myriad components of data centres, said Krishnamurti.
Using the telephone as an analogy, Krishnamurti said customers should be able to use services much like they use their telephone service.
“You pick up your phone and you get a dial tone, but you don’t know how all the stuff in the backend is connected … it’s completely irrelevant,” said Krishnamurti. “It’s a service.”
“From an IT perspective, people want to manage applications, not everything that runs beneath it,” said Krishnamurti.
He added that vSphere 4, announced last April, helps make this happen by separating the software from the hardware, and is the foundation of the private cloud.
Michelle Bailey, research vice-president for enterprise platforms with Framingham, Mass.-based IDC Ltd., thinks that virtualization is “disruptive” for the data centre and will drive IT policy creation, application deployment and the general operation of the data centre.
She’s observed, for instance, businesses demanding that new applications they adopt run on virtualization.
Bailey said 2010 will see more virtual than physical machines. Already, virtualization has moved from test and development functions to business-critical applications.
But while virtualization will continue to gain prominence, Bailey said businesses should reinvest money saved on hardware into management and automation tools.
“Customers really aren’t deploying management tools in the way we would have expected them to,” said Bailey.
Virtual machines are still being managed as if they were physical machines, said Bailey.
Toronto Hydro has two data centres, 30 and 40 years of age, that are already 45 per cent virtualized. The utility’s director of IT infrastructure, Stephen Walker, said the plan is to reach 70 per cent virtualization by 2010.
Aside from the benefits of energy efficiency and reduced cost, the motivation for server consolidation was the company’s own Data Centre Incentive Program (DCIP), which provides cash incentives to businesses that run green data centres.
Echoing Bailey’s comment on the need to focus more on managing virtual environments, Walker said Toronto Hydro made sure to address administration, management and capacity planning around virtualization.
“So now that you have six applications on one server, how are you efficiently managing that, making sure there’s no I/O contention and other things like that?” said Walker.
The new environment is changing Toronto Hydro’s IT policies and procedures, said Walker, requiring new ones to govern procurement, utilization, monitoring and capacity planning.
“So it has changed the way we think,” said Walker.
In-house IT staff who were experts on standalone machines have since undergone training on various virtual products.