The dust around utility computing's publicity blitz seems to have settled, and the industry is finding that the hype has yet to translate into reality.
The promise of computing capabilities delivered to users as a utility, much like electricity or water, was heavily marketed by all the major players in the industry several years ago.
Statistics today, however, reveal that Canadian enterprise adoption is falling short of market forecasts.
IDC Canada’s Mark Schrutt says the Canadian IT utility market is still relatively small, estimating it at $200 million to $300 million.
Despite the slow adoption, however, Schrutt says he expects the industry to experience growth in the near future.
Schrutt adds that the industry needs to promote more high-profile success stories to help utility computing expand its market reach.
Ottawa-based chip maker Tundra Semiconductor is one such success story.
Tundra is an early adopter of utility computing services. In 2004, the company started buying CPU cycles from Ottawa’s Gridway Computing in order to test new product designs.
“There are several points of chip development that are very compute-intensive,” says Randy Mullins, director of engineering at Tundra. “Plus, it’s a very peaky kind of activity, as there will be times where we need 100 computers to run our functional application.
Because of this, it would be very expensive for us to maintain enough compute capacity in-house to meet our peak needs.”
Before using Gridway’s utility service, Tundra had to stretch out the peak development time, causing the cycle to last longer and adversely affecting quality control.
Purchasing new hardware was deemed inefficient by Tundra because it would only need the additional resources during peak periods, which would mean new hardware would likely sit idle for most of its lifecycle.
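The economics behind that decision can be sketched with some back-of-the-envelope arithmetic. All of the figures below are hypothetical assumptions for illustration, not Tundra's or Gridway's actual costs:

```python
# Illustrative comparison of owning peak capacity vs. renting it as a utility.
# All rates and durations here are assumed, not actual vendor pricing.

OWNED_SERVER_MONTHLY = 300      # assumed amortized cost of one in-house machine per month
UTILITY_CPU_HOUR = 0.50         # assumed per-machine-hour utility rate
HOURS_PER_DAY = 24

def owned_cost(peak_machines, months):
    # In-house capacity must be sized for the peak, so idle machines still cost money.
    return peak_machines * OWNED_SERVER_MONTHLY * months

def utility_cost(machine_hours_used):
    # Utility billing charges only for cycles actually consumed.
    return machine_hours_used * UTILITY_CPU_HOUR

# A "peaky" workload: 100 machines needed, but only for two weeks out of the year.
peak = 100
used_hours = peak * HOURS_PER_DAY * 14   # two weeks of full-tilt usage

print(owned_cost(peak, months=12))   # 360000
print(utility_cost(used_hours))      # 16800.0
```

Under these assumed numbers, buying for the peak costs more than twenty times what renting the same cycles would, which is the kind of gap that made the Gridway arrangement attractive.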
Tundra’s cyclical business environment makes the company an ideal candidate for utility computing, according to vendors. But how many similar companies have followed suit and established a utility infrastructure?
“From the buyer’s perspective, there has been a great amount of other distractions,” IDC’s Schrutt says. “We’ve seen the movement of utility computing from infrastructure and into process and application. Some of the early buzz was on the infrastructure and utility hosting side.
Now, that’s moved into software-as-a-service, which has created a little bit of confusion in the marketplace.”
One technology that has allowed organizations to leverage the concept of utility-based computing is virtualization.
The technology enables the creation of virtual instances of machines or operating systems from a single high-capacity physical server. The number of virtual machines can be increased or decreased based on current operational needs.
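The elasticity described above can be captured in a minimal sketch: a single physical host carves out virtual machines on demand and reclaims their capacity when demand drops. The class and capacity figures are illustrative, not any vendor's actual API:

```python
# Minimal sketch of virtualization's elasticity: one physical server hosts
# multiple virtual machines whose count rises and falls with operational need.
# Names and core counts are illustrative assumptions.

class PhysicalHost:
    def __init__(self, cpu_cores):
        self.cpu_cores = cpu_cores
        self.vms = []                       # cores allocated to each virtual machine

    def provision_vm(self, cores):
        # Create a virtual instance only if physical capacity remains.
        if self.allocated() + cores > self.cpu_cores:
            raise RuntimeError("host capacity exhausted")
        self.vms.append(cores)
        return len(self.vms) - 1            # handle to the new VM

    def retire_vm(self, handle):
        # Release the VM's share back to the pool when demand drops.
        self.vms[handle] = 0

    def allocated(self):
        return sum(self.vms)

host = PhysicalHost(cpu_cores=32)
web = host.provision_vm(cores=8)
batch = host.provision_vm(cores=16)
print(host.allocated())     # 24
host.retire_vm(batch)       # scale down once the peak passes
print(host.allocated())     # 8
```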
The “server-hugging” mentality of many CIOs, however, has become a stumbling block to adopting a virtual infrastructure. The biggest challenge, according to industry analysts, is cultural: department leaders have shown reluctance to give up the notion of owning a physical box.
“Even in terms of risk mitigation, as a business owner you try and mitigate risk to the extent that you control things, that you can see them and touch them,” says Bogomil Balkansky, director of product marketing at server virtualization vendor VMware.
“It kind of makes you feel good to go into the data centre, see a server with your name on it and be able to touch it.”
Another concern for IT is budgeting for utility infrastructure services. Cost analysis can be tricky because organizations must project future usage, as utility infrastructure is often priced on a per-user, per-usage basis, according to Schrutt.
These customer-end challenges aside, industry analysts say vendors still have plenty of work to do before this type of computing can reach its full potential. The major concern is the lack of consolidation among utility services.
“There is no one single vendor that you can go to that will help you set this up,” Info-Tech analyst Nauman Haque says. “You might contact VMware for your virtualization aspect, and then BMC Software for your server and infrastructure management software and so on. As of yet, we haven’t seen a single vendor step up and provide an out-of-the-box solution for cradle-to-the-grave implementation of utility infrastructure.”
Despite these challenges on both sides of the market, don’t start writing utility computing’s obituary just yet. Vendors like Sun Microsystems say that the rising cost of running data centres will have companies looking for alternatives in the future.
“In Europe and North America, the cost of compute (power) isn’t the big issue anymore,” says Mark Herring, director of Sun’s Network.com. “It’s the cost of running compute, the cooling of it, the electrical, and we’ve seen these factors driving customers to look toward the utility (model).”
Although customers have historically been wary of new technologies, Herring says, they eventually come around.
“I think we’ll start getting used to the fact that we don’t own and run it,” he says. “Just like now, we don’t own our word processors or our search engines, but we’re happy to start using these things almost like a utility.”
Herring says that as CIOs adopt this mindset, they will clearly see the benefits of utility computing in all its forms. For now, virtualization seems to be the utility computing model gaining the most prominence in the enterprise.
According to Info-Tech’s Haque, whether utility adoption gains more traction depends less on the size of the company and more on the maturity of its infrastructure management. He says companies sold on the idea of centralizing IT assets will naturally be ahead of the curve in data centre consolidation and virtualization.
Balkansky says VMware serves more than 20,000 customers, ranging from multinationals to small businesses. In fact, VMware counts every Fortune 100 company as a partner, and all of them have implemented virtualization in their infrastructure to varying degrees.
VMware recently started offering its virtualization suite as a utility service: under the VMware Service Provider Program, hosting partners deliver the company's virtualization technology as a hosted service.
Under the new licensing model, hosting providers can offer virtualization on a per-virtual-machine, per-month basis, allowing customers to increase or reduce their virtual machine counts as capacity requirements change.
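The appeal of per-virtual-machine, per-month billing is that the invoice tracks capacity up and down. A quick sketch, with an assumed rate and a hypothetical seasonal usage profile:

```python
# Sketch of per-virtual-machine, per-month billing. The rate and the
# usage profile below are assumptions for illustration, not VMware pricing.

RATE_PER_VM_MONTH = 50.0    # hypothetical hosting-partner rate

def monthly_bills(vm_counts_by_month):
    # The customer pays only for the VMs running in each month,
    # so charges rise and fall with capacity requirements.
    return [count * RATE_PER_VM_MONTH for count in vm_counts_by_month]

# A year with a two-month peak, e.g. a seasonal workload.
usage = [10, 10, 10, 10, 40, 40, 10, 10, 10, 10, 10, 10]
bills = monthly_bills(usage)
print(bills[4])     # 2000.0 (peak month)
print(sum(bills))   # 9000.0 for the year
```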
Balkansky says that despite VMware’s vast client list and its focus on utility, the industry still has a long way to go in terms of the virtualized infrastructure’s untapped potential. He expects that as infrastructure comes up for renewal, many companies will seize the opportunity to give the utility option a try.
“If you look at any company, there is this rolling schedule in data centres of these old machines coming off and new machines coming in,” Balkansky says.
“What we find is that most customers find that moment of retiring old equipment and putting new equipment in place as the time to virtualize workloads, so the cycling through of putting more and more workloads on virtual machines really follows the process of recycling and rejuvenating equipment in the data centre.”
And while the lack of media coverage could be a concern for other tech industries, some utility infrastructure vendors believe that silence is golden.
“You’re getting the impression that media coverage has died down because this is no longer the bleeding-edge technology,” Balkansky says. “Everybody’s doing it to some degree, the technology has proven that it’s here to stay, and it’s something that is quickly becoming the platform for the new data centre.”