IT World Canada

Utility computing plans take shape

High-tech heavyweights determined to deliver on utility computing continue to pull out their chequebooks for key technologies they hope will fill out their product lines.

The buying spree continued this week as Hewlett-Packard Co. said it plans to acquire Talking Blocks Inc., a maker of Web services management software. HP last month bought Extreme Logic Inc., a consulting firm that specializes in Microsoft Corp.'s .Net applications and infrastructure. HP's moves are part of a trend that in the last 12 months saw competitors IBM Corp., Sun Microsystems Inc. and Veritas Software Corp. spend millions of dollars to buy nine companies between them – all in the name of utility computing.

Late last year, storage vendor Veritas spent US$537 million on application performance management vendor Precise Software and another $62 million for automated server provisioning start-up Jareva Technologies.

Acquisitions aside, IBM Corp. has said it will invest about $10 billion to develop utility computing technology in the next few years. Sun too anticipates investing $10 billion in research and development, acquisitions and other activities over the next five years building out its N1 strategy.

HP says its acquisitions will further its Adaptive Enterprise strategy, which promises to provide corporations with intelligent software that manages data-center resources based on changing application and end-user needs.

HP’s utility road map rivals on-demand plans from management competitors BMC Software Inc. and IBM, and intelligent data-center initiatives from Sun and Veritas. And this week Microsoft got into the fray by rolling out its Automated Deployment Services, which automates the simultaneous installation of Windows 2000 and Windows Server 2003 on multiple servers.

Industry watchers say vendors with cash on hand will speed their time to market with acquisitions instead of building new technology in-house. Yet despite all the buying activity and vendor claims of utility computing gains, industry watchers say it’s too soon to tell if the technology investments will guarantee success.

“Companies like HP, IBM and Sun have deep pockets, and, unfortunately for the small vendors, there are a lot of good bargains right now,” says Glenn O’Donnell, a research director with Meta Group Inc.

“HP added a small piece to its adaptive puzzle, but no one has nailed down utility computing entirely yet. They are all working to get, and stay, ahead of the pack,” he adds.

Forrester Research Inc. and Gartner Inc. in separate reports put HP neck-and-neck with IBM in terms of their utility computing offerings and potential to deliver products. Other companies in the utility-computing competition include BMC, Computer Associates International Inc. and Micromuse Inc., each of which makes network and systems management software. Microsoft’s and Sun’s plans fall further behind the management vendors, in part because Sun’s focus is on managing its servers and Microsoft’s strengths lie primarily in managing its own systems.

Because the utility computing model could require customers to invest in new hardware, software and systems integration, many of them are wary of the technology.

Companies such as Priceline.com Inc. in Norwalk, Conn., see utility computing becoming a reality in their own data centers, just not yet. Priceline CIO Ron Rose uses automated server configuration and provisioning software from start-up BladeLogic. He says this type of automation across heterogeneous networks is the precursor to larger utility-computing plans.

“I believe that a true utility computing model would turn the overall data center into one big data-center operating system,” Rose says. “So the first step is to make multiple servers appear consistent to the management software.”

Andy Davies-Coward, chief creative officer and co-founder of 422, an animation company in Bristol, England, says he couldn’t deploy utility computing across his network.

“A company like 422 cannot afford to buy and deploy this type of computing power in-house,” he says. He’s using HP’s Utility Data Center (UDC) on a test basis provided by HP Labs. “HP provided the client software, but outside of that, we’re not touching the technical part of UDC,” he says. UDC, available as stand-alone software or as a hosted service, provides automated management and resource allocation features but lacks the application intelligence to deliver true utility computing.

Vendors would seem to agree with Priceline’s Rose. IBM, Sun and Veritas each grabbed an automated server provisioning vendor and announced plans to integrate the technology into their offerings.

IBM this week introduced its Project Symphony strategy to deliver utility computing with the help of technology from its Think Dynamics acquisition. That buyout added automated server provisioning software to Big Blue’s Tivoli software division.

This type of technology can be used to manage multi-platform servers with standard, repeatable methods and reduce the number of administrators required to manage servers. In theory, automated server provisioning would let a utility-computing data center roll out a new server if the load on existing servers threatened to negatively affect application performance.
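The load-driven provisioning described above can be sketched in a few lines. This is a minimal, purely illustrative model; all names (`Server`, `ServerPool`, the 80 per cent threshold) are hypothetical and do not reflect any vendor's actual software.

```python
# Hypothetical sketch of threshold-based automated server provisioning.
# All names and the threshold value are illustrative, not a vendor API.
from dataclasses import dataclass, field


@dataclass
class Server:
    name: str
    load: float  # utilization as a fraction, 0.0 to 1.0


@dataclass
class ServerPool:
    servers: list
    threshold: float = 0.8  # provision a new server above this average load
    provisioned: int = field(default=0)

    def average_load(self) -> float:
        return sum(s.load for s in self.servers) / len(self.servers)

    def provision_if_needed(self) -> bool:
        """Roll out a new idle server when the pool threatens app performance."""
        if self.average_load() > self.threshold:
            self.provisioned += 1
            self.servers.append(Server(f"auto-{self.provisioned}", 0.0))
            return True
        return False


pool = ServerPool([Server("web-1", 0.95), Server("web-2", 0.90)])
pool.provision_if_needed()  # average load 0.925 exceeds 0.8, so a server is added
print(len(pool.servers))    # 3
```

In practice the "repeatable methods" the article mentions would replace the `Server` stub with OS imaging and configuration steps; the control loop, though, reduces to this kind of threshold check.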

HP chose to partner with Opsware for this capability, but industry watchers speculate the partnership could develop into another acquisition.

But with its recent acquisition of Talking Blocks, HP says it will address a different utility-computing shortcoming: complex application management. Making sense of and putting to use application performance data collected across data center components remains elusive to most management software makers, Meta’s O’Donnell says. Part of the problem is the complexity built into applications, which vary in the technologies they use from .Net to Java to XML.

“In terms of application development, management is an afterthought, so there is no standard way to collect management data and get a read on the behavior of applications,” O’Donnell says.

Only recently have companies such as Entuity Inc., Mercury Interactive Inc. and Opnet Inc. started to track and record the paths an application takes when fulfilling an end-user request. Because utility-computing vendors propose to help companies first align business applications with the IT infrastructure and manage the network to support and optimize application performance, it is critical that vendors get a handle on how applications use the infrastructure.

For its part, HP says Talking Blocks’ service-oriented architecture technology, which manages the relationships between services and end users, will be integrated into HP’s OpenView management software portfolio. It also will provide network managers with integrated, correlated data from disparate management sources. OpenView then would feed the intelligent application performance management data into HP’s UDC, which could take automated actions to reallocate network and storage resources based on demand.
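The pipeline HP describes (correlate data from disparate management sources, then reallocate resources based on demand) can be illustrated with a small sketch. Everything here is hypothetical: the function names, the metric format and the reallocation rule are assumptions for illustration, not OpenView or UDC behaviour.

```python
# Illustrative sketch (hypothetical names; no real OpenView/UDC API implied):
# merge per-service load readings from several monitoring sources, then shift
# capacity from the least-loaded service to the most-loaded one.

def correlate(sources):
    """Merge metric readings keyed by service name, averaging duplicates."""
    merged = {}
    for readings in sources:
        for service, load in readings.items():
            merged.setdefault(service, []).append(load)
    return {svc: sum(vals) / len(vals) for svc, vals in merged.items()}


def reallocate(capacity, loads, step=1):
    """Move `step` units of capacity from the coolest to the hottest service."""
    hottest = max(loads, key=loads.get)
    coolest = min(loads, key=loads.get)
    if hottest != coolest and capacity[coolest] > step:
        capacity[coolest] -= step
        capacity[hottest] += step
    return capacity


network_view = {"billing": 0.9, "search": 0.3}  # e.g. from a network monitor
app_view = {"billing": 0.8, "search": 0.4}      # e.g. from an app-perf tool
loads = correlate([network_view, app_view])
capacity = reallocate({"billing": 4, "search": 4}, loads)
print(capacity)  # {'billing': 5, 'search': 3}
```

The hard part, as O'Donnell notes earlier in the piece, is not the reallocation arithmetic but getting the disparate sources to report comparable, correlated data in the first place.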

“Talking Blocks could help OpenView collect all this various application information and better assimilate the data and better communicate it with other applications such as UDC,” says Stephen Elliot, an analyst at IDC.

He says HP and the other vendors fighting to gain utility-computing ground will have to wait for a user community interested in these products to develop, and only then could a market leader emerge. Industry watchers and vendors agree it will be another three to five years before complete utility-computing data centers emerge, and it could take as many as 10 years before there is widespread customer adoption.

“The vendors are really battling for an uncertain future,” Elliot says. “But utility computing does have them taking stock of what technology they have and filling gaps where they need to.”
