A bevy of software and systems vendors unveiled products designed to make more efficient use of network resources, such as servers and storage gear, at Networld + Interop in Las Vegas last month.

All of the vendors tossed around terms like “utility” and “on-demand” to describe their wares, painting a picture of a computing world where network resources and business processes become better aligned automatically, with little human intervention.

The announcements included:

–IBM’s SAN Volume Controller, SAN Integration Server and SAN File System, designed to give a single view of networked storage resources

–IBM Server Technology for WebSphere Application Server, which pools application server resources

–EMC’s PowerPath software that virtually groups separate storage systems together

–Computer Associates’ Unicenter Network and Systems Management 3.1, built to allow companies to better track services and applications

–Hewlett-Packard’s Adaptive Enterprise strategy, incorporating hardware, software and services, which aims to tie the IT infrastructure to business processes

IBM later cemented its focus on “on-demand” computing by picking up Toronto-based Think Dynamics, a builder of automated server provisioning software, for an undisclosed sum.

In the storage market, all of the major vendors are working on virtualization strategies that will allow users to view and manage their separate storage units as if they were one, large resource, said Jamie Gruener, a senior analyst with consulting firm The Yankee Group, in Boston.

“By virtualizing storage, you can now move towards automation and begin to meter usage,” Gruener explained.

The ultimate idea is to create a service-bureau type of environment for internal IT staff, so companies can understand what their IT infrastructure costs and how to charge various business units for its use, Gruener said.
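The virtualization-and-chargeback idea Gruener describes can be sketched in a few lines of code. This is a toy illustration only, assuming nothing about any vendor's actual product or API; all class and method names here are hypothetical. It shows separate storage units pooled behind one interface, with usage metered per business unit so costs can be charged back.

```python
class Volume:
    """A single physical storage unit (e.g., one array or LUN). Illustrative only."""
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0


class VirtualPool:
    """Presents many separate volumes as one large resource and
    meters how much each business unit consumes (hypothetical sketch)."""
    def __init__(self, volumes):
        self.volumes = volumes
        self.usage_by_unit = {}  # business unit -> GB allocated

    @property
    def total_capacity_gb(self):
        return sum(v.capacity_gb for v in self.volumes)

    @property
    def free_gb(self):
        return sum(v.capacity_gb - v.used_gb for v in self.volumes)

    def allocate(self, business_unit, size_gb):
        """Satisfy a request from whichever volumes have room; the
        requester never sees which physical units were used."""
        remaining = size_gb
        for v in self.volumes:
            take = min(v.capacity_gb - v.used_gb, remaining)
            v.used_gb += take
            remaining -= take
            if remaining == 0:
                break
        if remaining > 0:
            raise RuntimeError("pool exhausted")
        self.usage_by_unit[business_unit] = (
            self.usage_by_unit.get(business_unit, 0) + size_gb)

    def chargeback(self, rate_per_gb):
        """Bill each business unit for what it actually consumed."""
        return {unit: gb * rate_per_gb
                for unit, gb in self.usage_by_unit.items()}


pool = VirtualPool([Volume("array-a", 100), Volume("array-b", 200)])
pool.allocate("marketing", 150)  # transparently spans both physical arrays
pool.allocate("finance", 50)
print(pool.total_capacity_gb)              # 300
print(pool.free_gb)                        # 100
print(pool.chargeback(rate_per_gb=0.25))   # per-unit bills
```

The point of the sketch is the separation of concerns: callers allocate against one logical pool, while the pool tracks both physical placement and per-unit consumption, which is what makes metering and chargeback possible.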

However, he noted, “utility” and “on-demand” services are, at the moment, more of a concept than reality.

“A lot of people are still getting their arms around what they have in place and figuring out a road to take,” he said. “I don’t think at a customer level that the utility storage environment has taken off.”

One hurdle vendors will have to overcome in creating a “utility” environment is multi-vendor support, Gruener said.

“It will be all or nothing for many vendors,” he explained. “A lot claim their environments are heterogeneous, but there aren’t really a lot pushing that. Some exceptions are companies like Veritas and Computer Associates that aren’t tied directly to the network infrastructure.”

Joe Schinker, a network engineer at call center provider West in Omaha, Neb., said he needs more information from vendors before he’ll believe the utility computing hype. “My question to vendors is, ‘What is the business model or requirement they have heard from customers or potential customers to bring this offering to market?’” Schinker said.

Other users, though, believe utility computing models have potential. George Fiedorowicz, vice-president of technology operations for Concord EFS, a Memphis, Tenn.-based provider of electronic transaction-processing and funds-transfer services, said utility computing promises a more efficient future for his IT organization, which has to oversubscribe its network at peak demand periods, such as heavy retail seasons.

“Let’s face it, every IT shop is fairly lean these days,” he said. “Anything that can help you provide better [quality of service] is worth examining.”

– With files from IDG News Service