Shared computing matures

Periodically there is a brief but concerted buzz about shared computing, though it invariably fades as events and developments push it out of the limelight. After Sept. 11 brought heightened security concerns to the forefront, for example, interest in shared computing once again faded.

For years computational power has increased at phenomenal rates even as prices have fallen, and with it the need for shared computing has become less pressing.

But with a sluggish economy and companies looking for any way to increase return on investment, the time may be ripe for grid and utility computing to move into the mainstream.

“The old way was to build these large, fixed-cost infrastructures,” said Scott Penberthy, vice-president of business development with IBM Global Services in Somers, N.Y. “Customers are shifting and saying that they have to take the cost down.”

“I think in these cost-conscious times organizations are looking at [grid computing],” agreed Ron Silliman, senior analyst with Gartner Dataquest in Paoli, Penn.

PricewaterhouseCoopers recently released a technology forecast for 2002-2004 that deems grid and utility computing environments ripe for growth.

Of the shared computing possibilities, “grid has probably got the most promise,” said Terry Retter, director of strategic technology services at PricewaterhouseCoopers’ Technology Center in Menlo Park, Calif. But he added a caveat: for grid to move beyond the educational and scientific arena where it is used today, it will take a seminal development similar to what the browser did for Internet use, he said. “Otherwise [growth] is going to be linear and slow.”

The problem, Retter said, is that the tools and applications necessary to manage data across multiple domains are not as developed as the hardware.

One of the reasons shared computing has been successful in the scientific community is that, for the most part, scientific organizations have the skill set and the manpower to use brute force to manage resources across disparate domains.

Business world very different

This strategy does not carry over into the corporate world. “Business people don’t want to be technicians,” Retter said.

But the tide is changing. Industries that need extensive computational power without the associated cost are turning to shared computing to solve their problems. The automotive industry, which uses grid solutions to help solve complex engineering problems, and the biotech/pharmaceutical industry, which uses grids for computationally intensive protein modelling, are at the forefront of adoption.

Shared computing is on the threshold of expansion outside of these industries because some of the issues in resource management and in application distribution across multiple devices are being resolved, Retter said.

Sun Microsystems Inc. has technology aimed at solving some of these problems. The Sun ONE Grid Engine is designed to manage everything from bandwidth and memory to disk space across distributed environments. “Now you can glue together thousands and thousands of resources,” said Wolfgang Gentzsch, senior director of grid computing with Sun in Menlo Park, Calif.

Experts agree that grid computing growth will start within corporations, and most likely within specific divisions. Grids are much easier to manage when the entire structure remains inside a corporate firewall.

But there are still many hurdles to overcome.

One is what Gentzsch calls a “mental firewall.” Executives have to overcome the notion that data has to be constrained to be safe.

Retter said there is also a need for application vendors, specifically those in Web services, to develop with shared resources in mind.

“If the applications can be broken into smaller components and then managed across more complex technological environments, then I can start to see how [it can] provide better computing resources to solve real business problems,” he said.

In addition, “as you move down that path you are going to need better security, resource management and better administrative controls [which are] able to rapidly reallocate resources.”

Gentzsch said there will also be huge demand for better virtualization and resource-provisioning technology to make grids more manageable and reduce the number of people needed to operate them.

Silliman sees still another hurdle. Once computing moves out of the corporation and involves multiple vendors and technologies in a shared environment, there may be a tendency to play the blame game when things go wrong.

“I think that is where we are now. But utility computing is actually a step in the direction of maturity, moving away from the blame game,” he explained. The industry is still several years away from the necessary maturity in terms of openness and standards for all of this to fully succeed, he added.

All the experts agreed that the computing-as-a-utility model will take off only once the problems around grid computing have been solved.

“Computing as a utility model is one of those things that has been talked about for 25 years,” Silliman said. But it is getting to the point where it may take off, he added.
