If controversial IT pundit Nick Carr is right, there may be little IT left to manage in a few decades — thanks to the notion of utility computing.
Carr, the former executive editor of the Harvard Business Review, recently predicted that all computer technology, except the devices we physically interact with, such as keyboards, monitors and printers, will reside outside the corporate walls and be sold to the enterprise as a pay-as-you-go utility. His comments appeared in an MIT Sloan Management Review article entitled “The End of Corporate Computing.”
Other industry observers, however, say there are still many hurdles to overcome before this becomes reality.
In Carr’s model, IT utility providers (Carr suggests companies as diverse as IBM Corp. and Google Inc. could fill this role) will be able to offer complete packages at a fraction of traditional costs because of their enormous purchasing power. All network, storage, security, backup and application services — from word processing to ERP — would be owned and managed by the utility. Companies using this model would own only the data residing on the utility providers’ systems.
Carr isn’t dissuaded by the argument that IT is too complicated for the utility model.
In his MIT Sloan Management Review article Carr draws a parallel between utility computing and electrical generation, where organizations access power from a centrally managed supplier and pay only for what they use.
“The fact that IT is more complex or has more layers (than electricity) means a slower transformation into a utility model,” he said. “But on the other hand, in some ways, IT actually is more amenable to full utility supply than electricity. With electricity, only the generation function can be centralized.…With IT everything can be centralized.”
The first tentative steps toward this model are already being taken. Sun Microsystems Inc., for example, charges customers US$1 per CPU per hour to process data at its Sun Grid Centers.
Tundra Semiconductor Corp., an Ottawa-based chip developer, buys pay-as-you-go CPU cycles to run simulations during its chip development phase. The cycles are purchased from Ottawa-based Gridway Computing Corp. “The Gridway model gives us access to [the power] we need,” said Randy Mullin, Tundra’s director of engineering. In the past, Tundra’s own computer grid ran all of its simulation tests. Buying CPU cycles only when needed is much cheaper, he said.
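The economics Mullin describes can be roughed out as a break-even calculation. The US$1-per-CPU-hour figure is Sun’s published Sun Grid rate; every other number below (hardware price, useful life, operating cost, utilization) is a hypothetical assumption for illustration, not a figure from Tundra, Gridway or Sun:

```python
# Hypothetical break-even sketch: pay-as-you-go CPU cycles vs. owning machines.
# The US$1/CPU-hour rate is Sun's published Sun Grid price; all other figures
# here are invented assumptions for illustration only.

UTILITY_RATE = 1.00  # US$ per CPU-hour (Sun Grid's published price)

def owned_cost_per_cpu_hour(purchase_price, useful_life_years,
                            annual_ops_cost, utilization):
    """Effective cost of one useful CPU-hour on machines you own.

    utilization is the fraction of total hours the CPU does real work;
    idle time still costs money, which is what makes bursty workloads
    like chip simulation attractive for the utility model.
    """
    total_hours = useful_life_years * 365 * 24
    total_cost = purchase_price + annual_ops_cost * useful_life_years
    return total_cost / (total_hours * utilization)

# Assumed numbers: a $4,000 server, 3-year life, $500/year power and admin.
low_util = owned_cost_per_cpu_hour(4000, 3, 500, utilization=0.10)
high_util = owned_cost_per_cpu_hour(4000, 3, 500, utilization=0.80)

print(f"owned, 10% utilized: ${low_util:.2f}/CPU-hour")
print(f"owned, 80% utilized: ${high_util:.2f}/CPU-hour")
print(f"utility rate:        ${UTILITY_RATE:.2f}/CPU-hour")
```

Under these assumed numbers, a lightly used machine costs more per useful CPU-hour than the utility rate, while a heavily used one costs far less — consistent with Tundra buying cycles for occasional simulation bursts while owning the machines for its everyday computing.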
Mullin said Gridway has proven it can provide the quality of service he demands. “They have always been there when we needed them,” he said.

Phillip Chung admits the idea of moving critical data off-site was “kind of scary.” The CIO of Montreal-based Madacy Entertainment LP is using an Oracle Corp.-hosted financial application. But having his data reside outside Madacy’s four walls is not a concern because Oracle’s Austin, Tex.-based data centre (where Madacy’s data resides) is SAS 70 certified, he said.
SAS 70 is an internationally recognized standard for auditing the controls of service organizations, and that is enough for Chung to have confidence in Oracle’s security and reliability, he said.
But Carr admits that, at the enterprise level, it will be more difficult to convince companies that the utility model is as reliable and secure as traditional in-house solutions, and that this is more of a barrier to utility adoption than some of the technical hurdles that need to be overcome. “The onus [will be] on the supplier to prove that it warrants that trust.”
Though Carr envisions utility service providers using their purchasing power to create cheap IT solutions (everything from fully managed ERP systems to basic office applications such as word processing, he said), the question remains whether they can compete at the enterprise level.
“With bigger companies, the benefits of moving to a utility model only start to arrive when the utility has greater scale economies than your own internal operations can provide,” Carr said. “So if you are a really big company, it is going to take some time before that happens.”
Today, even smaller customers find utility costs high.
For Tundra’s everyday computing needs, it is more cost-effective for the company to own the machines, Mullin said. Though Tundra might move to a utility model for all its computing in the future, costs would have to drop significantly first, he said.
Another stumbling block is figuring out how to create generic yet customizable applications to offer as a utility.
“[Utility models] really depend on a lot of different companies accessing essentially the same application,” said Gordon Haff, senior analyst with Nashua, N.H.-based Illuminata Inc. If an IT utility were to customize an application so that a client could get market differentiation from its use, it would be no different than today’s costly outsourcing arrangements and thus offer no incentive to change to a utility model, he said.
Carr agrees that application customization is a hurdle, but says it can be overcome through further advances in Web services. Suppliers will need to create user interfaces so customers can easily customize utility-supplied components, he said. “You take cheap utility supply technology and link it to your own individualized processes [to] get the competitive advantage,” he said. “Companies will increasingly be able to do it themselves rather than [calling] in experts all the time,” he predicted.
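One way to read Carr’s point is that the utility-supplied component stays identical for every customer, while each company layers its own differentiating process on top. A minimal sketch of that pattern (every name and business rule here is hypothetical, invented purely for illustration):

```python
# Hypothetical sketch of "link cheap utility supply technology to your own
# individualized processes." The first function stands in for a generic
# utility-hosted component shared by all customers; the second is one
# company's in-house layer on top of it. All names and rules are invented.

def utility_price_quote(items):
    """Generic utility-supplied pricing component: same for every customer."""
    return sum(qty * unit_price for qty, unit_price in items)

def customized_quote(items, loyalty_discount=0.05):
    """Company-specific layer: individualized rules applied on top of the
    unchanged utility component, where differentiation would live."""
    base = utility_price_quote(items)               # the shared, generic part
    return round(base * (1 - loyalty_discount), 2)  # the differentiating part

order = [(3, 10.00), (1, 25.00)]  # (quantity, unit price) line items
print(utility_price_quote(order))  # generic result, identical for all customers
print(customized_quote(order))     # one company's individualized result
```

The design point, per Haff’s objection, is that the utility never modifies its component per customer; any differentiation happens in the thin company-owned layer, which is what keeps the shared service cheap.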
Carr says traditional IT heavyweights like Hewlett-Packard Co. and IBM are good bets to be early players in the market. “Their strength is that they have expertise in setting up big computing operations,” he said. “It is more of a hunch than pure analysis, (but) I’d say it’d be about 20 years before we get to the point we can say this looks like a mature model for utility supply…and we [see] large companies shift over almost entirely to that model.”