Can IT be delivered with a monthly utility bill?

It’s the end of corporate computing as we know it.

So says Nicholas G. Carr, the Harvard Business Review scribe who two years ago asserted that IT Doesn’t Matter. That point of view, which suggested there’s no competitive business advantage to be gained from information technology investment, ignited a firestorm, particularly among vendors that took great exception to the debunking of what most had positioned as a key value proposition.


Last week, Mr. Carr added to the IT Doesn’t Matter discussion by postulating that businesses will soon discover it makes no sense to own and manage computing capability. In a nine-page discussion document called The End of Corporate Computing, Mr. Carr contends IT has reached a stage where it will shift from a built and owned corporate function to a service delivered by a utility provider.

The IT community isn’t likely to react as vociferously to Mr. Carr’s statements this time, since the notion of the pay-as-you-go IT utility has been around for quite a while. But perhaps it should maintain a bit of skepticism, all the same.

It’s a question of time – Mr. Carr suggests we’re apt to see the IT utility sooner rather than later – and of ownership, where he believes few, if any, companies will choose to own and maintain their own information technology infrastructure.

In his discussion paper, Mr. Carr describes IT as a “general purpose technology.” Using the analogy of electricity when it was poised to become a utility service, he points out that the first electrical utility suppliers, which emerged in the late 1800s, were limited in their capacity. Electrical utility customers back then were those who could afford what was a relatively high-cost service, and large industries that required huge amounts of power were apt to build their own generating stations. But over time and through technological development, utilities vastly increased their generating capacity and drove down the price of electricity. A key tipping point came in the early 1900s, according to Mr. Carr, when utility companies’ share of electricity production in the U.S. rose from 40 per cent in 1907 to 70 per cent by 1920.

He sees the same thing happening to data networks and computing power. Mr. Carr explains that companies currently purchase from IT vendors a wide range of products needed to build “complex information processing plants” or data centres housed within their own corporate walls. The situation is similar to that of many large companies in the late 1800s, which built their own electrical power plants to keep their operations running. Businesses today hire their own people to maintain information technology, and periodically bring in specialists to solve complex problems – again, much as the pioneers of the power grid did.

The problem back then and now is the inherent wastefulness of this approach. First, there’s a tendency to overbuild. Second, much of the basic function and staffing of IT is similar across most companies, so it’s easy for a third party to handle it economically, Mr. Carr says.

“When overcapacity is combined with redundant functionality, the conditions are ripe for a shift to centralized supply,” he suggests.

Mr. Carr isn’t so far off the beaten track this time, in my view, and many would agree in principle with his prognostication. The concept of computing sold as a utility has been around since the days of “time sharing” of mainframe processing cycles some 40 years ago. Plus, the IT service community has been promoting many forms of commercial utility computing for at least five years – voice over IP, grid computing, software as a service and Web services, among others, are all examples of how computing can be delivered in utility form. IT outsourcing, one of the industry’s hottest trends, has grown from the very idea that IT service providers can achieve economies of scale through the efficient delivery of ubiquitous computing and IT-enabled process functions, reducing the cost to customers.

That said, there are several points that Mr. Carr does not appear to fully consider or flesh out in his dissertation. It might be argued that the author takes too simplistic a view – that the evolution of the electrical power grid, which happened some 100 years ago, cannot reasonably be compared to the evolution of IT in the modern world. Many radical business and technology changes have taken place in the past century, and Mr. Carr may not have given this factor enough consideration in making his comparisons.

More importantly, data and computing power are a whole lot different from electricity. Don Tapscott made the point during the Rumble in the IT Jungle debate with Mr. Carr at Toronto’s CN Tower last week. Data is complex, especially the digital kind – it’s fundamentally information in many forms (voice, video and data), transferred in “packets” of many different sizes. That creates a whole lot of performance, quality-assurance and delivery issues that power utilities don’t have to deal with. For example, a power brownout might be an inconvenience or the customer might not even notice it, whereas things can go badly awry for a business if packets get lost or misrouted, if specific types of data aren’t given priority on a network, and so on.

In short, Mr. Carr hasn’t addressed what is likely the biggest obstacle to creating an IT utility: managing all that data as it traverses a multitude of cyber pathways between potentially millions of unknown sources and destinations. Knowing where data goes and how it gets there, and ensuring that latency, security and quality of service are handled along the way, adds a huge layer of complexity. And the current state of distributed processing is poor from a management perspective – that is to say, the tools, processes and techniques used for IT infrastructure management are basic and not widely adopted by businesses. Mr. Carr is envisioning a utility that would demand a highly managed environment, with intelligence well beyond what’s typically in use today. It simply doesn’t exist right now, and it will take a massive amount of effort and investment to build.

Still, Mr. Carr’s ideas are interesting and worthy of consideration. IT at large can often appear to be a gated community of vendors and professionals seeking to maintain the status quo in the interests of self-preservation. To his credit, Mr. Carr challenges computing convention, even if the reality of what he believes is a lot more complicated than he’s making it sound.

– This article appeared in The Globe and Mail on April 21, 2005.
