Dan McLean: IT utility concept exciting but still in nascent stage

Published: May 16th, 2003

By Dan McLean

Few services concepts in recent history have received as much attention as what’s often referred to as the IT utility.

There’s good reason to get excited. The idea of computing on demand, computing on tap, the adaptive infrastructure, and many of the other emerging utility computing concepts that are appearing in the market hold the promise of a widely available computing architecture that could be utilized by any business or organization.

A “public” IT utility, which looks and operates like a hydroelectric utility, for example, would offer that always-on, highly reliable, plug-in infrastructure to anyone who needs it, whenever and wherever it’s needed. You’d pay only for what you used, and the computing services would be available virtually anywhere, at any time. In the case of an IT utility, you might buy highly varied offerings, which would provide not only varying computing and communication functions, but potentially on-demand application and business process capabilities as well.

This sort of widely available public utility would be a highly integrated and seamless resource, so that the service you’d buy from one provider would be completely interoperable and seamlessly connected across every platform in the public utility. It would be a true wide-spanning, end-to-end, common architecture and environment, designed to deliver IT power, as it were.

The truth about much of what’s being envisioned for a computing utility today is that this IT nirvana is largely a vision, and one that is a long way from becoming reality. The computing utility of 2003 exists in various forms and iterations, but as nothing that comes close to the seamless public architectures most people associate with other utilities and would, no doubt, expect of the ultimate IT version.

In recent years, utility forms of computing have been seen in concepts like hosting, where various providers offer customers pay-per-use supporting IT infrastructures for applications such as e-business/e-commerce, storage, business continuity or more basic computing processing capability.

Hosting offers some utility-like features, particularly around pricing and delivery of services. From a platform perspective, however, hosting services are platform-unique, with little or no interoperability or integrated function among the hundreds of hosting companies in Canada that offer such services.

However, the pricing model for hosting services can be based on pay-per-use or payment for varying “flavours” of services. In the latter case, a customer builds a somewhat tailored hosting service from a set of available standard functions and capabilities that best fits business demands. The hosting services themselves are network-accessed and delivered to customers and their customers. Unlike the model of highly customized outsourcing, there’s typically no transference of assets or staff.

More recently, many IT service providers have begun offering on-demand types of IT infrastructure services. Again, the “utility” aspect of these offerings typically lies in how customers are billed – the use of a pay-per-use formula rather than a traditional outsourcing contract, which predefines a fixed term for both the types of services being delivered and the service levels themselves.
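To make the billing distinction concrete, here is a minimal sketch of the two pricing models; all rates and usage figures are invented for illustration and don’t come from any actual provider’s price list:

```python
# Hypothetical comparison of pay-per-use billing versus a
# fixed-term outsourcing fee. All numbers here are invented.

def pay_per_use_bill(units_consumed: float, rate_per_unit: float) -> float:
    """Utility-style billing: pay only for what you actually used."""
    return units_consumed * rate_per_unit

def fixed_term_bill(monthly_fee: float, months: int) -> float:
    """Traditional outsourcing: a predefined fee, regardless of usage."""
    return monthly_fee * months

# A light-usage month costs less under pay-per-use than under a
# fixed contract; a heavy-usage month can cost more.
light = pay_per_use_bill(120, 0.50)   # 120 units at $0.50 -> 60.0
heavy = pay_per_use_bill(900, 0.50)   # 900 units at $0.50 -> 450.0
fixed = fixed_term_bill(200, 1)       # flat $200 per month -> 200
```

The point of the model is simply that cost tracks consumption, which is why the column calls the billing side of these offerings the most “utility-like” part.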

But these on-demand services are private rather than public services, so an offering you’d buy from, say, HP would not extend beyond the platforms owned and managed by HP. In fact, many on-demand services are simply outsourcing with a different pricing structure. There’s no seamless computing infrastructure jointly built by multiple vendors and utilized by all these service providers. Experts agree this type of utility platform may be at least seven to 10 years in the future.

But the technologies that will provide the building blocks for the computing utility are being introduced and developed today, and in some cases are actually being put in place. The most familiar utility-enabling technology is multiservice IP communications, which is slowly but surely being integrated throughout every private and public network. IP is currently seen as the only technology for voice, data and, eventually, video transmission across communication networks.

Much less advanced, but equally important, is the notion of Web services – the “middleware” fabric designed to create interoperability between disparate applications and processes. Web services are at an extremely early stage of development, and much work remains to be done, particularly in terms of agreed-upon standards and cooperative development.

Then there’s the management fabric that represents the necessary intelligence in networks and systems, which would provide things such as performance throttling capability, self-healing reliability and essential security. Work continues around smart-agent technology, for example, which involves intelligent functionality being built into each and every network-connected device.

These agents not only collect various types of traffic information, but also make intelligent decisions that direct “behaviour” in networks, systems, applications and processes. Greater automation of management functions is also a necessary goal, and the trend continues towards networks and systems management environments with proactive and heuristic capability that would direct devices to react in specifically defined circumstances with little or no operator intervention.

The concepts mentioned above are but a few of the innovations driving computing towards the utility model. The idea is among the most ambitious ever conceived, and the notion holds tremendous interest and potential. A true computing utility is evolving, but it isn’t here yet. The fact is, it may not get here for a long, long time.