Big Blue presses ahead with predictive computing


While several major players, including HP and Microsoft, have been dedicating significant funds and resources to promoting autonomic computing, IBM is definitely ahead of the pack.

Big Blue has been researching autonomic systems the longest and hardest, opting for a more holistic approach.

“IBM is also a major player with its Tivoli product line. If you have a network management environment that requires fewer human resources to keep it up and running, that becomes a significant purchase criterion for potential adoption,” says Levy.

In the evolution of intelligent IT systems, Big Blue’s ultimate objective is to make computers invisible, says Marin Litoiu, senior researcher at IBM Canada’s Centre for Advanced Studies (CAS) in Toronto. “We want them to hide behind the wall, like electricity, and try to make the user see the computing, not the computer,” he says.

But many elements are needed to turn dumb systems into intelligent ones, and evolution in this area has been slow and incremental because all the elements must work in concert. A revolutionary breakthrough in any one element, say processor speed, would be like an amoeba suddenly sprouting an eye: to take advantage of the new capability, the other elements, such as hardware and software, must catch up so the computer organism can evolve as a whole.

“Before the technology, you need science, but we don’t have the full mathematical apparatus yet to make systems smarter,” says Litoiu. Different mathematical models are needed depending on what you want the system to do, be it self-diagnosis, self-repair, or capacity planning. Fuzzy logic, chaos theory, regression analysis, what-if scenarios and so on are all brought to bear on problems.

Litoiu says IBM’s new breed of systems will have more predictive capabilities in the near future. Systems will be able to anticipate what will happen a day or so in advance, using mathematical models similar to those used to predict the trajectory of hurricanes.

These models draw not just on historical patterns but also on the present, using a set of equations to describe the current environment and extrapolate from it. “Just like weather systems, the longer the predictive interval, the less accurate you are,” he says. “But in most cases, short-term predictions can be accurate, and this forward-looking capability will make computers more successful and attractive.”
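As a rough illustration of this kind of short-horizon extrapolation (a simplified sketch, not IBM’s actual models), a system could fit a linear trend to its most recent load samples and project that trend a few steps forward; the data and function names here are hypothetical.

```python
# Hedged sketch: short-horizon capacity prediction by fitting a
# least-squares linear trend to recent observations and extrapolating.
# All numbers and names are illustrative, not from any IBM product.

def fit_linear_trend(samples):
    """Least-squares fit y = a*t + b over samples taken at t = 0..n-1."""
    n = len(samples)
    mean_t = (n - 1) / 2.0
    mean_y = sum(samples) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in enumerate(samples))
    den = sum((t - mean_t) ** 2 for t in range(n))
    a = num / den
    b = mean_y - a * mean_t
    return a, b

def predict(samples, horizon):
    """Extrapolate the fitted trend `horizon` steps past the last sample."""
    a, b = fit_linear_trend(samples)
    t_future = len(samples) - 1 + horizon
    return a * t_future + b

# Hourly CPU utilisation (%), trending upward -- illustrative data.
history = [40, 42, 45, 47, 50, 52]
print(round(predict(history, 3), 1))  # projected load three hours ahead
```

As Litoiu notes for weather models, the further the horizon extends, the less the straight-line assumption holds, so a real system would widen its confidence interval with the prediction distance.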

Litoiu points out that all engineering fields have gone through this process of developing self-managing systems, and ironically enough, computers are typically used to achieve that. “Stability controls in automobiles use computers. But we haven’t looked into computers automating themselves until recently, because we now have a critical mass of computers that are becoming hard to manage.”

Mathematical models are also used in aviation for self-management in different scenarios. During a bout of turbulence, for example, the flaps on an aircraft’s wings are set to move up and down at a particular speed and angle to compensate.
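The feedback idea behind such self-managing behaviour can be sketched, under heavily simplified assumptions, as a proportional control loop: measure the error against a target, then nudge the actuator by a fraction of that error each cycle. The gain, units, and names below are illustrative, not drawn from any real flight-control system.

```python
# Minimal sketch of a proportional control loop: each cycle moves the
# controlled value a fixed fraction of the way toward its setpoint.
# Gain and units are illustrative assumptions.

def control_step(current, setpoint, gain=0.5):
    """Return the adjusted value after one proportional-control step."""
    error = setpoint - current
    return current + gain * error

angle = 0.0          # current flap angle in degrees (hypothetical)
target = 10.0        # desired flap angle
for _ in range(8):   # with gain 0.5, each step halves the remaining error
    angle = control_step(angle, target)
print(round(angle, 2))  # -> 9.96, close to the 10-degree target
```

Real controllers add integral and derivative terms and safety limits, but the core loop of sensing, comparing, and correcting is the same pattern autonomic systems apply to workloads instead of wing surfaces.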

But Levy believes comparisons of the computer sector to aviation are limited.

He points out only a few thousand aircraft are sold every year by a small number of manufacturers, the standards in cockpit design are well-established, and most other components are common in all models and controllable. Building autonomic capabilities is easier in this context. “Aviation is a micro-world run by a group of vendors who run the whole show. There are no open standards or development ecosystems, and it’s all done on a relatively small scale.”

In marked contrast, millions of desktops and network operating systems ship every year and there are thousands of vendors, so it is more difficult to impose standards on this immature and heterogeneous sector.

“The will is there but it will take more time for the world to catch up,” says Levy. “Standards, software design, hardware, processors – everything must mature to make real autonomic computer systems possible. You need a universally strong foundation, or it’s ultimately doomed to fail.”

Read Part 1 of this article: The ghost in the machine – the promise of autonomic computing

