Plug-and-play progressing

With ‘plug-and-play’ hardware, peripherals are linked seamlessly to PCs by software that recognizes the new device and integrates it with the hardware already attached. Plug-and-play software, on the other hand, is unlikely to reach that stage using today’s approach to building applications, though some progress is being made, and plug-and-play application software in some form will soon appear on the market.

One approach is IBM’s “On Demand” strategy. In theory it resembles the ‘just-in-time’ inventory systems so loved by manufacturers.

In on-demand software operations, a user does not buy software packages; instead, the application software is transmitted over a network when needed, and the user pays rent for its use. The problem with current on-demand solutions is the size of applications, which demands excessive network bandwidth.
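The on-demand model can be sketched as lazy loading: a module is fetched over the network only the first time it is requested, then cached for the rest of the session. This is a minimal illustration, not any IBM product; `fetch_from_network` and the module names are invented for the example.

```python
# Hedged sketch of on-demand software delivery: modules are fetched
# over the network only when first requested, then cached locally.
# fetch_from_network and the module names are hypothetical.

class OnDemandLoader:
    def __init__(self, fetch):
        self.fetch = fetch      # callable that retrieves module code
        self.cache = {}         # modules already downloaded this session

    def load(self, name):
        if name not in self.cache:          # pay the bandwidth cost only once
            self.cache[name] = self.fetch(name)
        return self.cache[name]

def fetch_from_network(name):
    # Stand-in for a network transfer; a real system would also meter
    # usage here so the user pays rent per invocation.
    return f"<executable code for {name}>"

loader = OnDemandLoader(fetch_from_network)
spreadsheet = loader.load("spreadsheet")        # fetched
spreadsheet_again = loader.load("spreadsheet")  # served from cache
```

The cache is what keeps the bandwidth problem from compounding: each application crosses the network at most once per session, which is exactly where the size of today's applications hurts.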

While the elimination of operating systems is on the horizon, they will be around for a few more years.

An examination of one Windows OS shows 598,699,867 bytes of drive space occupied before a single application is run. When you consider that most users need less than five per cent of an OS’s functionality, this represents an obscene waste of resources.
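Using the article's own figures, a quick calculation puts the waste in concrete terms:

```python
# The article's numbers: total OS footprint and the fraction of
# functionality most users actually need.
os_footprint = 598_699_867   # bytes occupied before any application runs
used_fraction = 0.05         # less than five per cent, per the article

used_bytes = os_footprint * used_fraction
wasted_bytes = os_footprint - used_bytes

print(f"Used:   {used_bytes / 1e6:.0f} MB")
print(f"Wasted: {wasted_bytes / 1e6:.0f} MB")
```

Roughly 30 MB of the installation does useful work for a typical user; well over half a gigabyte sits idle.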

Controlling such a huge amount of code is difficult and requires several interface layers arranged in a hierarchy. It is impossible to guarantee that so much code is free of errors, so there are many places where security can be breached, as evidenced by the $46 billion annual cost of worms, viruses, hackers, identity thieves and spammers.

In 1995, ComputerWorld Canada reported that Microsoft Corp. was about to change its programming and ease the legacy burden by developing what was then known as Intentional Programming (IP). IP involved the construction of a large number of reusable elements.

Microsoft offered IP as something new, but the technology behind it had been developed at the University of Manitoba about thirty years earlier. Microsoft’s IP-enabled language remained operating-system oriented, whereas the Manitoba development eliminated conventional programming languages altogether, along with the need for operating systems. A few years later Robin Bloor, a U.K. guru, showed how the concepts of IP could be realized with the Manitoba pioneering work.

Plug-and-play software cannot be developed with current software fragmentation, which has been caused by an adherence to the von Neumann philosophy of computing.

The Turing approach, which has been proven successful, is ideal for plug-and-play. Turing’s model involves the creation of ‘states of mind’, essentially reusable modules (so IBM is on the right track but heading in the wrong direction), which can be plugged together to build any application desired.
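A rough sketch of the ‘states of mind’ idea: each state is a small reusable module that examines the current symbol, produces an output, and hands control to another state. Plugging different state tables together yields different applications. The states and rules below are invented for illustration only.

```python
# Hedged sketch: each 'state of mind' is a reusable module that maps the
# current symbol to (output, next_state). Different state tables plugged
# into the same runner produce different machines.

def make_machine(states, start):
    """Build an application by plugging reusable states together."""
    def run(tape):
        state, out = start, []
        for symbol in tape:
            action, state = states[state](symbol)
            out.append(action)
        return out
    return run

# Two tiny reusable states: one uppercases, one lowercases, and each
# hands control to the other, giving alternating behaviour.
states = {
    "upper": lambda s: (s.upper(), "lower"),
    "lower": lambda s: (s.lower(), "upper"),
}

alternator = make_machine(states, "upper")
result = "".join(alternator("turing"))   # alternates case symbol by symbol
```

The runner never changes; only the plugged-in table of states does, which is the sense in which such modules can be combined to build any application desired.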

Recent developments of the Turing concept have gone even further, replacing reusable code modules with a numeric structure. The numeric approach can eliminate most of the virus, hacker and worm intrusions plaguing the industry.
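One way to picture the numeric approach, sketched under assumptions of my own (the encoding below is invented, not the Manitoba design): behaviour is expressed as a pure table of numbers mapping (state, input) to (output, next state). Because the table contains no executable code, there is nothing for injected code to attach itself to.

```python
# Hypothetical sketch: behaviour as a numeric transition table instead
# of executable modules. This example computes the running parity of a
# bit stream; the table layout is invented for illustration.

N_INPUTS = 2
# Row index = state * N_INPUTS + input; value = (output, next_state).
TABLE = [
    (0, 0),  # state 0 (even parity so far), input 0
    (1, 1),  # state 0, input 1 -> parity flips to odd
    (1, 1),  # state 1 (odd parity so far), input 0
    (0, 0),  # state 1, input 1 -> parity flips back to even
]

def run(table, bits, state=0):
    out = []
    for b in bits:
        output, state = table[state * N_INPUTS + b]
        out.append(output)
    return out

parity = run(TABLE, [1, 0, 1, 1])
```

The interpreter is fixed and tiny; everything application-specific lives in inert data, which is the property that would deny viruses and worms a foothold.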

Hodson is a theoretical physicist, speaker and writer. He can be reached at
