Avoiding future bloatware a key goal

The IT industry today complains of “bloatware” in database packages, in operating systems, and in runtime systems such as Java. How did this develop?

In the early days, an operating system called IOCS, developed by an affiliate of Exxon Mobil, took 700 bytes and handled all the input/output needs of the day. One of Microsoft’s operating systems today takes 700 million bytes yet does nowhere near a million times what IOCS did. Less than five per cent of Windows’ “bloat” is needed by the vast majority of users. The “bloat” in Microsoft Word is so complex that, apparently, the company has been unable to rewrite the system. “Bloatware” consumes huge, largely unwanted resources and can never be error-free; it is inherently insecure and hacker-friendly. Why, then, does bloatware exist?

In the beginning, computer memory was tiny and developers had to optimise their code (first in machine language, then in assembler, and later in Fortran, Cobol and other languages). When memory became cheaper and more plentiful, most developers grew sloppy and forgot optimisation. Primitive operating systems were developed to act as “traffic cops” for the proliferating languages and applications; these tried to be all things to all people and became burdensome to the industry. One example was a system for handling large databases. Developed by Imperial Oil, it was simple, sophisticated and user-friendly. The code was given to IBM, which, without much planning, grew it into a “do everything for everyone” kludge called MIS.

Later, IBM developed the OS/360 operating system, which was trumpeted as able to run without ‘bombing’. It was so complex that its software was developed in layers, with defined interfaces between them. After its introduction, software changes arrived roughly every two months, demanding major user involvement: at various points users had to recompile every application they had written (and hope it still worked). The root problem, again, was trying to be all things to all people.

The next cycle came with the launch of Sputnik, followed by Landsat and other satellites. Data was streamed from space in one long, continuous bit stream, yielding images of the Earth (or of clouds) containing millions of pixels. These had to be interpreted within a reasonable time frame, so new techniques and algorithms, along with parallel processing, had to be developed. Developers had to be innovative; most of this early software remains undocumented to this day. As the industry adapted to parallel processing and hardware became cheaper, developers still had to be innovative to accomplish complex calculations in real time. As a result, bloatware has tended, almost of necessity, to be less severe in that branch of IT.

The third stage was the introduction of the microcomputer. Again, memory was limited and developers had to be innovative. In fact, Microsoft had its origins in this era, developing DOS. The early versions were atrocious, but the company improved DOS until it became a half-decent operating system. Millions of people still use DOS, and one useful strategy for Microsoft, instead of introducing yet another piece of bloatware, would be to upgrade DOS with the ability to add packages (such as a GUI) that the user actually wants, not what some naïve designer assumes everyone needs.
