A costly affair

Opinion

Computerworld (U.S.)’s Frank Hayes recently wrote a column chastising IT for being complacent about bugs in software, despite the fact that bugs cost U.S. users about US$60 billion per year. If you haven’t already read it, shame on you. Get out your Nov. 15 copy of Computerworld Canada or go online and check out Frank’s column, “Don’t shrug off bugs – they cost in terms of money, risk and downtime.” I couldn’t agree more with him, but this topic calls for one additional observation that is too often missing from the debate.

First, like Frank, I am appalled at how we tend to shrug off software bugs, especially when there are so many methods of prevention at our disposal.

We have self-policing languages like Java, which automatically handles garbage collection, guards against buffer overruns and more. Combine those features with Java’s exception handling, and you end up with the equivalent of a built-in runtime debugger. All programmers have to do to exploit this capability is make it a habit to catch exceptions and report the errors. Mix in a beta program to expose those errors, and you’re likely to end up with a program that is relatively bug-free and, just as important, resistant to malicious attacks.
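
To make that habit concrete, here is a minimal sketch in Java. The class, method and file names are hypothetical, invented purely for illustration; the point is simply that a failure surfaces as a catchable, reportable exception rather than as silent memory corruption:

    // A hypothetical catch-and-report routine. The class and file names
    // are invented for illustration, not taken from any real program.
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class ConfigLoader {
        public static String readFirstLine(String path) {
            try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
                return reader.readLine();
            } catch (IOException e) {
                // Report the failure instead of swallowing it; in practice this
                // might feed a log file or a beta program's crash reporter.
                System.err.println("Failed to read " + path + ": " + e.getMessage());
                return null;
            }
        }

        public static void main(String[] args) {
            String line = readFirstLine("settings.cfg"); // hypothetical file
            System.out.println(line == null ? "no configuration found" : line);
        }
    }

Route that error report into your beta program’s feedback channel, and every tester becomes part of the debugging effort.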

None of this is rocket science, and none of it is a particularly expensive addition to the development cycle.

If you don’t happen to like Java, that’s no excuse. There are plenty of other languages and runtime environments with similar features.

If you have no choice but to use error-prone languages such as C or C++, there are plenty of development tools that will sniff out the buffer overruns, pointer errors and other common programming mistakes behind most application bugs. And, of course, give your applications to the most brain-dead users you can find, and test, test and test again.

That brings us to the one remaining obstacle to stable client software: the unpleasant problem nobody likes to address. I’ll give you a tip on how to track it down. Sit down at a Nintendo Co. Ltd. GameCube or a Sony Corp. PlayStation 2 and play some games from start to finish. Then do the same on a PC. Chances are, you finished the console games without encountering any quirks, bugs or crashes. At most, you might have been able to exploit a programming bug to cheat at the games.

In sharp contrast, you probably encountered your first problem with the PC games when the installer complained that your version of DirectX was out of date. (DirectX is the Microsoft Corp. graphics API designed mostly for PC games.)

Assuming you had enough CPU horsepower and memory to make the game enjoyable once it was installed, the game probably crashed at least once, if not several times, before you were done.

Console games are more stable because a game console is a highly predictable platform with a stable API. If you can find any differences between the hardware or software in two PlayStations or GameCubes, the differences will be subtle and unlikely to affect the way a program behaves.

Pick any two PCs, however, and they are likely to have radically different display cards and drivers, different DirectX APIs or different versions of the operating system. They probably won’t even have the same chip sets on the motherboard.

Replace the PC with a console (a.k.a. a network appliance or network computer) and you create a predictable platform for software developers, which should result in much more stable, not to mention more secure, software. Network computing fizzled for a number of reasons the first time around, some of them good, some bad. For one thing, once Larry Ellison’s low price tag was imprinted on everyone’s brain, there was no way to build a network computer fast enough to run Java well, or to sell one at a profit. One very bad reason network computing failed is that we have such an irrational love affair with the PC that we tolerate its unstable and insecure design.

I think it’s been long enough since the network computer’s initial failure that we can revive and rethink the concept. There may be no other way to recover some of that US$60 billion we lose on bugs each year.

Petreley is a Computerworld (U.S.) columnist, a computer consultant and author in Hayward, Calif. He can be reached at [email protected].
