Code quality to take a hit

As if the year 2000 problem won’t be enough of a Pandora’s box, here’s another one for IT organizations: software developed by user companies next year could be buggier than ever.
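For readers unfamiliar with the underlying defect, a minimal sketch (not from the article) of the classic two-digit-year bug that Y2K remediation teams were fixing; the function name is illustrative only:

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Date arithmetic on two-digit years, as many legacy systems stored them."""
    return end_yy - start_yy

# A record opened in 1998 ("98") and checked in 2000 ("00"):
print(years_elapsed(98, 0))  # -98, not the expected 2
```

Once dates crossed into 2000, any interval or comparison built on two-digit years produced results like the one above, which is why quality assurance specialists were pulled wholesale onto remediation projects.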

According to soon-to-be-released research from Stamford, Conn.-based Meta Group Inc., more than half the world’s biggest companies have disbanded their in-house software quality assurance departments as they have pulled specialists onto their Y2K projects. The trickle-down effect, according to Meta Group, is that few of these companies will redeploy their quality teams, so application development error rates will increase.

Meta Group’s software quality assurance research, which is drawn from its “Worldwide IT Trends and Benchmark Report,” is based on survey responses from information technology executives at 318 of the world’s 2,000 biggest companies.

But IT executives and other analysts had mixed opinions about whether year 2000 projects will break up corporate software quality assurance teams or ultimately degrade the integrity of future software development efforts, as Meta Group’s survey suggests.

Capers Jones, president of Software Productivity Research Inc. in Burlington, Mass., is clearly on the pessimistic side of the fence, even without any Y2K-induced reshuffling of staff. The current state of software quality “is very troubling,” he said, both for software vendors and user companies that develop their own applications.

Companies that are cutting back on their software quality assurance efforts, said Jones, “are suspect in terms of their actual knowledge of software economics.”

The Meta Group research finds that legacy system development efforts, which use languages such as Cobol and traditional "waterfall" development techniques, will be harder hit than projects built with object-oriented languages and more sophisticated development approaches. Still, e-commerce applications may also be vulnerable because "there will be less documentation and testing" during application development cycles, said Malcolm Slovin, a Meta Group analyst.

Several IT executives and analysts disagreed with Meta Group’s conclusions. If companies experience any software quality degradation going forward, “it’s more about the pace of [e-commerce demands and] software innovations that companies have to embrace,” said John McKinley, chief technology officer at Merrill Lynch & Co. in New York.

“If anything, I think most [quality assurance] specialists who worked on year 2000 projects will take that experience under their belts and apply it positively” to future development efforts, added John Burns, vice-president of projects at Canadian Imperial Bank of Commerce in Toronto.

In addition, as companies such as Monsanto Co. rely more on packaged commercial software and less on in-house development, “there’s more of an emphasis on systems integration and less on application development,” said John Ogens, year 2000 project director at the St. Louis-based maker of agricultural and chemical products.

Research at Meta Group’s crosstown rival Gartner Group Inc. has found that year 2000 projects have had only a negligible impact on software quality assurance activities. The incidences “have been very few and very isolated,” said Lou Marcoccio, head of Gartner’s year 2000 research.

Nonetheless, some industry veterans said they believe Meta Group may be on to something. “Anytime you have a major systems implementation, you’re going to have exposure” in other areas, said Dick Arns, executive director at Chicago Research & Planning Group, a Chicago-based user group of CIOs and corporate technology officers.