
If you thought the quality of the software you’re buying is something you could take for granted…well think again!

According to one expert, there is a long history of products being launched before they are fully debugged. As a general practice, software companies release a product once it reaches at least 80 per cent functionality, according to independent analyst Warren Shiau.

And that’s not altogether unacceptable, he said. “Unless we are talking about a very simple [end user-type] desktop application, the complexity of the software is such that it’s unrealistic to expect it to be released completely bug-free.”

Shiau added that while software companies don’t deliberately release buggy code, there are limits to what is acceptable to corporate users. If companies continuously release faulty software, they should not be surprised if they lose customers, he said.

According to Cynthia Morris, there’s a vital and indisputable link between software quality and the level of confidence customers have in a vendor. Morris is research and development director at Cary, NC-based SAS Institute Inc.

She said SAS Institute’s quality control processes kick in right from the design and development stage and continue until the product is ready to ship. Testers and developers, she said, work closely together in the early stages of software development, exchanging design ideas and strategies.

Testing, Morris said, plays a huge role in SAS’s production process, with 136,000 automated tests in place. “We have testers within each division and they report to either a testing manager or testing director, who [in turn] reports to a division head independent from the product group.”

The developers themselves are expected to perform unit testing and once they are satisfied, they pass the product on to the testers for further manual and automated tests.

Automated testing allows SAS to maintain audit trails that track time stamps as well as test results. In the case of a regression, defect fixes are controlled to determine which areas of the test suite need to be re-run and re-validated.
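To illustrate the idea of a timestamped audit trail, here is a minimal sketch in Python. It is not SAS’s system; the file name and record fields are assumptions made for the example.

    # Run a unittest suite and append one timestamped record per run,
    # so later runs can be compared against earlier ones.
    import json
    import time
    import unittest

    def run_suite_with_audit(suite, audit_path="audit_log.jsonl"):
        """Run a test suite and append a timestamped result record."""
        result = unittest.TestResult()
        started = time.time()
        suite.run(result)
        record = {
            "started": started,
            "finished": time.time(),
            "tests_run": result.testsRun,
            "failures": [str(test) for test, _ in result.failures],
            "errors": [str(test) for test, _ in result.errors],
        }
        with open(audit_path, "a") as f:
            f.write(json.dumps(record) + "\n")
        return record

Comparing the newest record with the one from the previous run gives a quick view of which tests regressed after a fix and therefore need to be re-run and re-validated.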

According to Morris, SAS implements a multiple sign-off matrix. Everyone interacting with the product signs off when they are satisfied. Persons signing off include the test manager, development manager, tech support, tech writer, and the product director. The product only ships after all the signoffs are completed, the SAS director said.
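A sign-off matrix like the one Morris describes amounts to a simple gate: the product ships only when every required role has approved it. The sketch below uses the roles named in the article; the data structure itself is an assumption for illustration.

    # Gate a release on a complete set of sign-offs; role names follow
    # the article, everything else is hypothetical.
    REQUIRED_SIGNOFFS = {
        "test manager",
        "development manager",
        "tech support",
        "tech writer",
        "product director",
    }

    def ready_to_ship(signoffs: dict) -> bool:
        """Return True only when every required role has signed off."""
        approved = {role for role, ok in signoffs.items() if ok}
        return REQUIRED_SIGNOFFS <= approved

    # One missing signature is enough to hold the release.
    print(ready_to_ship({
        "test manager": True,
        "development manager": True,
        "tech support": True,
        "tech writer": False,
        "product director": True,
    }))  # False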

She described the company’s comprehensive internal defect monitoring system. When a defect is entered, she said, its priority is determined. If the flaw is listed as an “alert” priority type, the developer, tester and tech support person are notified. They discuss the source of the flaw, and whether it will pose problems to customers as they use the software. “It is discussed within a group of individuals that care about customers. If it (needs to be fixed) we hold up the release of the product for the fix to be put in and validated,” Morris said.
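The routing Morris describes for “alert” defects can be pictured as a small triage step: high-priority flaws pull in the developer, tester and tech support person, who then decide whether the fix must go in before release. The sketch below is a hypothetical illustration of that flow, not SAS’s actual defect system; the Defect class and notify() helper are invented for the example.

    # Route an incoming defect by priority; "alert" defects notify all
    # three roles named in the article.
    from dataclasses import dataclass

    @dataclass
    class Defect:
        identifier: str
        summary: str
        priority: str  # e.g. "alert", "high", "low"

    def notify(role: str, defect: Defect) -> None:
        print(f"Notify {role}: [{defect.priority}] {defect.identifier} {defect.summary}")

    def triage(defect: Defect, hold_release: bool = False) -> bool:
        """Return True if the release should be held for this defect."""
        if defect.priority == "alert":
            for role in ("developer", "tester", "tech support"):
                notify(role, defect)
            # The group then decides whether the fix and its validation
            # must happen before the product ships.
            return hold_release
        return False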

When a problem is encountered, the fix is more likely to be made before the release than after – even if that means delaying the launch, she said, adding that this is a practice established by SAS Institute president and CEO, James Goodnight.

According to Morris, proof that this system really works can be found in the fact that 98 per cent of SAS Institute customers renew their license each year. (SAS licenses its products to customers on a yearly basis.)

Morris said there need not be a conflict between meeting deadlines and achieving exceptional software quality. The trick, she said, is achieving a balance between the two.

She said SAS achieves that balance through ongoing communication between its marketing and product development teams. These discussions determine what can be realistically achieved within the timeframes set, she said.

A Canadian software quality expert agrees that realistic development timelines are absolutely vital to proper quality control. “People involved in driving quality need to [also] be involved in defining reasonable timelines to deliver the quality levels expected,” said Robert Koblovsky, vice-president of Ottawa-based Data Kinetics Ltd. (DKL).

Affiliated with the Quality Assurance Institute in Orlando, Florida, DKL provides software quality training for corporations. The company hosts the annual International Quality Conference in Toronto.

Koblovsky said interest in quality assurance has grown noticeably over the past year and the growing number of quality trainees is proof of that.

Attendance at the International Quality Conference has also increased, from just over 100 three years ago to 250 last year.

This year, Koblovsky said, DKL expects to get close to 500 participants.
