Next time you sit down at a computer, scroll for a link on a Web site, or even double-click a mouse, imagine having several dozen IBM employees staring over your shoulder, trying to ensure that your experience as a user is as enjoyable as possible.
The scenario isn’t as far-fetched as it may seem, given IBM’s invitation to users to test Big Blue’s products and provide feedback in the development of both software and hardware products at its User-Centered Design Lab (UCD) in Markham, Ont.
“We are looking for non-defect-oriented problems…problems with design,” said Karel Vrendenburg, program director and corporate team lead for UCD. He added that everything from ergonomics to human-interaction factors, such as keyboard comfort, is taken into consideration when developing technologies.
Take, for example, the ThinkPad. The UCD labs brought in economy-class airline seats to simulate real-life environments for testers of the notebook. They discovered that something as simple as a small beam of light placed strategically at the bottom of the monitor would provide enough light for users to work on an airplane.
“It was really simple, but it was fundamental in making the user experience better,” Vrendenburg said. “We need to get into the head of a user. Insights gleaned from users are critical in designing an appropriate solution for them.”
At UCD, user suggestions were taken into account, from adding lights to laptops to adding autonomic features in DB2 version 8.1, in which automatic controls take over certain low-level tasks that would otherwise require human intervention.
“We haven’t taken the human out of the equation, but we’ve let the human focus on the day-to-day impact of the overall database,” said Rick Sobiesiak, manager of the DB2 UCD.
Before any changes could be addressed for DB2 or other products tested at the lab, including the company’s WebSphere and Tivoli suites, as well as Big Blue’s Web site, several key questions had to be answered, Vrendenburg said. For example: Who will be using the product? What does an individual want the product to do? What are their priorities when using specific software? What do they like and dislike about the way they’ve been getting their tasks done?
The lab also examines competing products to get a sense of what else is available on the market, and recruits participants for testing.
Prototypes are then created, which can be as simple as pieces of paper with proposed screen designs sketched on them, or can be developed so that they look like finished products, IBM said.
Often there will be a site visit, so UCD members can observe the user and find out the issues that are important to them in their own work environment.
If software companies were to ask Simon McLaughlin, an Ottawa-based technical analyst, he would tell them of the changes he would like to see in the multimedia, presentation and development software he uses daily.
“All software products are hard to design and implement because there could be a million specifications required to make a customer happy with a limited amount of money to make it happen, so invariably some usability gets tossed aside,” he said.
For McLaughlin, cleaning up cluttered interfaces is one of several priorities on his to-do list.
“I feel that many software companies are trying to make the deadline rather than make the grade. A good program is still plagued with errors, but it gets most of its point across. A bad program has to defend the disclaimer at the bottom of the package – ‘we are not responsible if the program does not work as expected,’” he added.
When users enter the UCD lab, after of course being treated to dinner, they are taken through a rigorous two- to four-hour interactive test session in which staff, watching through one-way glass, take note of all reactions, comments and user performance before deciding what to keep and what to change about the design.
“What is obvious to the designer isn’t always obvious to the user,” said Mike Beltzner, a member of the Toronto UCD team. “It’s a good opportunity for [the designer] to step out of the box.”
Beltzner is quick to point out that there are no wrong answers during the design evaluation.