
Benchmarking

Having recently launched a brand new online service, your staff approach you with big smiles on their faces. The numbers are in and, according to a survey of your clients, 75 per cent of those who use the online service are satisfied.

So: 75 per cent, you think.

Is this good or bad?

You remember reading that the average satisfaction score for online services is 68 per cent, which makes 75 look respectable. But if other services like yours are scoring in the 80s, then you have a lot of work to do.

So how does 75 per cent compare to other services like yours? Further, what should you focus on to improve the quality of the service and raise client satisfaction even higher?

These are not new questions, and they are certainly not restricted to online services. They are, however, questions that we are beginning to answer in Canada.

The Common Measurement Tool

As early as 1997, public servants across Canada began working together to develop a Common Measurement Tool (CMT) for assessing and benchmarking client satisfaction with public services.

Consisting of a uniform set of questions and response scales, the CMT makes it easy for public sector managers to design client satisfaction surveys, focusing on issues such as the timeliness of service and accessibility.

Whether managers are designing a survey from scratch or simply enhancing an existing one, the CMT has proven to be a valuable tool for assessing client satisfaction.

What Drives Satisfaction with Online Services

With the growth of electronic service delivery (ESD), the questions in the CMT were expanded to assess factors that specifically drive satisfaction with online service delivery.

For example, while the timeliness of service remains critically important in ESD, other factors such as the ease of navigating the web site, the quality of the information and the visual appeal of the site have proven to be defining factors in whether or not a user is satisfied with a given online service experience.

By specifically asking clients about these key drivers of satisfaction, service managers gain important insight into the strengths and weaknesses of their services. Moreover, in times of limited resources, this information enables managers to focus their resources more effectively and support the business case for service improvement.

By concentrating on the key drivers of satisfaction, public sector service managers know they are making changes that clients will notice and that will result in higher levels of client satisfaction that can be tracked and reported. It is this logic that persuaded the federal government to incorporate CMT questions into the assessment of its Canada Site as well as all other online services.
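To illustrate the idea (this is a hypothetical sketch, not part of the CMT or the federal measurement framework), an analyst might identify key drivers by regressing overall satisfaction on CMT-style attribute ratings. The column names and data below are invented for the example.

```python
# Illustrative sketch only: a simple "key driver" analysis on hypothetical
# CMT-style survey data, using ordinary least squares from statsmodels.
# The attribute names and responses are invented for this example.
import pandas as pd
import statsmodels.api as sm

# Hypothetical responses on 5-point scales
df = pd.DataFrame({
    "timeliness":    [4, 3, 5, 2, 4, 5, 3, 4],
    "ease_of_nav":   [5, 3, 4, 2, 4, 5, 2, 4],
    "info_quality":  [4, 4, 5, 3, 4, 5, 3, 4],
    "visual_appeal": [3, 3, 4, 2, 3, 4, 3, 3],
    "overall_sat":   [4, 3, 5, 2, 4, 5, 3, 3],
})

# Regress overall satisfaction on the individual service attributes
X = sm.add_constant(df[["timeliness", "ease_of_nav", "info_quality", "visual_appeal"]])
model = sm.OLS(df["overall_sat"], X).fit()

# Attributes with the largest coefficients are the strongest candidates
# for targeted service improvement.
print(model.params.drop("const").sort_values(ascending=False))
```

In practice, an analysis of real CMT data would rely on much larger samples and more careful modelling; the point is simply that the attributes most strongly related to overall satisfaction become the natural targets for improvement.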

Ottawa maintains a panel of Internet users who regularly review government web sites and services. As part of this innovative client consultation process, users answer a series of CMT questions, providing feedback on the factors that drive satisfaction with online services.

In fact, in ranking Canada #1 in its annual assessment of e-government leadership, Accenture noted that the government uses the CMT as part of a “sophisticated performance measurement framework.”

Benchmarking Client Satisfaction

As valuable as the CMT is in designing surveys and focusing service improvement initiatives, the real value of using a common set of questions lies in the ability to then compare or benchmark the results against those of peer organizations and services.

To facilitate this benchmarking, the Institute for Citizen-Centred Service (ICCS) recently launched a CMT Benchmarking Database.

Users of the CMT can now submit their survey data to a central repository and compare themselves with organizations that share similar characteristics, such as size, mandate, service or client type. Data are submitted anonymously, ensuring that individual results are not made public.

In return for adding their survey data to the benchmarking database, CMT users receive a benchmarking report that compares their results against up to three different peer groups.

For example, a provincial service that enables clients to apply for a hunting licence online might wish to compare its results against a) other online services, b) other services involving an application, and/or c) other services focused on hunting/fishing licences.

These various benchmarking groups can be mixed and matched according to the needs of the CMT user, providing the kind of apples-to-apples comparison required.
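As a rough sketch of what such a comparison involves (the peer groups and scores below are invented and do not come from the ICCS database), a service could line its 75 per cent result up against each peer group's average:

```python
# Illustrative sketch only: comparing one service's satisfaction score
# against several hypothetical peer groups, in the spirit of a CMT
# benchmarking report. All peer-group figures are invented.
our_score = 75.0  # per cent of clients satisfied

peer_groups = {
    "other online services":            [68.0, 72.0, 81.0, 77.0, 70.0],
    "other application-based services": [74.0, 79.0, 83.0, 76.0],
    "other licensing services":         [71.0, 73.0, 78.0],
}

for name, scores in peer_groups.items():
    avg = sum(scores) / len(scores)
    below = sum(1 for s in scores if s < our_score)
    share_below = 100.0 * below / len(scores)
    print(f"{name}: peer average {avg:.1f}%, "
          f"our {our_score:.0f}% sits above {share_below:.0f}% of peers")
```

The actual benchmarking report is, of course, richer than this, but the underlying question is the same: how does your score stack up against the peers that matter most to you?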

Building a Resource for the Public Sector

Benchmarks are only as valuable as the quality of the comparison. As the amount of data in the CMT benchmarking application grows, this resource will become increasingly valuable to the public sector service community.

Eventually we will be able to say with great accuracy whether the 75 per cent satisfaction rate noted earlier is, in fact, a good result or a poor one for the service in question.

More importantly, we will also be able to use this benchmarking exercise to bring together communities of interest, enabling public servants across Canada (and around the world) to learn from each other in improving the quality of public services and client satisfaction.

Charles Vincent and Wendy Paquette are Program Managers for the Institute for Citizen-Centred Service (www.iccs-isac.org).
