Outsourcing

Success in most outsourcing agreements is measured by defined service levels, often contained in an appendix of the contract entitled “Service Level Agreement (SLA).” As these are intended to measure the success of the overall relationship, many people then assume that the rest of the agreement, once signed, is no longer needed unless there is a dispute or audit.

SLA metrics are based on what was contemplated when the contract was negotiated and may not reflect what is needed or expected now. A change process to update the SLAs is part of a good governance structure (another topic).

Like any number, SLA metrics are only as good as the data collected to compute them, and these numbers are generally surrogates for the business outcomes actually desired. So “help desk responsiveness” gets translated into how many rings before the rep answers the phone, not whether the problem is resolved on that call. Or “number of call tickets closed” doesn’t reveal that the same user had to call back three times about the same thing, if the data collected and reported doesn’t relate the new call to the previous ones. And since linking repeat calls to the original ticket would reduce the closure numbers, the vendor has little incentive to put effort into tracking repeats or changing procedures to reduce them.
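
To make the repeat-call point concrete, here is a rough sketch in Python of how a repeat call could be related to an earlier ticket instead of being counted as fresh, closed work. The record fields and the seven-day window are invented for illustration; they don’t come from any particular help desk system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical ticket record; the field names are invented for this sketch,
# not taken from any real help desk tool.
@dataclass
class Ticket:
    ticket_id: str
    user_id: str
    issue_category: str
    opened_at: datetime
    closed_at: datetime

def count_repeat_calls(tickets, window_days=7):
    """Count calls that look like repeats: same user, same issue category,
    opened within window_days of that user's previous ticket being closed."""
    repeats = 0
    last_seen = {}  # (user_id, issue_category) -> most recent Ticket
    for t in sorted(tickets, key=lambda t: t.opened_at):
        key = (t.user_id, t.issue_category)
        prior = last_seen.get(key)
        if prior and t.opened_at - prior.closed_at <= timedelta(days=window_days):
            repeats += 1
        last_seen[key] = t
    return repeats
```

Reporting repeats alongside “tickets closed” exposes whether problems actually stay fixed, rather than rewarding the vendor for closing the same problem three times.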

As with anything measured, focusing on the numbers won’t necessarily improve or maintain the quality of service. For things like application response time, a number works well: “less than 1/2 sec response to prompting for next entry 98% of the time during peak hours at the user workstation” is relevant and practical because it relates to the business workflow. For other things, such as a requirement to hold one meeting per month to discuss service incidents, a number isn’t going to be very helpful without some discipline and goodwill around those meetings: a stated agenda, the required participants and their authority, what outcomes are required and how they will be executed, and escalation and dispute resolution (which is why the rest of the contract doesn’t get filed away).
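
A measure like the response-time example above can be checked mechanically. Here is a minimal Python sketch of that check; the sample data and the 9:00–17:00 peak window are assumptions for illustration only.

```python
def response_sla_met(samples, threshold_sec=0.5, target_pct=98.0, peak_hours=range(9, 17)):
    """samples: (hour_of_day, response_time_sec) pairs measured at the user workstation."""
    peak = [rt for hour, rt in samples if hour in peak_hours]
    if not peak:
        return None  # nothing was measured during peak hours
    pct_within = 100.0 * sum(rt < threshold_sec for rt in peak) / len(peak)
    return pct_within >= target_pct

# Five peak-hour measurements (hour of day, seconds); 4 of 5 are under 0.5 s,
# i.e. 80%, so this small sample fails a 98% target.
samples = [(10, 0.31), (11, 0.47), (14, 0.62), (15, 0.28), (16, 0.40)]
print(response_sla_met(samples))  # False
```

Note that the measurement has to be taken where the SLA says it applies, at the user workstation, not at the server, or the number measures the wrong thing.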

What each SLA measures, and how, has to be well understood by both parties, as the vendor will provide most or all of the information. Too few measures won’t yield enough information to determine whether the relationship is delivering what is expected. Too many, and the process devolves into an exercise in creating a batting average: “we achieved 95% of all target measures.” Since the measures are rarely all of equal importance, 95% could be either fantastic or abysmal; either way, it’s not the 100% that may have been the expectation.
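
A toy calculation shows why the batting average hides more than it reveals. The measures, weights, and numbers below are invented for the example, not taken from any real contract.

```python
# Twenty measures: nineteen minor ones (weight 1 each) that were met,
# and one business-critical one (weight 30) that was missed.
measures = [("minor measure %d" % i, 1, True) for i in range(1, 20)]
measures.append(("payroll run completes on time", 30, False))

met = sum(1 for _, _, ok in measures if ok)
batting_average = 100.0 * met / len(measures)  # 95%: looks fine on its own
weighted_score = (100.0 * sum(w for _, w, ok in measures if ok)
                  / sum(w for _, w, _ in measures))  # roughly 39%

print(f"{batting_average:.0f}% of measures met, "
      f"but only {weighted_score:.0f}% of weighted business value delivered")
```

The single miss carries most of the business impact, so “95% achieved” and “abysmal” describe the same month.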

Building a reasonable set of SLA measures requires a careful look at the business outcomes desired, rather than diving in with a technical system-performance view (CPU utilization, etc.). After all, outsourcing is about letting go of the operational and technical detail, using others to do that work, and focusing on managing the business, which of course includes keeping just enough of the detail to assure yourself that things are working.

The final and most important measure is the least precise. Client satisfaction with the services needs to meet the objectives set out for overall service provision, regardless of who delivers them. Keeping in close touch with user concerns and changing expectations is the critical success factor. These changes will demand that a very good change management process be built into your agreement and into your budget planning for the life of the outsourcing arrangement.

There’s a lot more that can be said about SLAs – what do you think?