IBM Corp. and Ontario’s Ministry of Community and Social Services have been stung by the province’s auditor general for the bungled launch last year of a new software system for managing social assistance claims that will cost taxpayers tens of millions of dollars for software fixes and overpayments to recipients.
So far more than $240 million has been spent on a project originally budgeted at $202 million, and the total could hit $290 million. Because the Social Assistance Management System (SAMS) won't be fully functional for another four months, the final bill isn't in yet.
Nor can it be said that SAMS performs better than the troubled system it replaced until all of the problems are fixed.
It seems a perfect example of how, in managing a complex software project, anything that can go wrong will go wrong. Three groups were assembling it:
— IBM, which was to convert two years of legacy data but delivered it three years late, with 114,000 errors;
— Curam Software, which IBM bought after the government chose its case management application. Curam was responsible for customizing the software, and the delivered application had 2,400 serious defects;
— and the ministry, responsible for the reporting and automatic letter-generating features.
The ministry received the bulk of the blame in last week’s annual report from auditor Bonnie Lysyk. Its executive committee — which included two government CIOs — was told of problems but it “knowingly assumed the significant risk of launching a new computer system that was not functioning properly.”
That committee knew that not all launch criteria and other requirements had been met when it okayed the system. But, Lysyk added, it also didn’t know a high number of serious defects had been found and that some crucial tests had produced poorer results than reported.
Not only that, according to the office of the provincial controller, SAMS is the only computer system ever connected to the government’s accounting system without passing government-mandated payment testing.
The auditor also said the ministry didn’t properly oversee the work of IBM, which not only missed its deadlines for data conversion but, after its purchase of Curam, was also responsible for managing that company. However, Lysyk acknowledged the ministry made IBM’s job more difficult by changing the government’s requirements late in the project. That was a factor in delaying data conversion, which in turn meant that a scheduled pilot test with converted data couldn’t be held.
SIDEBAR: Ontario effort ‘among the worst’ a project manager has ever seen
When launched in November 2014 — 20 months behind schedule — the software had many serious defects that caused numerous errors, including benefit calculation errors and the issuance of many letters and tax information slips with incorrect information.
“In fact,” Lysyk said, “until most of the serious defects are identified and fixed, the system will continue to generate errors” and be open to fraud.
Complaints about SAMS began immediately, forcing the province to hire PricewaterhouseCoopers to straighten things out. Its plan “will allow us to satisfy our commitment to implementing all of the auditor general’s recommendations for our ministry,” Daniel Schultz, the ministry’s senior media and issues co-ordinator, said in an email.
“We have already completed 75 per cent of the transition plan and expect to fully implement it by spring 2016.”
He also noted that the province began a review in December 2014 of its IT organization, led by Gartner Canada, focusing “on identifying ways to improve efficiency and value for money and to modernize how IT supports government business and service delivery, including how we manage and oversee large projects.” That report is being used to develop a strategy to modernize government IT.
The ministry will also be completing a comprehensive lessons learned assessment on SAMS through the office of the Corporate Chief Information Officer to determine how decision-making could be improved going forward, Schultz added.
Asked why the project was managed so poorly, he replied that “There are lessons learned by all of the people who worked on this project and the auditor general identified some of them – including information sharing. We will ensure those lessons change the way we do things in the ministry and we will share the lessons across government.”
The government insists that when it is fully implemented SAMS will be superior to the old system. SAMS has successfully processed seven million payments for nearly 900,000 social assistance clients since it was implemented in 2014, it said in October. Technical issues that caused overpayments in late 2014 have been fixed.
Asked to comment on the auditor’s report, an IBM spokesperson said by email “it would be inappropriate for IBM to discuss the specifics of its client contracts.”
About 11,000 ministry and municipal caseworkers and managers rely on SAMS to help determine eligibility and to calculate and deliver about $6.6 billion in social assistance a year to approximately 900,000 people.
SAMS, which replaced a problem-riddled system, was built around an off-the-shelf case management system from Curam Software Ltd. chosen by the ministry in 2009 after a competition. The ministry budgeted $202 million for a new suite, but Curam’s solution came in at $165 million.
IBM [NYSE: IBM] was originally hired to oversee building the interfaces and convert data from the old system. Then it bought Curam in 2011 and took over more supervision of the project, which the auditor’s office says didn’t help — an IBM project manager was overseeing IBM’s work.
“When we reviewed the (IBM) project manager’s work, we found the project manager neither tracked the hours Curam consultants spent fixing SAMS’ defects nor included this information in his analysis,” the auditor wrote.
“Consultants billed an average hourly rate of $190. They were overseen by other consultants who were paid daily rates as high as $2,000. Many consultants took much longer than anticipated to complete their work, and in some instances billed for time spent on fixing errors in their own work.
“The ministry’s budget for Curam’s consultants more than doubled, from $14 million in the original budget to $32 million at launch. The vagueness in consultants’ time reporting, and the lack of independent oversight during much of the project, made it difficult to assess how efficiently consultants were working.”
In April 2013 the IBM project manager was replaced by an independent contractor to oversee Curam’s work, who improved the way work was documented and analyzed. But from March 2014 on, the independent consultant stopped assessing whether work was done efficiently or even on time, because development of SAMS was essentially complete and consultants were mainly working on fixing defects, says the auditor. The ministry decided not to assess how efficiently this work was being performed.
Meanwhile, six months before launch, the ministry decided to spread the workload by shifting the reporting responsibilities of the staff testing SAMS’ readiness from the technical project director to the business project director, who didn’t have an IT background.
Among the problems was the government’s early decision to implement all of the pieces of SAMS at once — or, as the report says, in one “big bang” — which, as most project managers know, is risky without a pilot. But the ministry thought it could be done with adequate pre-launch testing.
SAMS was supposed to launch in March 2013, but it didn’t launch until 20 months later, in part because the data wasn’t converted in time. “IBM failed to meet its deadline on three occasions,” the auditor wrote, “and the ministry extended the deadlines three times.” In fact the government complained in writing to IBM’s CEO about the company’s inability to abide by the terms of its contract and meet the revised deadlines. In response IBM gave a stronger commitment to finish the job and did deliver converted data in April 2014, the auditor said, but this was far too late to keep to a targeted May 2014 launch. And the data had costly errors.
The ministry told the auditor that IBM tried to compensate it for Curam’s and IBM’s poor performance by providing $12.8 million in what were called “free services” — unbilled overtime (although some was estimated, not tracked) and discounts on its hourly work. However, the auditor rejected the characterization of discounted hourly rates as “free services” because “the discounts were negligible, and the ministry was still paying significant rates per hour.”
“It is true that the ministry revised its requirements for SAMS on several occasions, while IBM was still doing its work,” added the auditor, “and this posed challenges for the data-conversion process.” Meanwhile the cost rose to $242 million.
Because of IBM’s delays, the auditor added, there wasn’t an effective pilot of the system with converted data.
Part of the problem, the auditor said, started in the beginning: “In some ways, SAMS was poorly designed, and the ministry had not addressed this basic flaw at the time of our audit.”
For example, SAMS forces caseworkers to enter the name of a school for each child in a family applying for benefits—including children not yet in school. To get the software to accept the application, caseworkers type “fake school” for children not yet in school. And caseworkers must enter fictitious address information for clients who are homeless or move frequently. “These are not defects,” said the auditor, “they are design flaws.”
IBM finally delivered the data in April 2014, but at launch, there were about 114,000 errors in the data that caused SAMS to generate incorrect results for client eligibility and benefit payments.
In addition there were 2,353 major software defects either known prior to launch or found afterward; almost half (1,132) related to eligibility determination and incorrect payment amounts, as well as system functions that didn’t work.
The ministry launched anyway, the report says, because it considered the risks of delaying the launch greater than the risks of launching a system that was not fully ready.
The ministry’s IT staff were installing software upgrades to fix defects, but weren’t fully testing them. This was partly because it did not know how to test them, the auditor wrote: Just prior to launch, it didn’t renew contracts with certain consultants who would have been the most effective in testing the fixes.
As of October 2015, the auditor found SAMS responsible for about $140 million in benefit calculation errors ($89 million in potential overpayments and $51 million in potential underpayments). In addition, many letters and tax information slips with incorrect information had been issued by the ministry, some of which may never be resolved.
Front-line workers have to spend much of their time performing “workarounds” to deal with the complex errors SAMS generates, the auditor found, so they have less time to serve clients.
And so far SAMS can’t generate reports with accurate information, which affects the ability of the ministry and municipalities to administer social assistance.
Among her recommendations, Lysyk said that to improve the decision-making process used to launch a major information system, the Ministry of Community and Social Services should ensure the decision to launch an IT system is based on relevant criteria and on information that gives decision-makers a complete and accurate picture of system readiness.
She also said the provincial internal audit team should independently review key information used in assessing the system’s state of readiness while the decision to launch is being made. Internal audit proposed that it review SAMS’ readiness for launch, but it couldn’t reach an agreement with the project leads on the scope of a review. Internal audit told the auditor general’s office that the ministry believed the IBM consultants on the project team had all the expertise needed to advise on SAMS’ readiness for launch.