
Basel II: Be glad it didn’t happen to you

The top-tier Canadian banks were slapped several years ago with yet another regulatory standard: Basel II. The second Basel accord demanded improved maintenance and documentation of capital adequacy and risk exposure, which kicked off a challenging multi-year process filled with trials, tribulations, and (eventually) triumph in time for the November 2007 deadline.

ComputerWorld Canada checked in recently with TD Bank and the Royal Bank of Canada, several of the country’s top IT outsourcing and consulting firms, and research analysts who were able to give us a blow-by-blow of Basel II compliance — and a possible blueprint for future compliance changes.

GETTING STARTED

Dave Cassie, who aided global banks in achieving Basel II compliance at Plano, Tex.-based EDS, said that projects were generally started back in 2005, giving banks a two-year window. “The Canadian implementation required a high level of detail, and end-to-end changes, as they were very complex systems. They ended up spending a lot more money than planned,” said Cassie.

The hard part was obtaining the very detailed data and analytic engines to understand their risk exposure and capital adequacy. “The challenge was that banks have been around (a long time), so their operational systems have been around for a long time — they have a complete set of systems,” Cassie said.

It also required a change in organizational thinking, according to Nicole Gadbois-Lavigne, senior executive consultant with Montreal-based IT outsourcing firm CGI, who also aided in Basel II compliance for the Canadian banks. “Before the Basel II standards came in, risk was based on portfolio differences, such as the types of loans. It was a top-down approach,” she said. “(Afterwards) there was a much deeper granularity. It’s a bottom-up, client-based approach that goes across the entire institution.”
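To make that contrast concrete, here is a minimal sketch of the two aggregation styles, with hypothetical record layouts and figures; nothing here comes from the banks’ actual systems.

```python
from collections import defaultdict

# Hypothetical exposure records pulled from separate product silos.
exposures = [
    {"client_id": "C001", "product": "mortgage",    "exposure": 250_000},
    {"client_id": "C001", "product": "credit_card", "exposure": 12_000},
    {"client_id": "C002", "product": "auto_loan",   "exposure": 30_000},
]

# Top-down (pre-Basel II): risk grouped by portfolio or loan type.
by_portfolio = defaultdict(float)
for e in exposures:
    by_portfolio[e["product"]] += e["exposure"]

# Bottom-up (Basel II): exposure rolled up per client across the institution.
by_client = defaultdict(float)
for e in exposures:
    by_client[e["client_id"]] += e["exposure"]

print(dict(by_portfolio))  # view by product line
print(dict(by_client))     # total exposure per client
```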

Another challenge, beyond understanding the required calculations, was tracking down the underlying data itself. Take transaction data: with equity trading, for example, there are multiple systems across multiple geographies. The banks had to find a way to locate and collect that information, then test it to prove it gave a full picture of the bank. And because Canadian banks have a long history of mergers and acquisitions, the same information can be stored in different places, according to Cassie.

“Since it has to be done on a per-client basis now, that required a consolidation of all systems, but it’s still such a siloed system, so we had to help with data warehousing and cleansing,” said Gadbois-Lavigne.
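A toy version of the cleansing problem Gadbois-Lavigne and Cassie describe: the same client sitting in two systems inherited from past mergers, under differently formatted keys. The SIN-based matching and field names are invented purely for illustration.

```python
# Hypothetical: one client held in two systems from past mergers.
system_a = [{"sin": "123-456-789", "name": "J. Smith",  "balance": 1000}]
system_b = [{"sin": "123456789",   "name": "Jon Smith", "balance": 2500}]

def normalize_sin(sin: str) -> str:
    """Strip formatting so records from different silos match up."""
    return sin.replace("-", "")

# Cleansing step: merge on the normalized key into one client view.
clients = {}
for rec in system_a + system_b:
    key = normalize_sin(rec["sin"])
    merged = clients.setdefault(key, {"sources": [], "balance": 0})
    merged["sources"].append(rec["name"])
    merged["balance"] += rec["balance"]

print(clients)  # one consolidated record per client
```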

TD Bank suffered from a super-siloed system, according to Daryl Philip, manager of system infrastructure for Basel II. Said Philip: “They required a minimum of five years’ worth of data. You have to know the history to predict the future, but banks are notorious for not having their data in order. And we had to look at the data and get all the information so that we could write the programs. The data was scattered all over the bank, and we had to bring it together.”

To help counteract the disparate locations, Philip built account-level records. “So that way,” he said, “the information about when it was opened, whether it defaulted, whether it’s a write-off was there.” After being customized like this, the data was put into storage, where it could be better accessed by the modeling team.
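A rough sketch of the account-level consolidation Philip describes, assuming scattered source extracts keyed by account number; the system names and fields here are hypothetical.

```python
# Hypothetical extracts from different bank systems, keyed by account.
originations = {"A100": {"opened": "2002-03-15"}}
collections  = {"A100": {"defaulted": True}}
recoveries   = {"A100": {"write_off": 4_500.00}}

def build_account_record(account_id):
    """Merge scattered fields into one account-level history record."""
    record = {"account_id": account_id,
              "opened": None, "defaulted": False, "write_off": 0.0}
    record.update(originations.get(account_id, {}))
    record.update(collections.get(account_id, {}))
    record.update(recoveries.get(account_id, {}))
    return record

# The consolidated record is what lands in storage for the modeling team.
print(build_account_record("A100"))
```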

The very nature of banking — constant change — also muddied the waters. “It’s a changing environment, so when it comes to sorting and managing the pertinent information, the in-flight set of parameters was always changing,” Cassie said. This is made even more confusing by the increasing amount of near- or real-time information, which is thrown in with information that is refreshed on a different basis (such as daily or weekly).

Banks usually resorted to a large data warehousing solution with business intelligence analytic engines that could demonstrate risk exposure and capital adequacy. “Now it doesn’t matter which branch the customer goes to — they’ll get the same assessment; before, there wasn’t as much quantitative data,” said Gadbois-Lavigne.
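In code terms, that branch-to-branch consistency comes from every branch running the same quantitative logic against the same warehouse data, rather than applying local judgment. This toy rating function and its thresholds are invented for illustration.

```python
# Toy central risk rating: same warehouse inputs, same output,
# regardless of which branch runs it. Thresholds are made up.
def risk_rating(total_exposure: float, defaults_on_file: int) -> str:
    if defaults_on_file > 0 or total_exposure > 500_000:
        return "high"
    if total_exposure > 100_000:
        return "medium"
    return "low"

# Any branch querying the same client gets an identical assessment.
print(risk_rating(total_exposure=262_000, defaults_on_file=0))  # "medium"
```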

HOW IT WENT DOWN

Gadbois-Lavigne said that the confluence of the IAS, Sarbanes-Oxley, and Bill 198 standards made Basel II just the next thing in a long line of risk compliance standards for the financial industry. TD Bank even had an established Basel group, according to Philip. “But the team has grown a lot over the last five years,” he said.

Said Cambridge, Mass.-based Forrester Research analyst Chris McClean: “There’s a better approach here. There was more of a reactive approach to Sarbanes-Oxley, but with Basel II, there was a looser timeframe, so they’d seen it coming for a while. There was a lot more strategizing and up-front discussion.”

This didn’t mean that it wasn’t a daunting task, according to Matteo Callea, the Basel program manager — data management for the Royal Bank of Canada. “Basel I was much (simpler), while Basel II was much bigger and a lot of structure was needed,” he said.

Thus, the Royal Bank team began working on its Basel II implementation back in 2004. First up, according to Gadbois-Lavigne, was an assessment of the type of processes and internal controls already in place — Canadian banks have around 50 regulatory bodies to which they are accountable, she said, “so it doesn’t make sense to have to do the same work several times over.”

And, said Gadbois-Lavigne, “It really helps banks if they have already established (best practices of how to deal with new compliance standards). That way, you have a map that is a clear view of what there is to do, reducing the amount of work and the cost.”

Mohammad Rifaie, vice-president of enterprise information management with the Royal Bank of Canada, said the overall solution architecture had three components: the business architecture, the application architecture, and the data architecture. “We then did a gap analysis, and that spawned a roadmap of how this would affect business projects and IT projects.”
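One plausible way to picture that gap analysis is a simple required-versus-existing comparison across the three architecture layers Rifaie names; the capability names below are hypothetical.

```python
# Hypothetical gap analysis across the three architecture layers.
required = {
    "business":    {"risk_signoff_process", "per_client_risk_view"},
    "application": {"analytics_engine", "monthly_loss_reporting"},
    "data":        {"5yr_account_history", "central_warehouse"},
}
existing = {
    "business":    {"risk_signoff_process"},
    "application": {"analytics_engine"},
    "data":        {"central_warehouse"},
}

# Each gap becomes a roadmap item for a business or IT project.
roadmap = {layer: sorted(required[layer] - existing[layer])
           for layer in required}
print(roadmap)
```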

This process was helped along by the bank’s in-house data management team, who, he said, were able to look over the requirements and offer guidance on the project. Rifaie said, “Yet the key to our success is the accountability. It was not a committee — there was an executive vice-president (who presided over everything as well), and there was an overall architecture. The whole solution has to be architected and you have to implement a roadmap.”

Over at TD Bank, the Basel II modeling team worked on building predictive models, said Philip, which would look at customer accounts and predict what would be lost if an account defaulted, for example. “My job was to put the models into a monthly reporting flow that could show us probable loss, i.e. what the bank should put aside (in case of default),” he said.
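For a sense of what such models feed into: under Basel II’s internal-ratings-based approach, expected loss on an exposure is the probability of default times the loss given default times the exposure at default (EL = PD × LGD × EAD). The sketch below applies that formula to made-up accounts; only the formula itself comes from the accord.

```python
# Expected loss per account under the Basel II IRB approach:
#   EL = PD * LGD * EAD
# PD  = probability of default over one year
# LGD = fraction of exposure lost if default occurs
# EAD = exposure at default
accounts = [
    {"id": "A100", "pd": 0.02, "lgd": 0.45, "ead": 250_000},
    {"id": "A200", "pd": 0.05, "lgd": 0.60, "ead": 12_000},
]

for a in accounts:
    print(a["id"], round(a["pd"] * a["lgd"] * a["ead"], 2))

# A monthly run like the one Philip describes would sum this across
# the book to estimate what the bank should set aside.
total_el = sum(a["pd"] * a["lgd"] * a["ead"] for a in accounts)
print("portfolio expected loss:", round(total_el, 2))
```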

Both the Royal Bank and TD Bank IT staffers stuck with adding on to their existing infrastructure rather than implementing a slew of new systems or applications, an approach that suited the entrenched mega-systems typical in the financial industry. There was a side benefit to this strategy as well: “Because of not investing in anything really new, and augmenting our existing structure, we were able to maintain the data lineage, metadata progress and repositories, data quality, and modeling standards,” Rifaie said.

In the end, said Rifaie, the bank didn’t go over-budget. This was helped along by its “minimalist” approach, according to Callea.
