Teamwork enhances RBC

The Global Market Risk Management System at RBC Financial Group spells competitive advantage for Canada’s largest company as measured by assets and market capitalization.

“We’re one of only five banks in the world that can measure market risk at this level on a daily basis,” says Suzanne Haddad, senior vice-president, Corporate Service Technologies, recounting first the business significance of RBC Financial Group’s recent major undertaking. “On the technical side, although the system was extremely complex and introduced new technologies, the team was able to deliver the solution within a very tight timeline and within budget. This was a great achievement that could only have been accomplished through strong teamwork, commitment and in-depth competencies.”

RBC Financial Group processes approximately 400,000 trading transactions and performs more than 300 million computations overnight to independently reassess the market risk on proprietary trading portfolios managed in four global trading centres: Toronto, New York, London and Sydney.

The Global Market Risk Management System, referred to within RBC as GMRM, collects these millions of data points nightly from each day’s trades and calculates value at risk (VaR) and other risk measures at the transaction level. It delivers actionable risk reports to all trading centres overnight, so trading risk managers and senior executives know what is on the books before the 8 a.m. opening bell and can plan actions to mitigate the risks. It also provides in-depth risk reporting and analytics for legislators, regulators and RBC’s executive and board of directors. As such, it enables the company to readily meet the increasing requirements of the Office of the Superintendent of Financial Institutions (OSFI) for the use of models to measure specific risk on credit-risky financial instruments. And because it is flexible and scalable enough to cover additional products, it adds capacity for new business through organic growth and acquisitions.
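For readers unfamiliar with the measure, value at risk estimates the loss a portfolio should exceed only with a given small probability over a set horizon. The sketch below is a minimal historical-simulation VaR in Python and is purely illustrative: the function, the 99 per cent confidence level and the randomly generated profit-and-loss scenarios are assumptions for this article, not details of RBC’s models or of Algorithmics’ RiskWatch engine.

```python
import numpy as np

def historical_var(pnl_scenarios: np.ndarray, confidence: float = 0.99) -> float:
    """Historical-simulation VaR: the loss threshold the portfolio's
    P&L distribution exceeds with probability (1 - confidence).

    pnl_scenarios: daily profit-and-loss values, one per scenario
    (losses are negative numbers).
    """
    # VaR is the (1 - confidence) quantile of the P&L distribution,
    # reported as a positive loss figure.
    return -np.quantile(pnl_scenarios, 1.0 - confidence)

# Illustrative only: 1,000 hypothetical daily P&L scenarios for one portfolio.
rng = np.random.default_rng(seed=42)
scenarios = rng.normal(loc=0.0, scale=250_000.0, size=1_000)

print(f"1-day 99% VaR: ${historical_var(scenarios):,.0f}")
```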

“The previous version of the system didn’t have the granularity the business people needed to drill down to the trade position level to decipher why the VaR was higher than expected, exceeding thresholds, or just out of whack,” reports Maureen Murphy, group manager, risk management technology. “The second generation of GMRM enables the analysts to dig in and find out that, for example, an incorrect amount was input on the trading ticket, so the trade can be corrected and the calculation rerun the next day.”

“The difference between this system and its forerunner is that the first system was geared toward calculating the VaR as a compliance initiative, in accordance with the regulator’s 1998 requirements for computing risk measures,” adds James Liao, senior project manager. “The second system is strategically geared toward delivering information. We have more information on each transaction and an overall integration of information and technology, so we’re ready for more organic growth.”

He also describes the first-generation GMRM as “a custom-built program that mapped data to a risk engine to calculate the VaRs. The second one was a best-of-breed approach to integrate different technologies.”

Haddad explains that the current GMRM introduced three components: a centralized data warehouse, new specific-risk models and stress-testing scenarios, and business intelligence tools to mine the data.

“What was one of our thrills on this project was crafting and designing a system that had not been attempted before,” Murphy adds. But that brought technology challenges in making vendor hardware and software products work together with in-house-built components to deliver the solution. She recalls it took a lot of time to test and ensure the environment was right before getting to the actual application. The new system also ran in parallel with its predecessor for four months before going live. During that time its results were provided to OSFI as further proof that the mathematical models and approach were sound.

The system runs on a Sun Unix platform with a Compaq SAN, an Oracle database and Veritas storage management software. Murphy says they have several tools to fill the gaps in how those pieces connect.

They use RiskWatch from Toronto-based Algorithmics Inc. as the risk calculation engine. Because the trading systems feeding into GMRM were each built for their own purposes, their input data formats are very diverse. The project team standardized the input format with a mapping tool from Evolutionary Technologies International (ETI) to support the extract, transform and load (ETL) process. Liao says RBC is also building a custom tool for another layer of mapping that translates data into the risk engine format, which lets the business integrate new mapping rules faster and more easily than before.
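To illustrate the kind of rule such a mapping layer applies, the sketch below normalizes two hypothetical trade-feed formats into one canonical record for a risk engine. The field names and schemas are invented for illustration; they do not reflect RBC’s, ETI’s or Algorithmics’ actual formats.

```python
from dataclasses import dataclass

@dataclass
class RiskEngineRecord:
    """Canonical trade record expected by the risk engine (illustrative schema)."""
    trade_id: str
    instrument: str
    notional: float
    currency: str

def map_equity_feed(row: dict) -> RiskEngineRecord:
    # Hypothetical equity desk feed: quantity and price arrive separately.
    return RiskEngineRecord(
        trade_id=row["ticket_no"],
        instrument=row["symbol"],
        notional=float(row["qty"]) * float(row["price"]),
        currency=row["ccy"],
    )

def map_fx_feed(row: dict) -> RiskEngineRecord:
    # Hypothetical FX desk feed: notional arrives as a single amount.
    return RiskEngineRecord(
        trade_id=row["deal_ref"],
        instrument=f"{row['base']}/{row['quote']}",
        notional=float(row["amount"]),
        currency=row["base"],
    )

# Dispatch table: adding a new feed means registering one more mapping rule,
# which is the "faster and easier" integration the article describes.
MAPPERS = {"equity": map_equity_feed, "fx": map_fx_feed}

feed_rows = [
    ("equity", {"ticket_no": "EQ-1001", "symbol": "RY", "qty": "500",
                "price": "52.10", "ccy": "CAD"}),
    ("fx", {"deal_ref": "FX-2002", "base": "USD", "quote": "CAD",
            "amount": "1000000"}),
]

for feed_type, row in feed_rows:
    print(MAPPERS[feed_type](row))
```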

Microsoft Analysis Services is used as a quick multi-dimensional analytical tool to mine the data.
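Analysis Services cubes are typically queried with MDX; the Python sketch below uses an invented pandas DataFrame only to illustrate the same multidimensional idea of rolling risk figures up by dimension and drilling back down to the trades behind an outlier.

```python
import pandas as pd

# Invented sample: per-trade risk figures with two drill-down dimensions.
trades = pd.DataFrame({
    "centre":  ["Toronto", "Toronto", "New York", "London", "Sydney"],
    "product": ["equity", "fx", "equity", "rates", "fx"],
    "risk":    [120_000.0, 45_000.0, 310_000.0, 80_000.0, 15_000.0],
})

# OLAP-style roll-up by centre and product. (Simple summation ignores
# diversification; real VaR aggregation is more involved. This only
# shows the slicing.)
cube = trades.pivot_table(values="risk", index="centre", columns="product",
                          aggfunc="sum", fill_value=0.0)
print(cube)

# "Drill down": isolate the trades behind the largest cell.
print(trades[(trades["centre"] == "New York") & (trades["product"] == "equity")])
```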

Liao says the new system is capable of storing and managing more than 40 days of transactional information, compared with the old system’s seven-day capacity. The old system, he says, stored numbers only at the top portfolio level, while the new one manages data at the transaction level.

The first GMRM system was not at this advanced level for two key reasons. One is that around 1996 there were few data processing applications that could handle this type of volume or level of complexity, and those that existed would have been hugely expensive, Liao explains. Secondly, the demand or need had not yet been identified. Haddad explains that the new system was the result of continuously improving the management and mitigation of market risk.

“This is an ongoing process,” she notes. “As new demands/business needs are defined, we continuously need to upgrade our systems. With the business and technology folks in close partnership, there is greater opportunity to be more proactive.”

Listening to Murphy describe the project team, one can understand how that close partnership came about. For one, the two sides were roughly equal in number: the project team included 12 to 18 people on the tech side and the same number on the business side. The teams were co-located, which enabled constant communication and discussion between team members. They also began in step with what Murphy describes as “a real recognition by both parties as to what knowledge and experience they were each bringing to the table.”

Cross-training at the beginning of the project built on that recognition, she says, ensuring that the business people gained understanding of some of the technology limitations and challenges, while the tech group learned more about what the business needed to do in terms of foreign exchange, equity trading and interest rate risk.

“So from a technology perspective we better understood what the data needs were,” she explains. “It’s not so much what these people did with the data but why they needed it. Then we were able to explain to them what our challenges were in trying to deliver. Right from the beginning, a concerted effort was made to ensure that both parties understood what the business people did with the data and why they needed it.”

She says it helped that there were very few player changes on either side of the team from the requirements and architecture right through to the development and implementation. And, when the team needed to be supplemented with contract help for the coding, for example, consideration was given to finding the right people to fit in the group in terms of skills and personality.

“Business people tend to look at what happens 90 per cent of the time, and most of the coding is for the 10 to 20 per cent that doesn’t happen often, but you don’t want the system to crash if and when the unexpected happens,” Murphy adds. “So it is getting them to think about exceptions and exception processing: what do you want us to do? Default these values? Go back and take yesterday’s data files?”
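The kind of exception-processing decision Murphy describes can be sketched as a fallback chain: use the figure as supplied if it is sane, otherwise fall back to yesterday’s value, otherwise substitute an agreed default. The function and rules below are hypothetical, intended only to make concrete the questions the team put to the business side.

```python
def resolve_notional(raw_value: str | None, default: float,
                     yesterdays_value: float | None) -> float:
    """Exception processing for a bad or missing trade amount (illustrative).

    Preference order, as agreed with the business side:
    1. use the supplied value if it parses and is positive;
    2. otherwise fall back to yesterday's figure for the same trade;
    3. otherwise substitute an agreed default rather than crash the run.
    """
    try:
        value = float(raw_value)
        if value > 0:
            return value
    except (TypeError, ValueError):
        pass  # unparseable or missing input: fall through to the fallbacks

    if yesterdays_value is not None:
        return yesterdays_value
    return default

print(resolve_notional("250000", default=0.0, yesterdays_value=240_000.0))  # 250000.0
print(resolve_notional("oops",   default=0.0, yesterdays_value=240_000.0))  # 240000.0
print(resolve_notional(None,     default=0.0, yesterdays_value=None))       # 0.0
```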

Mutual respect, co-location, cross-training, consistent team players, sharing information – RBC seems to have found the winning formula for teamwork that ensures IT capabilities bring business value.
