How much risk can you handle?

Time for IT security professionals to do a gut check over the risks of consumerization.

If you don’t do a quantitative risk analysis of employee-owned devices in the enterprise, “you might as well go with your gut,” rather than rely on a qualitative or quasi-quantitative risk analysis, said Peter Davis, principal of risk analysis firm Peter Davis & Associates.

Davis was speaking at the 2013 SC Congress security conference in Toronto on Tuesday.

The good news: subjectivity and objectivity aren’t discrete binaries, but rather a spectrum of data-supported decision-making. “It always comes down to a human at some point,” Davis said. But the goal is to reduce subjectivity in risk analysis, because subjective judgment is less precise and more uncertain.

But just having data doesn’t mean your risk analysis is any more accurate. You can’t aggregate the colours of a threat survey. Polar diagrams “represent and mean nothing.” Just because it’s structured and formal doesn’t make it meaningful. “Astrology is both structured and formal,” Davis said.

Good data and good models are necessary for a good risk analysis. It starts with the information you have within your own system and supply chain, but there’s plenty of other data to draw on to determine the value of assets, systems and processes.

“You can find exemplars in other industries,” he said. Much of the modeling of risk analysis for disaster recovery, for example, is drawn from the nuclear power industry. There are also many relevant surveys that can address the probability of, for example, someone losing a smart phone, which can be factored into the analysis.

There are roadblocks, though, in people’s poor grasp of probability. Ask a person to draw a red marble blindly from one of two urns, Davis suggests, telling them one urn holds one red marble in 10 and the other eight in 100. Most people will choose the urn with the most red marbles, even though the other offers a two-percentage-point higher probability (10 per cent versus eight), he said.
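
Davis’s urn example is easy to check with a couple of lines of arithmetic (a quick sketch; the urn labels are illustrative):

```python
# The urn with fewer red marbles in absolute terms actually offers
# the better odds of drawing red.
urn_a_red, urn_a_total = 1, 10    # one red marble in ten
urn_b_red, urn_b_total = 8, 100   # eight red marbles in a hundred

p_a = urn_a_red / urn_a_total   # 0.10
p_b = urn_b_red / urn_b_total   # 0.08

print(f"Urn A: {p_a:.0%}, Urn B: {p_b:.0%}")  # Urn A: 10%, Urn B: 8%
```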

“Information security people don’t like to talk about probability,” Davis said; they prefer to focus on the mere possibility of a threat.

Davis advocates an open source risk analysis process called the FAIR (Factor Analysis of Information Risk) standard, created by the Open Group. The standard models risk as a factor of loss frequency and loss magnitude. Loss frequency breaks down into threat frequency and vulnerability (which he defines as the probability that the threat capability exceeds the ability to resist the threat; think Superstorm Sandy versus the New York subway). Loss magnitude breaks down into primary and secondary losses, and so on.

“It keeps decomposing,” Davis said.
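
As a rough sketch, the top of that FAIR factor tree might be expressed as follows. The point-estimate arithmetic is a simplification (FAIR works with ranges rather than single numbers), and every figure below is invented for illustration:

```python
# Point-estimate sketch of the top of the FAIR factor tree:
# risk = loss frequency x loss magnitude.
def loss_frequency(threat_frequency, vulnerability):
    """vulnerability: probability (0..1) that the threat capability
    exceeds the ability to resist the threat."""
    return threat_frequency * vulnerability

def risk(threat_frequency, vulnerability, primary_loss, secondary_loss):
    loss_magnitude = primary_loss + secondary_loss
    return loss_frequency(threat_frequency, vulnerability) * loss_magnitude

# e.g. 12 threat events/year, a 25% chance each overwhelms defences,
# $40,000 primary and $10,000 secondary loss per event (all invented)
print(f"${risk(12, 0.25, 40_000, 10_000):,.0f}/year")  # $150,000/year
```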

Minimum, maximum and most likely (or mode) values for each variable then feed a Monte Carlo simulation, which repeatedly draws random samples from the resulting distributions.
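
That sampling step can be sketched with the Python standard library’s triangular distribution; all the input figures here are invented for illustration:

```python
import random
import statistics

# Each min/most-likely/max triple is treated as a triangular
# distribution and sampled repeatedly; the products form a
# distribution of annual losses.
random.seed(42)  # reproducible runs

def simulate(n=100_000):
    losses = []
    for _ in range(n):
        events_per_year = random.triangular(0.5, 6.0, 2.0)         # min, max, mode
        loss_per_event = random.triangular(5_000, 200_000, 25_000)
        losses.append(events_per_year * loss_per_event)
    return losses

losses = simulate()
print(f"mean annual loss: ${statistics.mean(losses):,.0f}")
print(f"95th percentile:  ${statistics.quantiles(losses, n=20)[-1]:,.0f}")
```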

Eugene Taylashev, manager of information security for International Financial Data Services, described three other models of risk analysis.

A simple threat assessment uses business language and processes oriented toward line-of-business managers. The seven-step process relies on the knowledge of the businesspeople involved.

It begins with a list of concerns. For example, moving the front end of a Web site to a cloud services vendor might cause the front end to become unavailable, possibly violating a customer SLA; the data might also be exposed or corrupted, with either leading to penalties for the company. The steps involve separating the event from the impacts (unavailability is an event, the SLA violation an impact); estimating the frequency and impact on a logarithmic scale; producing a risk heat map; and developing risk treatment options.
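
The logarithmic scoring step might be sketched like this; the scale bands and the example concerns are assumptions for illustration:

```python
import math

# Frequency and impact sit on logarithmic scales, so each step up a
# band is an order of magnitude; the sum of the two bands places a
# concern on the heat map.
def risk_score(events_per_year, impact_dollars):
    return round(math.log10(events_per_year)) + round(math.log10(impact_dollars))

concerns = [
    # (concern, estimated events/year, estimated impact in $) -- invented
    ("Cloud front end unavailable (SLA breach)", 2.0, 50_000),
    ("Customer data exposed (penalties)", 0.1, 1_000_000),
    ("Data corrupted (penalties)", 0.05, 20_000),
]

for name, freq, impact in sorted(concerns, key=lambda c: -risk_score(c[1], c[2])):
    print(f"score {risk_score(freq, impact):>2}  {name}")
```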

A failure modes and effects analysis (FMEA) is better suited to technologists, but still requires only a single spreadsheet page. Threat probability, vulnerability severity and ability to detect are rated from one to three and multiplied together for each asset; this produces a risk priority number.
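
A minimal sketch of that calculation, with invented assets and ratings:

```python
# Three one-to-three ratings multiplied together give each asset a
# risk priority number (RPN); higher numbers get attention first.
def risk_priority_number(probability, severity, detectability):
    """Each rating runs from one (low) to three (high)."""
    if not all(r in (1, 2, 3) for r in (probability, severity, detectability)):
        raise ValueError("ratings run from one to three")
    return probability * severity * detectability

assets = {
    # asset: (threat probability, vulnerability severity, ability to detect)
    "customer database": (2, 3, 3),
    "public web server": (3, 2, 1),
    "build pipeline":    (1, 2, 2),
}

for asset, ratings in sorted(assets.items(),
                             key=lambda kv: risk_priority_number(*kv[1]),
                             reverse=True):
    print(f"RPN {risk_priority_number(*ratings):>2}  {asset}")
```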

The ISO 27005 standard is more rigorous and in-depth. Anything with business value is listed: primarily business processes and information, but also the supporting systems, personnel and supply chain. All related threats are evaluated on the basis of their likelihood versus their impact on business value. Risks over a certain threshold are prioritized for treatment.


Dave Webb
Dave Webb is a freelance editor and writer. A veteran journalist of more than 20 years' experience (15 of them in technology), he has held senior editorial positions with a number of technology publications. He was honoured with an Andersen Consulting Award for Excellence in Business Journalism in 2000, and several Canadian Online Publishing Awards as part of the ComputerWorld Canada team.
