
SecTor 2020: Don’t point a finger too fast after a hack, says expert

Image: People pointing fingers at a person (Source: Ponomariova_Maria | Getty Images)

Infosec pros commonly believe that people are an organization's biggest weakness. Clicking on malicious attachments, misconfiguring software and hardware, and choosing bad passwords are blamed for the majority of data breaches.

But one expert says the world is too quick to point fingers at people for cyber incidents. “Human error is never the cause, it’s a symptom of the underlying systematic problems,” Mark Sangster told the SecTor 2020 virtual conference on Wednesday.

Sangster, vice-president of industry security strategies at Waterloo, Ont.-based managed security provider eSentire, agreed that "we can learn more from our mistakes than from our successes."

However, he added, too often people introduce biases that block their ability to find the real causes of incidents.

These include "hindsight bias" (if that software had been patched, the breach wouldn't have occurred) and "time bias" (if that field goal kicker hadn't missed his last-second attempt, we'd have won the game. But what about all the missed plays that led to the game being tied, so the kicker had to be called in?).

Or, Sangster said, we forget about management not giving infosec teams the money to buy needed security software and hire people with expertise to better protect the enterprise.

“When we blame people, we miss the chance to learn,” he said.

Aircraft, oil rig incidents

There are examples of this scenario outside the IT world, Sangster argued. For example, after an Air Canada flight that ran out of fuel at 41,000 feet landed safely at a small airfield in Manitoba in 1983, the pilot was acclaimed as a hero. Then he and other staff were disciplined for errors, including wrongly converting the fuel load from imperial to metric units. No one faulted the regulators for requiring Canadian airlines to convert to metric.

Another example Sangster cited was the loss of 11 people after the Deepwater Horizon drilling rig in the Gulf of Mexico caught fire and sank. The operators of the rig were faulted, but the project was under pressure to get work done, and there were chain-of-command conflicts. The rig was also drilling at depths never encountered before. In short, "they were flying by the seat of their pants."

There’s no shortage of cybersecurity-related “blame the person” examples, Sangster said. After Montreal’s Desjardins Group acknowledged a huge data breach in 2019 at its credit union, it blamed an employee for sharing personal information. But, asked Sangster, how did that happen? Why weren’t events logged and flagged?

"It's like the organization said, 'I found my last culprit, I've used my hindsight bias to say it shouldn't have happened, I'm using my outcome bias to blame them and my time bias to say that's the last link in the chain.'"

‘Stop blaming one actor’

Similarly, after the Capital One data breach involving six million credit card applications, an external threat actor was blamed. But it later emerged the financial conglomerate had long been warned about security issues.

"It's time to stop blaming one actor, whether it's a human or a mechanical piece, and look at systematic causes," Sangster said. "Ask what is responsible, not who. Understand why they made the decision. Good people make good decisions. Sometimes they don't work. They have bad outcomes.

"Seek forward accountability. This isn't about blame. Take outcome bias and bury it in the ground. When you start out looking for a throat to choke, you are effectively trying to close the case without determining what all the evidence is."
