Welcome to Cyber Security Today. This is the Week in Review for the week ending Friday, April 21st, 2023. I’m Howard Solomon, contributing reporter on cybersecurity for ITWorldCanada.com and TechNewsday.com in the U.S.
In a few minutes Terry Cutler of Montreal’s Cyology Labs will be here to comment on recent news. But first a look at some of the headlines from the past seven days:
Fortra issued its initial analysis of what led to the compromise of customers that used its GoAnywhere MFT file transfer platform. It was, as everyone knows by now, an exploitation of a zero-day vulnerability. But Terry and I will look at other findings in the report that suggest IT departments could have stopped the attacks.
We’ll also look at a report from researchers at ESET who found IT departments aren’t sanitizing outdated routers before they’re sold on the used market.
We’ll have thoughts on a proposal by the U.S. to toughen cybersecurity requirements of companies that are allowed to handle federal government data.
And just as we were about to record this show 3CX issued a report about its supply chain attack last month. It started with an employee downloading an infected trading app on their personal computer. We’ll look at that report.
Also in the news, some Canadian artificial intelligence experts and startups urged the federal government to quickly pass a proposed AI law that would regulate use of the technology here. An imperfect law is better than nothing, some argue. But opponents say the law is too flawed as it is.
NCR was hoping to have secure access to much of its Aloha restaurant point-of-sale system restored today after a ransomware attack. Some customers had been offline for days.
QuaDream, which makes spyware for smartphones that governments and police departments allegedly use against political opponents and reporters, is closing. This came after researchers at the University of Toronto’s Citizen Lab and Microsoft published critical reports.
Also this week Citizen Lab published a report on another commercial spyware company, the NSO Group. It says last year the company created at least three zero-click exploits that can compromise iPhones. Examples were seen on victims’ phones in Mexico. High-risk iPhone users like reporters and human rights defenders are advised to turn on the devices’ Lockdown Mode.
Police forces have again called on the tech industry to re-think adding end-to-end encryption on their platforms for user privacy. The 15 police agencies that are members of the Virtual Global Taskforce — including the RCMP, the FBI and the U.K. National Crime Agency — said such privacy protections hinder the investigation of communications of alleged child sexual abusers. They ask the industry to only add privacy protections with safety solutions that allow police to identify potential abusers. This comes after Meta said end-to-end encryption will be added to Facebook and Instagram.
Finally, the U.K.’s National Cyber Security Centre issued a reminder that pro-Russian cyber groups are getting more imaginative in their attacks against western countries. The centre issued recommendations, which are worth reading, on how IT departments can better secure their systems.
(The following is an edited transcript of one of the topics discussed. To hear the full conversation play the podcast)
Howard: Just as we were about to record this podcast 3CX issued a report on how its business phone app was compromised last month. An employee downloaded an infected trading app on their personal computer. The hacker used that access to get into 3CX and compromise its software build, so when customers downloaded the 3CX client it passed [approval by Windows] because there was a legitimate but compromised digital certificate. Well, remember that trading app? The employee’s computer accepted it because there was also a compromised legitimate digital certificate in it. In other words, a supply chain attack led to a supply chain attack. This sounds terrible.
Terry Cutler: Yeah. This was downloaded on a personal computer. First, I don’t know why we’re still letting employees’ personal computers connect to the corporate network. I thought we learned our lesson in 2020 when there were over 4 million cyber attacks against remote workers. I know some organizations give incentives to employees to buy their own laptops if the firm doesn’t want to buy them, but there needs to be technology in place that at least will scan the device to make sure its security is up-to-date before it connects to the IT network. At least implement a zero-trust security model, because employees [working from home] are not cybersecurity experts. They’re relying on IT to make everything secure.
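The pre-connect screening Terry describes can be sketched as a deny-by-default posture check. Everything below is illustrative: the attribute names, thresholds and the `admit` function are assumptions for the sake of the sketch, not the API of any real network-access-control product.

```python
from dataclasses import dataclass

@dataclass
class DevicePosture:
    # Illustrative attributes an endpoint agent might report before a
    # device is allowed onto the corporate network (names are hypothetical).
    os_patch_age_days: int   # days since the last OS security update
    av_enabled: bool         # is endpoint protection running?
    disk_encrypted: bool     # is full-disk encryption on?

def admit(device: DevicePosture, max_patch_age: int = 30) -> bool:
    """Deny by default: the device must pass every check to connect."""
    return (
        device.os_patch_age_days <= max_patch_age
        and device.av_enabled
        and device.disk_encrypted
    )

# A patched, protected laptop is admitted; a stale one is turned away.
ok = admit(DevicePosture(os_patch_age_days=5, av_enabled=True, disk_encrypted=True))
stale = admit(DevicePosture(os_patch_age_days=90, av_enabled=True, disk_encrypted=True))
```

The design point is the conjunction: one failed check is enough to refuse the connection, which matches the zero-trust stance of assuming a personal device is unsafe until proven otherwise.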
Howard: Certainly it shows that security issues can start with the personal PCs of employees who can also log into the corporate environment. But the company has some responsibility to screen every employee’s computer for malware when they log in. This incident is a reminder that IT departments need to impose tough screening.
Terry: This is where it can get really touchy. When I worked for a private investigation firm we saw cases where a personal device was malfunctioning on the network and IT wiped it, including all the employee’s family photos and personal files. The company was sued by the employee. That’s why I think it’s very important that organizations get away from employees using their personal computers and just give them corporate devices.
Howard: One solution is for IT to have a mobile device management application that segments the employee’s computer: There’s a partition for personal stuff and then there’s a partition for corporate.
Terry: In a perfect world it works great. But unfortunately, home users are a special breed. Things often go wrong.
Howard: Is it hard to blame the employee in this incident? The trading app they installed on their own PC had a properly signed digital certificate.
Terry: I don’t think they should be blaming the employee at all, because employees aren’t aware of all the risks associated with their personal devices. Especially if they’re allowed to use them for work purposes. Even though you give employees user awareness training, they’re not cybersecurity experts. They’re doing their job. There needs to be proper technology in place to kick in once the employee does something wrong.
Howard: But if you’re going to say to the employee you can have a personal computer that’s also used for work, shouldn’t there be rules saying here’s the list of the only applications you can put on that personal computer?
Terry: ‘Hey, it’s my personal computer I can do what I want.’ You always hear that. Remember also, that the trading app had a proper digital certificate. How is the user supposed to know it had malware?
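One practical point behind Terry’s question: a valid code-signing certificate alone doesn’t prove an installer is the one the vendor intended to ship, but comparing the file’s SHA-256 hash against a checksum the vendor publishes out-of-band gives a second, independent check. A minimal sketch in Python; the file name and its contents here are made up for the demonstration:

```python
import hashlib
import os

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large installers don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a throwaway file; in practice you'd hash the downloaded
# installer and compare against the checksum on the vendor's website.
with open("installer.bin", "wb") as f:
    f.write(b"example installer bytes")

expected = hashlib.sha256(b"example installer bytes").hexdigest()
matches = sha256_of("installer.bin") == expected
os.remove("installer.bin")  # clean up the demo file
```

This wouldn’t have helped if the attacker also controlled the published checksum, but it raises the bar: the attacker must compromise two separate channels instead of one.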
Howard: Here’s an interesting angle to this story: Remember that trading app the employee downloaded? It was no longer supported by the company. Somehow, between the point when the company stopped supporting the app and the time when the employee downloaded it, an attacker got into it, when the developer wasn’t looking, and compromised it.
Terry: We’re going to see a lot of that in the future. We’ll see more supply chain attacks because there are so many vulnerabilities. Developers need to start coding with security in mind. I know we’ve been talking about that for years, but they’re under strict [development] deadlines and maybe they don’t have the expertise for cyber security. So they’re trying to build apps as quickly as possible, but a lot of times they’re full of holes. They need to start doing more regular web application penetration tests. Find out where the holes are. As for the app that was no longer supported, maybe they left it on the shelf, or got rid of the development team or replaced it. We don’t know exactly what happened.
Howard: But if a company decides that an app is not going to be supported anymore why is it still on the company’s website for people to still download it?
Terry: Because a lot of times users still need it. Maybe they have old files they need to open. I still sometimes use software from 2014 that’s no longer supported that I use for mapping.
Howard: To show it’s taking security seriously, 3CX announced that it’s taking seven steps to beef up cyber security. These are good steps, but shouldn’t they have been done sooner?
Terry: Unfortunately we only take action after it’s too late. And the problem with these zero days is they’re very, very, very hard to detect. So going forward, with more testing more vulnerabilities will be found. But there’s another service you can look at called adversarial testing. It’s not well-known but it’s a service where you can deploy a machine in the environment and run specialized scripts that mimic weird behavior, ransomware attacks, all the things that should set off alarm bells throughout the IT system. That way you get to know what’s working and what’s not.
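The adversarial-testing idea Terry describes, scripts that generate ransomware-like behaviour to see whether alarms actually fire, can be approximated very simply. The sketch below only touches files inside its own temporary scratch directory and is a toy stand-in for purpose-built breach-and-attack-simulation tools, not one of them; the function name and file pattern are invented for illustration:

```python
import os
import tempfile

def simulate_mass_rewrite(n_files: int = 20) -> int:
    """Create files in a scratch directory, then rapidly overwrite and
    rename them with a fake '.locked' extension. The resulting burst of
    modify-and-rename events is the kind of behaviour a ransomware
    detector should flag."""
    touched = 0
    with tempfile.TemporaryDirectory() as scratch:
        # Stage some harmless "documents" to act on.
        for i in range(n_files):
            path = os.path.join(scratch, f"doc_{i}.txt")
            with open(path, "w") as f:
                f.write("harmless test data")
        # Rapid overwrite + rename, mimicking encryption activity.
        for i in range(n_files):
            path = os.path.join(scratch, f"doc_{i}.txt")
            with open(path, "w") as f:
                f.write("XXXX")              # clobber the contents
            os.rename(path, path + ".locked")  # rename burst
            touched += 1
    return touched  # directory and its files are deleted on exit

events = simulate_mass_rewrite()
```

The point isn’t the file churn itself but what it tells you: if this kind of burst runs on a monitored endpoint and nothing alerts, you’ve learned which of your defences are working and which are not, exactly the feedback Terry describes.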