Trust sinks secure ships

The last time you went to a computer industry trade show, you probably picked up a few trinkets handed out by vendors: pens, mugs, stress balls. What if a vendor offered you a handy little 1GB thumb drive if you agreed to watch a 10-minute demo? Would you take it home and use it? Heck, yeah!

What happens if that thumb drive isn’t as innocent as it looks? Maybe it comes preinstalled with a Trojan horse that’s going to infect your PC and send private information back to a hacker. Do you think this is far-fetched? Well, think again.

A recent article on the Dark Reading Web site discusses a situation in which a credit union asked an ethical hacking company to test its computer security practices. The consultant dropped a few USB thumb drives in conspicuous places around the building where employees would find them and access them to see what was on them. Unknown to the employees, the security consultant had preinstalled Trojan software that would load onto the unsuspecting employees’ PCs and send configuration data back to the consultant via e-mail. Of the 20 thumb drives planted, 15 were used as intended.

The weak link, of course, is people. We’re so darn trusting, naive and greedy. It’s called social engineering, and it’s all about trust.

Social engineering is the principle behind phishing, pretexting, “gimme” schemes in which you seemingly get something for nothing, and other methods criminals use to gain your confidence before doing their dirty work.

The Art of Deception by criminal-turned-ethical-hacker Kevin Mitnick discusses how social engineering can be combined with computer hacking to wreak havoc on corporate, public and private networks. The lesson is that no matter how strong you build technology solutions to security issues, people can and do give up the keys to the kingdom.

Over the past year, I’ve looked at all kinds of security technologies. All these products do exactly what you tell them to do, and they rarely fail. But tell someone to protect his password, and he’ll move the sticky note with the password from the monitor to his top desk drawer. The human element is far more fallible than technology.

We can’t assume people will be smart about the way they use computers. A recent study conducted by the University of California at Berkeley and Harvard University found that a well-designed phishing Web site can fool users 90 per cent of the time.

What can be done about social engineering and the human weak link of computer security? The best thing you can do is create awareness and provide training for your employees. Honest people want to be a part of the solution, not part of the problem. They need to be taught about the threats and vulnerabilities — especially the ones relying on social engineering.

Mitnick cites his own case of stealing source code from Motorola years ago by sounding authoritative on the phone. He posed as a Motorola employee, and a real employee went out of her way to send him code that he claimed he needed for a project. The only problem was that it wasn’t a Motorola project. A trusting person didn’t think to challenge his story before giving him trade secrets.

So the next time you call or e-mail me, forgive me if I’m automatically suspicious of you. I’m trying to heighten my awareness of security lapses due to social engineering, and you and your company should, too.

QuickLink: 069671
