The Pentagon has recently been robbed of data. So has the US National Intelligence establishment. Not to mention most large financial institutions and many high-technology manufacturers in aerospace, biosciences, IT and the like.
All of them have traditional information and communications technology setups. Logins and passwords, check. Must change passwords, must be strong passwords, can’t reuse passwords, check. Hardened firewalls, locked-down computers, data held server side, check.
How did they get broken into? None of that is good enough in the long run if someone is determined. Get a keylogger onto a laptop or phone, and you’re in (that’s one way).
Here’s the real crime: in many of these cases, the theft wasn’t detected on the spot, when it could have been shut down — and in one of the military/intelligence situations, they don’t even know what was stolen (just how many gigabytes left the network).
You see, the thief did something different: they encrypted everything on the way out. The defenders’ network sniffers captured the data stream, but can’t read it.
The agency involved thought about technology the way a lot of us do: encrypting data, and encrypting all transmissions, is too much overhead.
But it’s what we need to do. If we’re serious about security, we encrypt everything. When it’s stored. When it’s moving.
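For the data-in-motion half, “encrypt everything” mostly means refusing to speak anything but modern TLS. A minimal sketch in Python (the function name is mine; encrypting data at rest needs a real cipher library and is not shown):

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that refuses weak or unauthenticated
    connections -- a baseline for 'everything encrypted while moving'."""
    ctx = ssl.create_default_context()            # loads system CA roots
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3/TLS 1.0/1.1
    ctx.check_hostname = True                     # certificate must match host
    ctx.verify_mode = ssl.CERT_REQUIRED           # unverified peers rejected
    return ctx
```

Every outbound socket in an application would then be wrapped with this context rather than opened in the clear.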
We also don’t trust a login, but reauthenticate connections periodically. “Do you still have the token? Present it. You do; here’s a refreshed one.” You don’t have to bother the person, but you do confirm that a device spoof hasn’t taken over the connection.
We should also use logic to test whether a connection’s duration has suddenly changed, or typing speed has jumped to non-human levels.
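The typing-speed check is simple statistics on keystroke timestamps: machines type implausibly fast, or with implausibly uniform rhythm. A sketch, with thresholds that are purely illustrative:

```python
import statistics

# Hypothetical thresholds -- these would be tuned per environment.
MIN_HUMAN_INTERVAL_MS = 30  # sustained typing faster than this is scripted
MIN_HUMAN_JITTER_MS = 5     # humans never type with machine-like regularity

def looks_scripted(keystroke_times_ms: list[float]) -> bool:
    """Return True if a keystroke timestamp sequence looks non-human."""
    if len(keystroke_times_ms) < 3:
        return False  # too little evidence to judge
    intervals = [b - a for a, b in zip(keystroke_times_ms, keystroke_times_ms[1:])]
    mean_gap = statistics.mean(intervals)
    jitter = statistics.pstdev(intervals)  # spread of the gaps
    return mean_gap < MIN_HUMAN_INTERVAL_MS or jitter < MIN_HUMAN_JITTER_MS
```

A flag here wouldn’t lock anyone out by itself; it would trigger the token reauthentication described above.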
We accept the overheads of workload balancing, message passing, and the like as necessary costs. It’s time we accepted real security overheads as well.
Then we’d be less concerned about rogue users using USB sticks and strange devices — and actually be more secure.