
When trusted IT pros go bad

It’s a CIO’s worst nightmare: a call from the Business Software Alliance, saying that some of the software your company uses might be pirated.

You investigate and find that not only is your software illegal, it was sold to you by a company secretly owned and operated by none other than your own IT systems administrator, who’s been a trusted employee for seven years. When you start digging into the admin’s activities, you find a for-pay porn website he’s been running on one of your corporate servers. Then you find that he’s downloaded 400 customer credit card numbers from your e-commerce server.

And here’s the worst part: He’s the only one with the administrative passwords.

Think it can’t happen? It did, according to a security consultant who was called in to help the victim, a $250 million retailer in Pennsylvania. You never heard about it because the company kept it quiet.

Despite the occasional headlines about IT folks gone rogue, most companies sweep such situations under the rug as quickly and as quietly as possible.

An annual survey by CSO magazine, the U.S. Secret Service and CERT (a program of the Software Engineering Institute at Carnegie Mellon University) routinely finds that three quarters of companies that are victimized by insiders handle the incidents internally, says Dawn Cappelli, technical manager of CERT’s Insider Threat Center. “So we know that [what’s made public] is only the tip of the iceberg,” she says.

By keeping things quiet, however, victimized companies deny others the opportunity to learn from their experiences. CERT has tried to fill that void. It has studied insider threats since 2001, collecting information on more than 400 cases. In its most recent report, which analyzes more than 250 cases, CERT says the most common mistakes include failing to vet job applicants thoroughly, neglecting to adequately monitor the process of granting access privileges, and overlooking red flags in behavior.

But the threats posed by privilege-laden IT employees are especially hard to recognize. For one thing, staffers’ nefarious activities can look the same as their regular duties. IT employees routinely “edit and write scripts, edit code and write programs, so it doesn’t look like anomalous activity,” Cappelli says. They know where your security is weakest and how to cover their tracks.

Victimized companies typically won’t talk, but security consultants who help clean up the messes sometimes do. We talked to three security pros who shared these stunning tales of rogue IT employees.

Pirating Software — and Worse

The Pennsylvania retailer’s tale of woe began in early 2008, when the BSA notified it that Microsoft had uncovered licensing discrepancies, according to John Linkous. Linkous is now chief security and compliance officer at security firm eIQ Networks; his experience with the retailer’s incident dates from his previous job as vice president of operations at Sabera, a now-defunct security consultancy.

Microsoft had traced the sale of the suspect software to a sysadmin at a company that was a Sabera client. For the purposes of this story, we’ll call that sysadmin “Ed.” When Linkous and other members of the Sabera team were secretly called in to investigate, they found that Ed had sold more than a half-million dollars’ worth of pirated Microsoft, Adobe and SAP software to his employer.

The investigators also noticed that network bandwidth use was abnormally high. “We thought there was some kind of network-based attack going on,” says Linkous. They traced the activity to a server with more than 50,000 pornographic still images and more than 2,500 videos, according to Linkous.

In addition, a forensic search of Ed’s workstation uncovered a spreadsheet with hundreds of credit card numbers from the company’s e-commerce site. While there was no indication that the numbers had been used, the fact that the information was in a spreadsheet implied that Ed was contemplating using the card data himself or selling it to a third party, according to Linkous.

The retailer’s chief financial officer, who had originally received the call from the BSA, and others on the senior management team feared what Ed might do when confronted. He was the only one who had certain administrative passwords — including passwords for the core network router/firewall, network switches, the corporate VPN, the HR system, email server administration, Windows Active Directory administration, and Windows desktop administration.

That meant that Ed could have held hostage nearly all the company’s major business processes, including the corporate website, email, financial reporting system and payroll. “This guy had keys to the kingdom,” says Linkous.

So the company and Linkous’ firm launched an operation right out of Mission: Impossible. They invented a ruse that required Ed to fly overnight to California. The long flight gave Linkous’ team a window of about five and a half hours during which Ed couldn’t possibly access the system. Working as fast as they could, the team mapped out the network and reset all the passwords. When Ed landed in California, “the COO was there to meet him. He was fired on the spot.”

The cost: Linkous estimates that the incident cost the company a total of $250,000 to $300,000, which includes Sabera’s fee, the cost of flying Ed to the West Coast on short notice, the cost of litigation against Ed, the costs associated with hiring a temporary network administrator and a new CIO, and the cost of making all of the company’s software licenses legitimate.

Preventive measures: What could have prevented this disaster? Obviously, at least one other person should have known the passwords. But more significant was the lack of separation of duties. The retailer had a small IT staff (just six employees), so Ed was entrusted with both administrative and security responsibilities. That meant he was monitoring himself.

Hall of Shame: A Rogue IT Gallery

The threat from trusted insiders is real. IT employees and contractors have been convicted of hacking, planting logic bombs, and stealing money and code.

2011: A software engineer at British Airways was found guilty of using his position to plan a terrorist attack on behalf of a Yemen-based radical cleric.

2010: An IT employee at Bank of America pleaded guilty to charges that he hacked the bank’s ATMs to dispense cash without recording the activity.

2010: A contract programmer who was fired by Fannie Mae was convicted of planting malicious code that was set to destroy all data on the organization’s nearly 5,000 servers.

2010: A Goldman Sachs programmer was found guilty of stealing computer code for high-frequency trading from the investment bank when he left to join a startup.

2010: A Utah computer contractor pleaded guilty to stealing about $2 million from four credit unions that he performed IT services for.

2008: A systems administrator at Medco Health Solutions who was worried about layoffs planted a logic bomb that would have deleted prescription data from Medco’s network.

2006: A systems administrator at UBS PaineWebber who was disgruntled with his pay and bonuses was found guilty of planting a logic bomb that affected about 1,000 company computers and caused about $3 million worth of damages.

— Compiled by Mitch Betts from press reports.

Separating duties can be a particularly tough challenge for companies with small IT staffs, Linkous acknowledges. He suggests that small companies monitor everything, including logs, network traffic and system configuration changes, and have the results evaluated by someone other than the systems administrator and his direct reports. Most important, he says, is letting IT people know that they are being watched.
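To make that advice concrete, here is a minimal sketch (not drawn from any of the companies in these stories) of the kind of independent check Linkous describes: a short script that hashes a handful of critical configuration files and mails any changes to a reviewer outside the administrator’s chain of command. The watched paths, state file, mail relay and reviewer address are illustrative assumptions, and the script would need to run with sufficient privileges, for example from a root cron job on a host the administrator does not control.

```python
# Minimal sketch of independent configuration-change monitoring.
# Assumptions (not from the article): a Linux host, a short list of watched
# config files, a local mail relay, and a reviewer outside the admin's chain.
import hashlib
import json
import pathlib
import smtplib
from email.message import EmailMessage

WATCHED = ["/etc/passwd", "/etc/sudoers", "/etc/ssh/sshd_config"]  # illustrative paths
STATE_FILE = pathlib.Path("/var/lib/config-watch/state.json")      # hypothetical location
REVIEWER = "audit-reviewer@example.com"                            # someone other than the admin

def digest(path: str) -> str:
    """Return the SHA-256 of a file, or a marker if it cannot be read."""
    try:
        return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()
    except OSError:
        return "UNREADABLE"

def main() -> None:
    old = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    new = {path: digest(path) for path in WATCHED}
    changed = [path for path in WATCHED if old.get(path) != new[path]]

    if changed:
        msg = EmailMessage()
        msg["Subject"] = "Config files changed since last check"
        msg["From"] = "config-watch@example.com"   # hypothetical sender
        msg["To"] = REVIEWER
        msg.set_content("\n".join(changed))
        with smtplib.SMTP("localhost") as smtp:    # assumes a local mail relay
            smtp.send_message(msg)

    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps(new))

if __name__ == "__main__":
    main()
```

The specifics matter less than the principle: the results go to, and are read by, someone the administrator doesn’t manage.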

Second, the company failed to do a thorough background check when it hired Ed. In CERT’s research, 30% of the insiders who committed IT sabotage had a previous arrest. Although the company had run a criminal background check on Ed (which came back clean), it did not verify the credentials on his résumé, some of which were later found to be fraudulent. (He did not, for example, have the MBA that he claimed to have.) Any kind of false credentials should raise a red flag.

Third, Ed’s personality could have been viewed as a red flag. “He seemed to believe that he was smarter than everyone else in the room,” says Linkous, who met Ed face-to-face by posing as an ERP vendor before the sting operation. Ed’s arrogance reminded Linkous of the infamous Enron executives. “He was extremely confident, cocky and very dismissive of other people.”

CERT has found that rogues often have prickly personalities. “We don’t have any cases where, after the fact, people said, ‘I can’t believe it — he was such a nice guy,’” says Cappelli.

Outsourcing Incenses Employee

“Sally,” a systems administrator and a database manager, had been with a Fortune 500 consumer products company for 10 years and was one of its most trusted and capable IT workers, according to Larry Ponemon, founder and chairman of the Ponemon Institute, an IT security research firm.

She was known as a pinch hitter — someone who was able to help solve all kinds of problems. For that reason, she had accumulated many high-level network privileges that went beyond what her job required. “There is this tendency to give these people more privileges than they need because you never know when they’ll need to be helping someone else out,” says Ponemon.

She sometimes worked from home, taking her laptop, which was configured with those high-level privileges. The company’s culture was such that IT stars like Sally were given special treatment, says Ponemon. “The IT people made an end-run around certain policies,” he says. “They could decide what tools they wanted on their systems.”

But when the corporation decided to outsource most of its IT operations to India, Sally didn’t feel so special. Although the company had not yet formally notified the IT staff, says Ponemon, it was obvious to IT insiders that time was running out for most of the department’s employees.

Sally wanted revenge. So she planted logic bombs that caused entire racks of servers to crash once she was gone.

At first, the company had no clue what was going on. It switched to its redundant servers, but Sally had planted bombs in those as well. The company had a hard time containing the damage because the crashes followed no apparent rhyme or reason. “A malicious employee [who’s] angry can do a lot of damage in a way that’s hard to discover immediately and hard to trace later,” Ponemon notes.

Eventually, the company traced the sabotage to Sally and confronted her. In return for Sally’s agreement to help fix the systems, the company did not prosecute her. In addition, Sally had to agree never to talk publicly about the incident. “They didn’t want her going on Oprah and talking about how she broke the backbone of a Fortune 500 company,” says Ponemon.

The cost: The estimated total cost to the company was $7 million, including $5 million in opportunity costs (downtime, disruption to business and potential loss of customers) and $2 million in fees for forensics and security consultants, among other expenses.

Preventive measures: What did the company do wrong? First, the incident is a classic example of “privilege escalation,” which is what happens when privileges are granted to an individual to handle a specific task but are not revoked when the person no longer needs them, says Ponemon.
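As a rough illustration of the kind of periodic entitlement review that catches this privilege creep, the sketch below compares each user’s actual privileges against a baseline for their role and flags anything extra. The roles, users and privilege names are made up for the example; in practice the data would come from a directory service or identity-management system.

```python
# Minimal sketch of a periodic entitlement review to catch privilege creep.
# All role baselines, users and privilege names below are hypothetical;
# real data would be pulled from a directory or IAM system.
ROLE_BASELINE = {
    "dba":      {"db_admin", "backup_restore"},
    "sysadmin": {"server_admin", "backup_restore"},
}

USER_ROLES = {"sally": "dba", "bob": "sysadmin"}

USER_ENTITLEMENTS = {
    "sally": {"db_admin", "backup_restore", "vpn_admin", "hr_system_admin"},
    "bob":   {"server_admin", "backup_restore"},
}

def excess_privileges(user: str) -> set[str]:
    """Return the privileges a user holds beyond their role's baseline."""
    return USER_ENTITLEMENTS[user] - ROLE_BASELINE[USER_ROLES[user]]

if __name__ == "__main__":
    for user in USER_ENTITLEMENTS:
        extra = excess_privileges(user)
        if extra:
            print(f"REVIEW {user}: privileges beyond role baseline -> {sorted(extra)}")
```

Run on a schedule and reviewed by someone outside IT, a report like this turns “temporary” privileges into something that must be explicitly re-justified or revoked.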

Second, an entitlement culture led to no separation of duties and very little oversight of IT. Because of that, management missed an important red flag. After the incident, the company discovered that Sally had “lost” 11 laptops over the previous three years. The help desk staff was aware of this, but no one ever reported it to management, partly because of Sally’s status in the organization. Nobody knows what she did with those laptops; it could be that she was just careless — but “that’s a problem in and of itself if you’re a systems administrator,” Ponemon observes.

Third, given the tense atmosphere created by the outsourcing decision, the company should have been more vigilant and more proactive in monitoring potentially angry employees.

Even if you haven’t announced anything to your employees, it’s a mistake to think they don’t know what’s going on, says Ponemon. “The average rank-and-file [worker] knows within a nanosecond of when the CEO signs the [outsourcing] contract,” he says. If you aren’t already monitoring your IT people, now is the time to start. For best results, kick off the program with a very public pronouncement that you are now monitoring the staff.

According to CERT, many cases of sabotage are the result of a disgruntled employee committing an act of revenge. And such acts can happen in the blink of an eye, as the next story illustrates.

A Firing Gone Wrong

When this Fortune 100 company upgraded its security, it made a nasty discovery. One of its senior system admins, who had been there at least eight years, had surreptitiously added a page to the company’s e-commerce website. If you typed in the company URL followed by a certain string of characters, you got to a page where this admin, whom we’ll call “Phil,” was doing a brisk business selling pirated satellite TV equipment, primarily from China, according to Jon Heimerl, director of strategic security at Solutionary, a managed security services provider hired to address the problem.

The good news: Improved security caught the perpetrator. The bad news: Management botched the firing process, giving him an opportunity to take a parting shot.

Itself a retailer of high-tech equipment, the company wanted to get rid of Phil and his website as quickly as possible because it feared lawsuits from satellite equipment manufacturers. But while Phil’s manager and security staffers were on their way to his office, a human resources representative called Phil and told him to stay put. Heimerl isn’t sure exactly what the HR person said, but it was apparently enough for Phil to guess that the jig was up.

Already logged in to the corporate network, he immediately deleted the corporate encryption key ring. “As he was hitting the Delete key, security and his manager showed up and said, ‘Stop what you’re doing right now, and step away from the terminal,’” according to Heimerl. But it was too late.

The file held all the encryption keys for the company, including the escrow key — a master key that allows the company to decrypt any file of any employee. Most employees kept their own encryption keys on their local systems. However, the key ring held the only copies of encryption keys for about 25 employees — most of whom worked in the legal and contracts departments — and the only copy of the corporate encryption key. That meant that anything those employees had encrypted in the three years since they had started using the encryption system was permanently indecipherable — and thus virtually lost to them.

The cost: Heimerl hasn’t calculated how much money the incident cost the company, but he estimates that the loss of the key ring file amounted to about 18 person-years of lost productivity — a figure that takes into account both the work that went into creating files that are now permanently encrypted and the time devoted to re-creating materials from drafts, old emails and other unencrypted documents.

Preventive measures: Even looking only at what happened after the rogue website was discovered, Heimerl says the company made two crucial mistakes. It should have shut down Phil’s access immediately upon discovering his activities. But managers also left themselves vulnerable by not keeping a secure backup of critical corporate information. (Ironically, the company thought the key ring was so sensitive that no copies should be made.)
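As a general illustration of that second point (and not a description of how this company’s encryption system actually worked), the sketch below keeps an encrypted backup of a key ring file, with the backup key handed to a custodian other than the systems administrator. The file paths and the use of the Python cryptography package are assumptions for the example.

```python
# Minimal sketch of keeping an encrypted backup of a sensitive key ring file,
# with the backup key held by a custodian other than the systems administrator.
# The file paths are hypothetical; requires the "cryptography" package.
from pathlib import Path
from cryptography.fernet import Fernet

KEYRING_PATH = Path("keyring.dat")         # hypothetical key ring file
BACKUP_PATH = Path("keyring.backup.enc")   # encrypted copy for offsite storage

def back_up_keyring() -> bytes:
    """Encrypt the key ring and return the backup key for a separate custodian."""
    backup_key = Fernet.generate_key()     # hand this to someone outside IT
    token = Fernet(backup_key).encrypt(KEYRING_PATH.read_bytes())
    BACKUP_PATH.write_bytes(token)
    return backup_key

def restore_keyring(backup_key: bytes) -> None:
    """Recover the key ring from the encrypted backup using the custodian's key."""
    KEYRING_PATH.write_bytes(Fernet(backup_key).decrypt(BACKUP_PATH.read_bytes()))

if __name__ == "__main__":
    key = back_up_keyring()
    print("Backup written; store this key with a separate custodian:", key.decode())
```

Splitting the backup and its key between two custodians means no single administrator can destroy, or walk off with, the only copy.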

The Best Defense Is Multifaceted

The overall lesson from these horror stories is that no single thing can protect you from rogue IT people. You might have great technical security — like the multitiered security system that ultimately detected Phil’s unauthorized website — and yet a simple mistake by HR can lead to disaster. Or there could be big red flags in terms of behavior or personality that go unnoticed — like Sally’s missing laptops.

It’s a combination of technical safeguards and human observation that offers the best protection, says CERT’s Cappelli.

And yet it’s hard to convince companies to do both. Executives tend to think such problems can be solved with technology alone, at least partly because they hear vendors of monitoring systems and other security products claiming that their tools offer protection. “We’re trying to figure out how to get the message to the C-level people that this is not just an IT problem,” Cappelli says.

It’s a difficult message to hear, and a lesson that many companies only learn the hard way. Even if more companies were forthcoming with the details of their horror stories, most CEOs would still think it could never happen to them. Until it did.

Harbert is a Washington, D.C.-based writer specializing in technology, business and public policy. She can be contacted through her website, TamHarbert.com.
