The U.S. Office of Personnel Management and the National Security Agency are experimenting with user behavioral analytics (UBA) as a way to mitigate insider threats, it was reported this month. UBA can be an effective control against insider threats, but deploying it may be an uphill struggle: experts warn that the technology can be challenging to implement because the data it consumes is so wide-ranging, and the implications so far-reaching.
The idea behind UBA is to create a baseline of user behavior. Typically, companies analyze a user’s interactions with an IT system, looking at when and from where they log into corporate systems, what they do with enterprise applications and what kinds of files they access. The UBA tool can then monitor future behavior and give employees a constantly updated risk score based on how far they deviate from perceived norms.
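The baseline-and-deviation idea can be sketched in a few lines. This is a hypothetical illustration of the general approach, not any vendor's actual scoring model; the feature names and threshold are invented for the example.

```python
from statistics import mean, stdev

def risk_score(history, current):
    """Score how far a user's current activity deviates from their own baseline.

    history: list of dicts of daily feature counts (hypothetical features,
             e.g. files accessed, off-hours logins).
    current: today's counts for the same features.
    Returns a sum of per-feature z-scores; only upward deviations add risk.
    """
    score = 0.0
    for feature in current:
        values = [day[feature] for day in history]
        mu, sigma = mean(values), stdev(values)
        if sigma == 0:
            continue  # no variation in the baseline for this feature
        z = (current[feature] - mu) / sigma
        score += max(z, 0.0)  # behaving below the norm does not raise risk
    return score

# A quiet baseline, then a day with a large spike in file access
baseline = [
    {"files_accessed": 20, "offhours_logins": 0},
    {"files_accessed": 25, "offhours_logins": 1},
    {"files_accessed": 22, "offhours_logins": 0},
    {"files_accessed": 18, "offhours_logins": 0},
]
today = {"files_accessed": 400, "offhours_logins": 3}
print(risk_score(baseline, today))  # large deviation, high score
```

Real products model far richer behavior (peer-group comparison, seasonality, sequence patterns), but the core mechanic is the same: score today's activity against the user's own history.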
Taryn Aguas, principal in tech strategy and cyber risk services at Deloitte & Touche, said that UBA is still evolving as a product category.
“It’s still pretty immature. There are a small set of vendors with effective products, growing rapidly,” she said. “It’s at the stage where a lot of large clients are looking at it and purchasing these products, but they’re not going through large-scale implementations yet.”
UBA can be used to detect fraudulent customer behavior, but is most often viewed as a means of mitigating insider threats, according to Aguas. Companies can use it to spot when an employee does something suspicious, such as downloading large volumes of customer data outside business hours.
“Outside of your traditional access controls, the user name and password, there really hasn’t been a lot of great control for insiders,” Aguas said.
Companies shouldn’t underestimate the effort involved in configuring these tools correctly, though, warned Randy Trzeciak, director of the Insider Threat Center at CERT, part of the Software Engineering Institute at Carnegie Mellon University.
“The UBA tools are designed to interact with the security information and event management tools, the intrusion detection and prevention systems,” he said. “They’re designed to collect system logs that are generated from the IT departments.”
UBA tools can also digest data from other sources, such as human resource systems and physical security systems, pointed out Aguas. An HR system might reveal that a user has recently had a bad performance review, which could elevate their risk score in conjunction with other events. A physical security system might know that an employee’s RFID badge was in one area of the building when that employee’s account was used to execute a certain action elsewhere; a mismatch like that could suggest compromised credentials.
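The value of these extra feeds comes from correlation: each signal is weak on its own, but several firing together elevates the score. A minimal sketch of that idea follows; the signal names, weights and correlation bonus are all assumptions made for illustration.

```python
def correlated_risk(signals):
    """Combine weak signals from HR, physical-security and system-log feeds.

    signals: set of signal names observed for one user (names hypothetical).
    """
    weights = {
        "bad_performance_review": 2,  # from the HR system
        "badge_action_mismatch": 5,   # badge in one area, account active elsewhere
        "bulk_download_offhours": 4,  # from system logs
    }
    score = sum(weights.get(s, 0) for s in signals)
    # Correlation bonus: independent sources agreeing is stronger evidence
    # than any single feed, so multiple signals more than add up.
    if len(signals) >= 2:
        score *= 2
    return score

print(correlated_risk({"bad_performance_review"}))  # 2: weak on its own
print(correlated_risk({"bad_performance_review",
                       "badge_action_mismatch"}))   # 14: together, actionable
```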
There are no standards for integrating this log and application data, Trzeciak pointed out, meaning that UBA vendors and their customers will often have to write their own integrations. These systems won’t do much for you out of the box, he explained; there’s some effort and expertise involved.
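In practice, such an integration usually means mapping each source's format into a common event schema before analysis. A minimal sketch, with entirely made-up record formats and field names:

```python
import json

def normalize_syslog(line):
    """Map a hypothetical syslog-style line into a common event schema."""
    # e.g. "2024-01-05T02:13:00Z jdoe LOGIN vpn"
    ts, user, action, target = line.split()
    return {"time": ts, "user": user, "action": action.lower(),
            "source": "syslog", "target": target}

def normalize_badge_json(raw):
    """Map a hypothetical JSON record from a badge-reader feed."""
    rec = json.loads(raw)
    return {"time": rec["timestamp"], "user": rec["employee_id"],
            "action": "badge_swipe", "source": "badge", "target": rec["door"]}

events = [
    normalize_syslog("2024-01-05T02:13:00Z jdoe LOGIN vpn"),
    normalize_badge_json(
        '{"timestamp": "2024-01-05T02:10:00Z", "employee_id": "jdoe", "door": "lab-3"}'),
]
# Downstream analytics can now treat both feeds uniformly.
for e in events:
    print(e["source"], e["user"], e["action"])
```

Because no standard schema exists, this mapping layer is exactly the custom work Trzeciak describes: every new log source needs its own adapter.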
Understanding what events are important and what risks they represent can also vary on a per-company and even a per-role basis. Codifying these rules into a UBA system requires organizational and even psychological knowledge, pointed out Aguas. Trzeciak said that getting the rules into the system still often requires a level of coding skill.
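To make the per-role point concrete, here is one way such a rule might be codified; the rule, role names and threshold are hypothetical examples, not taken from any product.

```python
# A hypothetical per-role rule: bulk data exports are routine for data
# analysts but anomalous for most other roles.
RULES = {
    "bulk_export": lambda user: (user["role"] != "data_analyst"
                                 and user["rows_exported"] > 10_000),
}

def fired_rules(user):
    """Return the names of all rules this user's activity triggers."""
    return [name for name, check in RULES.items() if check(user)]

print(fired_rules({"role": "sales", "rows_exported": 50_000}))         # fires
print(fired_rules({"role": "data_analyst", "rows_exported": 50_000}))  # does not
```

Even a trivial rule like this encodes organizational knowledge (which roles legitimately move data in bulk), which is why Aguas and Trzeciak both frame rule-writing as real implementation work rather than configuration.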
Federal government agencies certainly have the financial and technical resources to take on all of these deployment challenges. Larger companies may, too. But organizations should think long and hard about whether they have what it takes to implement UBA meaningfully. The last thing anyone wants when making decisions based on user behavior is a half-baked job.