An insider program can also leverage technology such as user behavior analytics (UBA) to get a head start on bringing all of its data together. “However, in order to make the most use of the tool, someone has to be able to filter out the noise. Having access to data in one platform is a great start but filtering through events and noise is even more critical,” Camacho said, adding that knowing how to spot anomalies or patterns that don't make sense is key to getting a successful program off the ground.
“In short, an insider program should be able to curate data points that reveal a toxic risk score of 'who' might be high concern for malicious activity,” Camacho said.
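Camacho's idea of curating data points into a single risk score can be sketched in a few lines. The signal names and weights below are purely illustrative assumptions; a real UBA platform would derive them from baselined user behavior rather than hand-tuned constants.

```python
# Hypothetical signal weights for illustration only -- real UBA tools
# learn these from each user's behavioral baseline.
SIGNAL_WEIGHTS = {
    "off_hours_login": 2.0,
    "bulk_download": 3.0,
    "new_usb_device": 2.5,
    "access_outside_role": 3.5,
}

def risk_score(events):
    """Aggregate weighted signals per user into a single score."""
    scores = {}
    for user, signal in events:
        scores[user] = scores.get(user, 0.0) + SIGNAL_WEIGHTS.get(signal, 0.0)
    return scores

def top_concerns(events, threshold=5.0):
    """Return users whose combined score crosses a review threshold,
    highest score first -- the 'who might be high concern' list."""
    scores = risk_score(events)
    flagged = [u for u, s in scores.items() if s >= threshold]
    return sorted(flagged, key=lambda u: -scores[u])
```

The point of the sketch is the shape of the problem, not the arithmetic: many noisy events are reduced to a short, ranked list that an analyst can actually review.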
Matias Brutti, a hacker at Okta, said that perhaps the least accounted-for, but potentially most important, role on the insider risk team is the red team, alongside the more obvious incident response and monitoring functions. A red team plays the role of the insider threat while the blue team monitors and defends against it. Red team members proactively look for ways to exfiltrate data, obtain personally identifiable information (PII) and access services outside their authorized scope. “They do this to ultimately help build realistic processes and procedures that will prevent a real attack in the future,” he said.
Exabeam CEO and co-founder Nir Polak takes a slightly different view of who makes up the insider risk team. “We are seeing more companies create insider risk organizations, and often these do not report into the IT security organization. These teams typically include people with police or investigation backgrounds instead of IT skills. This can make sense, as insider attacks are often not technology-based. They use valid credentials with valid access to sensitive information, but with a goal of using that information for invalid purposes. In this environment, forensic and detective skills will be very valuable,” he said.
Put simply, Polak said, these teams are assembled to create policy that minimizes risk, then select solutions that help implement that policy. They also use the same tools to monitor compliance. For example, the policy might state that “employees shouldn’t have more access to confidential data than their current job requires,” and the team then implements a program to review access regularly.
“You’d be surprised how often employees accumulate access rights and then never give them up when they move to new projects,” Polak said.
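The regular access review Polak describes amounts to comparing what each person can touch against what their current role requires. A minimal sketch, assuming a hypothetical role-to-entitlement mapping (in practice this would come from an IAM system of record, not a hard-coded table):

```python
# Hypothetical mapping of roles to required entitlements -- illustrative
# only; a real program would pull this from an identity management system.
ROLE_ENTITLEMENTS = {
    "engineer": {"source_repo", "ci_system"},
    "analyst": {"bi_dashboard", "data_warehouse"},
}

def excess_access(role, granted):
    """Entitlements a user holds beyond what their current role requires."""
    return granted - ROLE_ENTITLEMENTS.get(role, set())

def access_review(users):
    """Flag users carrying access left over from previous projects.

    `users` is an iterable of (name, current_role, granted_entitlements).
    Returns a dict of name -> entitlements to revoke or justify.
    """
    findings = {}
    for name, role, granted in users:
        extra = excess_access(role, granted)
        if extra:
            findings[name] = extra
    return findings
```

Run on a schedule, a review like this surfaces exactly the accumulation Polak warns about: rights granted for an old project that were never given back.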
Kris Lovejoy, CEO of BluVector, has found that when employees and contractors (part of the extended team) are clear on “why” security is important, it's much easier to ensure they learn and adhere to the “how.” “That said, the flip side of the insider risk team function is to assure policies and processes exist which enable a speedy response when 'rules' are broken.”