Information overload: Finding signals in the noise

Signal-to-noise ratios are hard to manage. As a security professional, you want the threat data, you want the attack notifications and alerts, and you need intelligence. But when there's too much coming in, those alerts and notifications fall by the wayside. They're easily dismissed and ignored.

After all, if a device generates 60 alerts a day, and for the first few weeks none of them amount to anything, new alerts from that device will eventually be dismissed out of hand.

This happens because the IT / InfoSec department has other things to worry about, and there isn't enough time (or people) to deal with a flood of alerts. It's possible the device generating the alerts will be properly tuned and configured later, but that depends on the staff's workload.

It's a defeatist argument, but it's also reality.

IT / InfoSec teams live with shrinking resources, including budgets and staffing, so they have to focus on the tasks that keep the business running, as well as help the business grow and profit. More to the point, most security purchases and deployments center on compliance, otherwise known as "checking the box."

The check-box mentality is why there are 60 alerts a day to begin with. The appliance wasn't tuned properly; it was installed to meet a compliance or regulatory requirement, and the vendor promised a "set it and forget it" approach.

Yet the vendor's default rules aren't cutting it in a live production environment, so the organisation is flooded with false positives, producing more noise than signal.
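To make "tuning" a bit more concrete, here is a minimal sketch, in Python, of the kind of review a team might run over a few weeks of triage notes to spot rules whose alerts almost never pan out. The rule names, thresholds, and triage data are hypothetical; in practice this sort of tuning happens inside the SIEM or appliance's own rule configuration, not in a standalone script.

```python
from collections import Counter

# Hypothetical triage history: (rule_id, was_true_positive) pairs collected
# over a few weeks of analyst follow-up.
history = [
    ("default-rule-17", False),
    ("default-rule-17", False),
    ("default-rule-17", False),
    ("custom-rule-02", True),
    ("default-rule-17", False),
    ("custom-rule-02", False),
]


def noisy_rules(history, min_alerts=3, max_hit_rate=0.05):
    """Return rule IDs that fire often but almost never turn out to be real,
    i.e. candidates for re-tuning or suppression rather than default severity."""
    totals, hits = Counter(), Counter()
    for rule_id, was_true_positive in history:
        totals[rule_id] += 1
        if was_true_positive:
            hits[rule_id] += 1
    return [
        rule_id
        for rule_id, total in totals.items()
        if total >= min_alerts and hits[rule_id] / total <= max_hit_rate
    ]


print(noisy_rules(history))  # -> ['default-rule-17']
```

The point isn't the script itself, but the habit it represents: measuring which rules generate nothing but noise, and adjusting them, rather than leaving the vendor defaults in place and teaching analysts to ignore the output.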

Another painful reality is the fact that some of these ignored alerts are valid warnings. It's a reminder that stats such as those from the 2010 Verizon Business Data Breach Investigations Report, which said that 86 percent of victims had evidence of a breach sitting in their logs, aren't always fluff.

A more recent example of this problem would be the Target breach. While the breach was happening, SOC (security operations center) employees were fed alerts from a FireEye device. However, while those warnings were investigated, they were eventually dismissed as noise.

"With the benefit of hindsight, we are investigating whether, if different judgments had been made, the outcome may have been different," Target's spokesperson, Molly Snyder, told the New York Times.

"Like any large company, each week at Target there are a vast number of technical events that take place and are logged. Through our investigation, we learned that after these criminals entered our network, a small amount of their activity was logged and surfaced to our team. That activity was evaluated and acted upon. Based on their interpretation and evaluation of that activity, the team determined that it did not warrant immediate follow-up."

Was it a mistake? It's easy to call it one now, but at the time, the InfoSec team at Target were just conducting their daily routine. They checked the alert, determined it wasn't a high priority, and moved on to other things.

This happens daily at organisations across the globe; the difference is that the public now knows what this oversight cost Target, so it's easy to single the company out.

"So [in] some of these very high profile breaches, the product was able to identify that the breach was occurring, but the people's intelligence wasn't able to respond because they got so many alerts. They got so much information that it was difficult," commented FireEye's Dave DeWalt.
