There has never been a successful catastrophic cyberattack on North America's critical infrastructure (CI) -- yet.
The National Institute of Standards and Technology's (NIST) Cybersecurity Framework 1.0, to be issued Feb. 13 in response to an executive order from President Obama, aims to keep it that way.
But there is considerable debate within the security community about whether it will improve the protection of CI, which includes transportation, energy, food, water, financial services and other systems.
Some, like Andrew Ginter, vice president of industrial security at the Canadian firm Waterfall Security Solutions, contend that it takes a misguided approach to the magnitude and complexity of the threats.
Ginter wrote in a recent blog post that the framework is too complicated for top management and board members of industrial control system (ICS) operators. Worse, he said, it "leads senior management to ask the wrong kinds of questions about the security of critical infrastructure sites," by focusing on "actuarial" risk rather than the capabilities of the most sophisticated potential attackers.
The question, he said, should not be, "How many times was the North American power grid taken down by a cyber assault in the last decade, and what did each such incident cost? The answer is, of course, zero."
Instead, he said, it should be, "When our most capable enemies attack us, what is the most likely outcome?"
Joe Weiss, managing partner at Applied Control Solutions, has argued for years that government organisations like NIST and ICS-CERT (Industrial Control Systems Cyber Emergency Response Team) are too focused on "compliance" and not enough on real security.
But Kevin Bocek, vice president of product marketing and threat research at Venafi, said the impending Framework 1.0, "moves IT security strategy forward to include modern defensive strategies. The framework places greater emphasis on detecting and responding to security incidents instead of just trying to prevent them."
Bocek said he thinks the framework, "strikes a balance between capabilities vs. actuarial for a broad audience." And he said the fact that it includes a focus on "detection, response, and remediation instead of just prevention puts the framework ahead of many current IT security strategies that assume attackers can be locked out at the firewall."
TK Keanini, CTO at Lancope, suggested that some of the criticism may be due to unrealistic expectations. The framework is not meant to be a magic bullet, he said, but instead, "a baseline to what is reasonable should an incident occur."
Advanced threats, he said, "evolve and innovate on a daily basis whereas the Cybersecurity Framework takes months, if not years, to gain consensus and be implemented."
NIST, a non-regulatory agency of the Department of Commerce, has had Framework 1.0 in the works for a year, following the president's executive order, "Improving Critical Infrastructure Cybersecurity," signed Feb. 12, 2013. The agency said it has been developed, "by collaborating extensively with critical infrastructure owners and operators, industry leaders, government partners, and other stakeholders."
Framework 1.0 is organised around five core functions, titled Identify, Protect, Detect, Respond and Recover, which together form a system for protecting CI assets and responding effectively to attacks.
Through spokeswoman Jennifer Huergo, NIST said it could not respond to Ginter's criticism. "We haven't had a chance to digest the blog post, and would need to give it more thought," she said, but added that the Obama administration considers the protection of CI a "high priority," and believes the framework, "will be a useful tool for helping to improve the cybersecurity of critical infrastructure and other industries."
The risk of a catastrophic attack is also a subject of continued debate. Some security experts have said even a major attack would be unlikely to do much more damage than a bad hurricane. Keanini said he thinks an "apocalyptic event" is unlikely. Instead, he foresees, "just a continuous stream of security incidents that keep cybercrime profitable and organisations and individuals getting better at incident response."
But others agree with federal officials, who have warned a number of times in recent years of the risk of a "Cyber Pearl Harbor."
The potential for catastrophic damage and loss of life was demonstrated seven years ago at the Idaho National Laboratory in what was called the Aurora test, in which a cyberattack caused a diesel generator to destroy itself.
James Lewis, director and senior fellow of the Technology and Public Policy Program at the Center for Strategic and International Studies (CSIS), famously told CBS's "60 Minutes" in November 2009, "if you can hack into that control system, you can instruct the machine to tear itself apart. And that's what the Aurora test was." He added that it requires a lead time of three or four months just to order major electrical generators, let alone get them manufactured and installed.
At the time, CNN quoted economist Scott Borg, who produces security data for the federal government, as saying that if a third of the country lost power for three months, the economic price tag would be $700 billion, or, "the equivalent of 40 to 50 large hurricanes striking all at once."
Much more recent research is unsettling as well. While some security officials have said it would be difficult to take down a broad section of the power grid because of a diversity of control systems that would require multiple types of malware to attack, three researchers from the Network Science Center at West Point published a paper on Jan. 6, arguing that an adversary could target, "certain substations and sources of power generation to initiate a cascading failure that maximises the number of customers without electricity."