The focal point for keeping tabs on security and compliance activities has been the Security Operations Centre (SOC): a physical location that serves as the front line for handling incident reports, reviewing system logs and monitoring the environment 24x7.
The latest Wikileaks incident pits the security operations of some of the world’s most IT-savvy companies (Visa, Mastercard, Paypal and Amazon.com) against relatively unorganised “hacktivists”, and reinforces the importance of the SOC.
The Wall Street Journal gives a picture of what a SOC looks and feels like: “in PayPal's network operation centre, charts showing total payments processed per minute and total traffic to the site, along with other data, are projected on a large, curved wall in front of around 20 workstations, each holding three to five computer monitors.” Add security events and blinking lights for threat alerts and you get an idea of what a SOC is.
Despite the importance of the SOC, the conventional model of security operations is to pick and choose from a menu of out-of-the-box technologies and tools, with no over-arching strategy or long-tail capabilities road-map. By the latter, I mean a lack of investment in niche and customisable applications such as detecting illicit insider activity, dealing with persistent threats, and dealing with business- or industry-specific threats.
What is needed is a new working model that helps guide an organisation towards a better-quality operational picture that is more responsive, rather than reactive. I am not suggesting new technology or standards but a framework to orient the enterprise in sourcing, acquiring and deploying the arsenal.
The model starts with an abstract layer of requirements to help evaluate the maturity of an organisation. Here are some of the competencies or master specifications that could apply to the “next generation” SOC:
Process Automated Responses
- Operator activity should run through automated processes that accommodate human-in-the-loop workflows for decision-making and that can be optimised over time. The aspiration here is that processes shuttle information to policy makers, engineers, application developers and the C-suite. If data encryption is ramping up, then security operations would be prepared to deal with the ramifications of additional reporting. An anti-virus clean-up task would trigger notifications to engineers and analysts spelling out countermeasures.
- Strategic plans and courses of action would govern pre- and post-incident activity and are designed to avoid disruption to the mission or business. Basically, what is needed is a sort of “rules engine” that makes the system hum. EINSTEIN 3 is a system to be deployed by US government agencies that “will have the ability to automatically detect and respond appropriately to cyber threats before harm is done, providing an intrusion-prevention system supporting dynamic defence.”
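To make the “rules engine” idea concrete, here is a minimal sketch of automated response with a human-in-the-loop gate. The event fields, rule names and actions are hypothetical, not drawn from EINSTEIN or any particular product: routine responses fire automatically, while high-impact ones wait for an analyst’s approval.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Event:
    source: str    # e.g. "antivirus", "ids"
    kind: str      # e.g. "malware_cleaned", "port_scan"
    severity: int  # 1 (low) .. 5 (critical)

@dataclass
class Rule:
    matches: Callable[[Event], bool]
    action: str                   # automated response to run
    needs_approval: bool = False  # human-in-the-loop gate

def dispatch(event: Event, rules: list[Rule],
             approve: Callable[[Event], bool]) -> list[str]:
    """Run every matching rule; gated actions wait on a human decision."""
    actions = []
    for rule in rules:
        if not rule.matches(event):
            continue
        if rule.needs_approval and not approve(event):
            actions.append(f"escalated:{rule.action}")  # queued for an analyst
        else:
            actions.append(f"executed:{rule.action}")
    return actions

rules = [
    Rule(lambda e: e.kind == "malware_cleaned", "notify_engineers"),
    Rule(lambda e: e.severity >= 4, "isolate_host", needs_approval=True),
]

# A critical anti-virus clean-up: the notification fires automatically,
# but host isolation is escalated because no human has approved it yet.
result = dispatch(Event("antivirus", "malware_cleaned", 5), rules,
                  approve=lambda e: False)
```

The point of the gate is exactly the “human-in-the-loop workflow” above: automation shuttles the information, but disruptive countermeasures remain a policy decision.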
Enablement of Virtual Resources
- Organisations will be able to rapidly source high-powered computing resources to process sudden or unplanned volumes of traffic, as well as to test countermeasures. For example, a bandwidth-based attack such as a denial-of-service would be met with defences that shield parts of the network from collateral damage. There are vendors that offer tools to simulate the behaviour of malware in safe, cordoned-off test-bed environments. Support for integration and interoperability is key.
- Private and public partnerships are used to analyse indicators of attacks early in the adversary’s planning phases. The National Security Agency (in the US) could work with private firms to alert them to possible cyber strikes. For example, Google is working with the NSA to help sort through the hack of its computers in China.
- A virtual team of experts and analysts will make up the diverse mix of users and consumers. A recent study by the Centre for Strategic and International Studies drives home the point that we simply don’t have the talent to stay steps ahead in security. Thus, outside collaboration and the means to “source” or bring together top talent and niche security expertise will be critical.
- Fusion of all sorts of data is pivotal to creating higher-fidelity alerts of intruders and spotting subtle yet suspicious activity. The idea is to monitor ingress and egress points and pull data from human resources, firewall logs and even law enforcement. The US government’s EINSTEIN 2 system is meant to up the ante and is installed at all connections between government computer systems and the public Internet. The system gathers and sketches out threat signatures from foreign intelligence and Department of Defence information assurance missions.
- Absorbing and monitoring Internet traffic and human, software and computer activity is one thing; trying to make sense of it all is quite another. The mathematics behind statistical analysis and data-mining techniques offers powerful aids to understanding past and present events. Forecasting models can be used to project the probability that a threat scenario will come to pass.
- Privacy-enhancing mechanisms will be needed to limit the collection and retention of personally identifiable information. With any surveillance recommendation, the temptation to over-reach authority is strong and the risk of privacy violations is all too real. Basically, the principle here should be to redact or obfuscate sensitive content that is not pertinent to downstream threat analysis and will not help advance compliance.
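A toy illustration of the fusion and privacy points together. The feeds, field names and threshold below are entirely hypothetical: two weak signals (a large outbound transfer in a firewall log and an HR flag on the same account) combine into one higher-fidelity alert, while a simple redaction step obfuscates content that downstream analysis does not need.

```python
import re

# Hypothetical feeds: a firewall log and HR's list of recently departed staff.
firewall_log = [
    {"user": "jdoe",   "dest": "10.0.0.5",    "bytes_out": 2_500_000},
    {"user": "asmith", "dest": "203.0.113.9", "bytes_out": 900_000_000},
]
hr_departures = {"asmith"}

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact(text: str) -> str:
    """Obfuscate personally identifiable content before it reaches analysts."""
    return EMAIL.sub("[REDACTED]", text)

def fuse(log, departures, threshold=100_000_000):
    """Alert only when both weak signals coincide on the same account."""
    alerts = []
    for rec in log:
        if rec["bytes_out"] > threshold and rec["user"] in departures:
            alerts.append(f"possible exfiltration by {rec['user']} "
                          f"to {rec['dest']}")
    return alerts

alerts = fuse(firewall_log, hr_departures)
note = redact("Ticket note: contact jdoe@example.com before closing")
```

Neither signal alone would justify an alert; the correlation across sources is what raises the fidelity, which is the whole argument for fusion.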
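The statistical-analysis point can be sketched with nothing more than the standard library: score each observation by how far it sits from the historical mean, in standard deviations, and flag the outliers. The traffic series is invented for illustration; real SOC analytics would of course use richer models.

```python
from statistics import mean, stdev

def anomaly_scores(series):
    """Standard-deviation (z) score for each point against the whole series."""
    mu, sigma = mean(series), stdev(series)
    return [(x - mu) / sigma for x in series]

# Hypothetical daily outbound-connection counts; the last day spikes.
daily_connections = [102, 98, 105, 110, 95, 101, 240]

scores = anomaly_scores(daily_connections)
suspicious = [day for day, s in enumerate(scores) if s > 2]  # flags day 6
```

A score threshold like this is the crudest possible forecasting aid, but it shows the shape of the problem: the data collection is easy, and it is the statistics layered on top that turns raw activity into something an analyst can act on.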
Drop me a line at [email protected] with your thoughts on whether the above themes are relevant to CIOs, CTOs, CISOs and business executives struggling to get their heads around where to prioritise their security investments.
By Walid Negm, Senior Director Cloud and Cyber Security Offerings, Accenture