In Monster’s case it was thousands of people who were sick of their current jobs and looking for a cure; in the case of the hospitals, the people were just sick and looking for a cure. Apart from sharing that tenuous link, however, both groups lost data.
In all cases their networks were compromised. The hospitals used the same managed service provider, Verus, which has since gone out of business after losing all of its clients’ data, which should serve as a warning to all those managed service providers (MSPs) out there. The breaches at the hospitals were reported over several weeks, but all of the data losses were eventually attributed to a single incident, in which Verus employees left a firewall down following the transfer of data from one server to another.
And this is, quite frankly, only the tip of the iceberg. With no legal obligation for companies to report breaches, the likelihood is that most go unreported.
However, amidst all the finger-pointing at Verus, Trojans and every other excuse, no one seems to have asked why it was possible to access the records just because a firewall was turned off or because someone breached the perimeter. In other words, once inside the network, patient records seem to have been left lying around for anyone to look at.
Helping oneself to highly sensitive, valuable and confidential information has never been easier than it is today, because virtually all of that information is in digital format. And whatever excuse may be given, legitimate or otherwise, it does not change the fact that organisations are playing a dangerous game when they underestimate the risk posed by the disgruntled insider determined to wreak havoc, or by the bumbling insider who is simply an accident waiting to happen.
Sensitive information requires extra care: when it is compromised, the implications for the organisation can be catastrophic.
Access to and distribution of sensitive information such as financial reports, clinical trial results, technical designs and M&A data is something many organisations have not addressed adequately. Data must be secured and tracked, privacy must be maintained, and strict auditing must be applied.
Information leaks in all forms are occurring with increasing frequency today within some of the largest and most important organisations and enterprises. These breaches, whether inadvertent or part of a coordinated attack, release highly sensitive information into the larger market, where it is used to damage the originating organisation’s business, competitiveness and reputation, and significantly impact the privacy and confidence of its customers, partners and vendors.
Common distribution methods such as physical mail (CDs in the post, for example), email and FTP suffer from several disadvantages. Distributing vast numbers of documents by mail is cumbersome and hard to track. FTP is neither reliable nor secure. Email, including encrypted email, is also unreliable because it depends on the recipient’s email infrastructure: large or encrypted files often fail email security policies and bounce back. Organisations need global accessibility and connectivity while maintaining security.
So what steps should be taken to protect information? Here are some basic ones:
- Information needs to be protected from unauthorised modification, deletion and exposure. Encryption and other security mechanisms are no help if someone hacks the computer and circumvents the security layers. Encryption, for instance, is good for confidentiality but does not protect data from intentional deletion or accidental modification. In order to build multi-layered security, a sterile environment must exist to accommodate and protect the security infrastructure.
- Ensure you have visual auditability. Owners of information need to be able to see what happens with their information at all times; combined with auto-logging and auto-alerting, this gives an organisation both a prevention and a detection mechanism.
- Separation of duties must be possible between the owners of the information and the administrators of the information. In other words there is no need for the IT Manager to be reading employee contracts, unless of course he or she is doubling as head of HR!
- Dual control ensures that highly sensitive data can only be accessed once access has been authorised by another person. Like the concept of dual keys, it ensures that access is only allowed on secondary confirmation. If an employee cannot simply walk into the CEO’s office and pick up a copy of the latest M&A transaction, they shouldn’t be able to open a file on a server either, unless of course the board values the input of IT staff in M&A decisions.
- Data should always be backed up in encrypted form, and kept encrypted even while on backup media, to prevent unauthorised disclosure.
- And access should be controlled based on user location. In other words, it’s not the employer’s responsibility to help an employee show off to the cute blonde in the Internet café. If the information is for internal use only, make sure that’s exactly where it stays.
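To make the dual-control and visual-auditability points above concrete, here is a minimal sketch in Python of how such a check might work. All the names here (DualControlStore, approve, open_file) are hypothetical, invented purely for illustration; a real product would enforce this at the storage layer, not in application code.

```python
import datetime

class DualControlStore:
    """Illustrative sketch: a file is released only when a second,
    distinct person has approved the request, and every event is
    written to an append-only audit trail."""

    def __init__(self):
        self._approvals = {}   # file_id -> set of approver names
        self.audit_log = []    # append-only trail for visual auditability

    def _log(self, event, file_id, user):
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self.audit_log.append((stamp, event, file_id, user))

    def approve(self, file_id, approver):
        # Record an approval; approvals are per-file, per-person.
        self._approvals.setdefault(file_id, set()).add(approver)
        self._log("approve", file_id, approver)

    def open_file(self, file_id, requester):
        # Dual control: self-approval never counts, so we subtract the
        # requester from the approver set before checking.
        approvers = self._approvals.get(file_id, set()) - {requester}
        allowed = len(approvers) >= 1
        self._log("open" if allowed else "denied", file_id, requester)
        return allowed
```

Usage follows the M&A example in the text: an analyst asking for the deal file is denied until someone else, say the CFO, has approved; approving your own request changes nothing, which is the whole point of dual keys.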
No organisation is immune to the risk of exposure, embezzlement or embarrassment. There is no such thing as a 100% trustworthy workforce, especially when you’re outsourcing or using contract staff. But until we start to see chief information security officers on the street because they are held accountable for protecting the data rather than the infrastructure, these incidents will just go on and on. After all, it is very often the case that we only act after we’ve been victimised, and by then it’s too late.
Cyber-Ark Software is an information security company that develops and markets digital vaults for securing and managing privileged passwords and highly sensitive information.