Key management is the foundation of a solid encryption implementation. Unfortunately, good key management can be hard to implement and hard to understand. But unless an organisation establishes a systematic approach to generating, rotating and storing its keys, its encryption efforts will be largely futile.
It seems as if it should be easy: just have a key and use it to encrypt and decrypt. Most people don’t care how the key is generated or how good it is -- but these are the keys to your kingdom and you need to protect them properly. That takes a little bit of effort.
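As a minimal illustration of the "how good the key is" point -- using Python's standard `secrets` module, not any particular key management product -- a strong key must come from a cryptographically secure source of randomness, never from a general-purpose random number generator:

```python
import secrets

# Draw a 256-bit key from the operating system's cryptographically
# secure random source. Never derive key material from random.random(),
# time-based seeds, or other predictable inputs.
key = secrets.token_bytes(32)  # 32 bytes = 256 bits
```

Proper generation is only the first step; the same key then needs controlled storage and scheduled rotation, which is where a key management system comes in.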
One problem I often see is key management systems designed for a single database brand or a single operating system. That makes it hard for an organisation to implement enterprise-wide key management. Things are starting to change in the industry, though: some players have taken steps towards enterprise-wide key management systems, so it will get better.
I also believe that a solution that only does key management is of lower value than one that provides key management together with enablers: functions that can use the keys to encrypt your databases, files and applications.
A mature solution combines a good key management system with encryption agents or plug-ins for DB2, Teradata, Oracle and other platforms. Less mature solutions lack the endpoints that do the encryption work for you. That is an issue, because most database products are not open to centralised key management.
For example, the newest Oracle version cannot receive a key from a central key manager. The same goes for the newest SQL Server, the newest Informix release and so on. So you sit there with a central key management system, but with a point solution there is no way to use the keys.
If companies want to strengthen their key management plan, the most important step is enforcing separation of duties. Make the key custodian a separate role and responsibility, carried out outside your operational systems and away from your data management and database activities. It should be a separate key management system, and it should be auditable.
Another best practice is defining different key classes, with stricter rules governing the keys that lock down your most critical data.
The type of data a key protects should determine whether it is generated in hardware or software, whether it may be stored or cached on a local distributed system, and how often it should be replaced.
With less critical data you can adopt more operationally efficient models of key management: lower cost, less administration and higher performance.
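One hypothetical way to express such key classes is as explicit policy data. The class names, fields and values below are illustrative assumptions, not any vendor's schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KeyClassPolicy:
    name: str
    generate_in_hardware: bool  # e.g. HSM-backed generation for critical data
    allow_local_cache: bool     # may the key be cached on a distributed node?
    rotation_days: int          # how often the key must be replaced

# Illustrative policies: strict handling for keys guarding critical data,
# cheaper and faster handling for less sensitive data.
POLICIES = {
    "critical": KeyClassPolicy("critical", generate_in_hardware=True,
                               allow_local_cache=False, rotation_days=90),
    "internal": KeyClassPolicy("internal", generate_in_hardware=False,
                               allow_local_cache=True, rotation_days=365),
}
```

Writing the rules down this way makes the cost/security trade-off auditable: anyone can see which class a key belongs to and what handling that implies.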
It is always a balance. Some vendors tell their clients, “All of your keys are very secure; they’re locked away in a remote location”. But sometimes that claimed extra security comes with a performance hit that makes the system unusable. You have to be realistic, especially when the data in question is highly critical.
Step back and look at the entire key management security chain and determine the weakest link. Fix that weak link but don’t be a total fanatic about it. To devise a workable security plan you really need to know your biggest vulnerabilities and risks.
Then take appropriate measures to find best-practice solutions that address those vulnerabilities, but be holistic and reasonable about it. For example, some people worry about protecting their keys in memory.
It is possible to do that, but you should really take a step back and look at your systems and ask, “Is that the most vulnerable point in my system?” No, probably not. Most likely you have a lot of other exposures that are much more important to address.