Why do so many IT security projects fail completely? And why do so many Mobile Data Protection (MDP) projects, ‘designed’ to seal the gaps in an organisation’s defences, fail to deliver - leaving egg all over the faces of the IT department and a management team facing the commercial and legal consequences of non-compliance and catastrophic data breaches?

There is no easy answer to these questions, but even a cursory glance at some high-level failures confirms the truth of that old adage: “Failing to plan is planning to fail.” The UK Government’s £21bn National Health Service (NHS) National Programme for Information Technology (NPfIT) was a classic example of a top-down master plan dictated from on high which presumed that clinicians would accept what they were given by their IT superiors.

The result was a Mexican stand-off between the programme and the British Medical Association (BMA), backed by clinicians who merely wanted input into the basic design of the system they would be using. NPfIT failed, and the taxpayer is now picking up the pieces.

Security of an organisation’s data is vital. It is therefore puzzling that so many organisations fail to successfully encrypt their data. Barely a week goes by without a story in the media about lost unencrypted USB sticks, laptops, Personal Digital Assistants and corporate smartphones (to name only a few - and now we can add iPads to the list).

According to Gartner: “Each year, hundreds of thousands of laptops, phones and removable media devices are estimated by various sources to go missing through loss or theft, to have their data copied without consent, and to be upgraded or exchanged without having their data removed.” Gartner also reports that sales of unprotected PC systems continue to outrun the provisioning of MDP.

According to Gartner, Mobile Data Protection is approximately a $1 billion market, growing at 20% annually. MDP products secure data on movable storage systems in notebooks, laptops, smartphones and various removable media; they may also be used on desktops and servers. A £45 USB drive can hold the names of every person on the planet; an £800 laptop could contain £125 million worth of critical information. This is why encryption is the one security technology that you absolutely cannot afford to have fail.

The end user has to examine why current encryption strategies have failed to protect data, and what steps they will have to take to adopt best practices that effectively shield them from the consequences of a significant - and often costly - data breach. According to the Ponemon Institute, the loss of a single record in the US can cost a company upwards of $225 to remediate. One of the largest mobile data breaches to date exposed 26 million records from one lost laptop; at $225 a record, remediating a breach on that scale could cost close to $6 billion, so the potential costs of non-compliance are staggering. Since 2005, over half a billion people have potentially had their identities exposed by a data breach, according to the Privacy Rights Clearinghouse.

As a result, organisations are required to protect data by encryption and also to provide evidence that the protection is working. Buyers who want common protection policies across multiple platforms, minimal support demands and proof that data is protected must do their research and turn only to trusted experts. So what can organisations do to avoid these costly mistakes?

Encrypting sensitive data is now one of the most important safeguards to protect organisations against security breaches, particularly those which arise from loss of hardware. Encryption provides a simple yet effective way of ensuring that lost information is unusable by malicious third parties and reduces both the risk of an actual breach occurring and the cost of remediation if one does.
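To make the point concrete, here is a minimal sketch - purely illustrative, using the open-source Python ‘cryptography’ package rather than any vendor’s MDP product - of why encrypted data on a lost device is worthless to whoever finds it:

    # Illustrative sketch only; assumes the open-source Python 'cryptography'
    # package (pip install cryptography), not any vendor's MDP product.
    from cryptography.fernet import Fernet, InvalidToken

    org_key = Fernet.generate_key()        # held under the organisation's control
    ciphertext = Fernet(org_key).encrypt(b"Customer: J Smith, account 12345678")

    # The rightful key holder can recover the record...
    print(Fernet(org_key).decrypt(ciphertext))

    # ...but whoever finds the lost device holds only opaque ciphertext, and
    # decryption with any other key fails outright.
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("Unreadable without the correct key")

The same principle, applied at the disk, file or device level, is what makes lost hardware an inconvenience rather than a catastrophe.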

As the nature, complexity and seriousness of threats grow, and as concerns mount over the loss of sensitive data through removable media, insiders or disgruntled business partners, the regulatory pressure to protect sensitive information - especially in healthcare, finance and retail - has become acute.

However, despite the avowed interest in encryption of many organisations and the large number of them which have attempted to deploy encryption technology, the size and frequency of data loss and theft continue to grow. Why, if encryption technologies for the PC have been around for almost 30 years, is this the case? The answers may lie in some commonly held assumptions.

Common mistakes to avoid

The IT security industry has grown up over the years with some assumptions about the deployment of encryption solutions which are now outdated (if they were ever true in the first place).

Many encryption deployments are based on the assumption that a single solution can meet the needs of all types of users, and the result is a failure to provide 100% coverage. The problem is that while there may be some degree of conformity within the IT infrastructure (although this is increasingly no longer the case), there is not nearly as much uniformity across types of users and data.

Attempting to force the adoption of any one single encryption technology across an enterprise environment is very unlikely to work. The factors working against it include:

  • Hardware inconsistencies which are incompatible with device-centric encryption
  • A constantly changing operating system environment (multiple versions of Windows, Mac OS, Symbian, iPhone, Palm OS, Linux, Android etc.) and external media
  • The impact on end users of deploying encryption solutions
  • Difficulties in managing the encryption technology
  • The impact on current IT management processes
  • Challenges in integration with the broader IT infrastructure
  • The cost of end-user training
  • The unique problems of a rapidly burgeoning mobile workforce
  • Addressing emerging risks such as those posed by removable media, smartphones and mobile devices

The difficulties in using a single encryption technology to meet all these challenges, and the increased pressure to provide more robust security for sensitive data wherever it resides, are now causing many large organisations to rethink their approach to encryption and adopt a far more flexible, risk-oriented strategy.

The key question is how to select an approach that will meet the needs of your whole organisation and its culture. The starting point is to accept, no matter what anyone else tells you, that there is no single solution likely to be the ‘best fit’ - as demonstrated by unsuccessful enterprise encryption rollouts over the past 20-plus years. What is needed is an approach that enables the right encryption solution to be deployed to meet the needs of different parts of your organisation, while being managed centrally, simply and with the lowest total cost of ownership.

Full disk encryption will be necessary for some users, policy-based encryption for others, while self-encrypting drives will better suit others still. If you impose full disk encryption across the entire enterprise, you risk neutralising your solution for some users and breaking the encryption for others.
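As a purely illustrative sketch - the role names and policy labels below are invented for this example and describe no particular vendor’s product - the ‘right tool per user group, managed from one place’ idea can be pictured as a central policy table mapping types of user to the encryption approach applied to their devices:

    # Purely illustrative Python sketch; the roles and policy labels are
    # invented and do not represent any vendor's product or API.
    ENCRYPTION_POLICY = {
        "field_sales":  "full_disk_encryption",          # laptops that travel constantly
        "office_staff": "policy_based_file_encryption",  # shared machines, mixed data
        "engineering":  "self_encrypting_drive",         # high-value IP on fixed hardware
        "contractors":  "removable_media_encryption",    # USB sticks and external drives
    }

    def approach_for(role: str) -> str:
        """Look up the centrally managed approach for a role, with a safe default."""
        return ENCRYPTION_POLICY.get(role, "full_disk_encryption")

    print(approach_for("field_sales"))    # full_disk_encryption
    print(approach_for("unknown_role"))   # falls back to the default

One table, managed in one place, but a different approach for each population of users - which is exactly the blend a single-technology rollout cannot deliver.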

Even if a single approach is what the market is generally telling you to buy at the moment, take that advice with a large pinch of salt. A trusted supplier will recommend the correct blend for your organisation - not the same approach for all users. A one-size-fits-all approach may be the flavour of the month, but it will rarely work.

The security project leader or Chief Technology Officer (CTO) will have to come to terms with all of the new approaches to encryption and understand the implications of policy-based encryption, self-encrypting drives, Microsoft BitLocker and other yet-to-be-released offerings for their organisation before they can roll them out.

When choosing a supplier, it is wise to avoid those who have dozens, even hundreds, of different products and are, as the saying goes, “ten miles wide and ten inches deep” on any one product. The CTO will have to do his or her due diligence and ensure that the provider has real expertise in desktop and Mobile Data Protection.

Companies like ours can help plan each project with a customer’s specific operational requirements in mind, and certainly do not cut corners to provide a ‘one size fits all’ solution. That is the road to ruin. Any unlucky Chief Technology Officer who takes it will be joining Reginald Perrin for his final dip in the ocean of failure.

Why IT projects fail

Computer projects fail when they do not meet the following criteria for success:

1.    It is delivered on time.
2.    It is on or under budget.
3.    The system works as required.

There is no one overriding factor that causes project failure. A number of factors are involved in any particular project failure, some of which interact with each other. Here are six of the most important reasons for failure.

1. Lack of User Involvement

Without user involvement, nobody in the business feels committed to a system, and some may even be hostile to it. If a project is to be a success, senior management and users need to be involved from the start and continuously throughout the development.

2. Long or Unrealistic Time Scales

Long timescales for a project have led to systems being delivered for products and services no longer in use by an organisation. The key recommendation is that project timescales should be short, which means that larger systems should be split into separate projects.

3. Poor or No Requirements

Many projects have high-level, vague and generally unhelpful requirements. This has led to cases where the developers, having no input from the users, build what they believe is needed without any real knowledge of the business. Inevitably, when the system is delivered, business users say it does not do what they need it to do.

4. Scope Creep

Scope is the overall view of what a system will deliver. Scope creep is the insidious growth in the scale of a system during the life of a project. For example, a system intended to hold customer records is then asked to handle customer bills as well, then to publish those bills on the Internet, and so on.

5. No Change Control System

Change is happening at a faster rate than ever before, so it is not realistic to expect requirements to stand still while a system is being built. However, uncontrolled changes play havoc with a system under development and have caused many project failures.

6. Poor Testing

The developers will do a great deal of testing during development, but eventually the users must run acceptance tests to see whether the system meets the business requirements. However, acceptance testing often fails to catch many faults before a system goes live.

Sean Glynn is Vice President of Marketing at Credant