It has been two years since HM Revenue & Customs revealed that it had lost 25 million records, yet here we are, in late 2009, and data breaches remain commonplace.

We have seen countless organisations, in both the public and private sectors, lose sensitive data, whether through a laptop left on a train or an external hack.

With breaches still occurring regularly, it seems clear that many organisations still refuse to take the necessary precautions to secure their data, despite the wide availability of solutions that would enable them to do so.

The failings of HMRC certainly pushed security higher up both CIOs’ and CEOs’ agendas, and prompted a renewed drive to secure customer data.

However, has enough been done to ensure this information remains secure? Is an extra firewall or a new layer of encryption technology enough to ensure that another organisation does not cause its customers similar embarrassment? It is clear that the answer is no.

Today’s unprecedented rate of change demands business agility and faster time-to-market, whether that means introducing new products and services or responding to mergers and acquisitions.

The only effective way to ensure IT systems still operate at full capacity after any change to the business is application testing using realistic data. A recent survey indicates that the root cause of many data breaches is the use of live data in testing and development.

The survey, conducted amongst 1,350 IT practitioners in companies with revenues from $10 million to over $20 billion, looked at data security trends in testing and development [1].

Two-thirds of respondents experience change on a weekly basis, with a further quarter reporting that it takes place at least monthly. To be absolutely sure that IT systems are fully functional in production, the vast majority of surveyed organisations use live production data, such as customer records, employee records, credit card details and other business-confidential information, in the testing process.

This may raise a few eyebrows, but as long as the right security techniques are in place, organisations have nothing to worry about. They will be very aware of the risks of data breaches, given their high exposure in the press, so surely they do not want to fall foul of one, right? Wrong.

The survey went on to reveal that over two thirds (70%) of companies do not have the measures in place to mask this live data during development and testing. This alarming statistic is made even more staggering by the fact that over three quarters (79%) of all organisations have experienced a data breach in the last 12 months.

Despite having had their fingers burned once already this year, they are still putting their customers’ information, their own information and their reputation at stake by leaving themselves open to another breach, for the majority on a weekly basis. The risk is intensified by the unmanageable volumes of data being tested.

Three-quarters of respondents confirmed they use test data files that are larger than one terabyte, with some testing more than 50 terabytes of test data.

To give an example of the potential cost of a data breach, a recent study by the Ponemon Institute revealed that each record lost or stolen costs an organisation an average of $202 [2]. In today’s economic climate, this is a penalty no business can afford.

So the question that needs to be asked is: how can organisations mitigate this risk and keep their data watertight during development and testing?

To guarantee secure and realistic testing, businesses should implement an automated and repeatable test data management process. First, realistic testing requires realistic data – so they must begin by accessing relational and hierarchical databases and other data stores from mainframe and distributed systems. Next, this test data should be subsetted, both to make it more manageable and to reduce data storage and test execution costs.
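As a purely illustrative example of the subsetting step, the short Python sketch below samples a driving table of customers and keeps only the orders that reference them, so the reduced data set stays referentially consistent. The table names, fields and sample size are assumptions made for the sketch, not details taken from the survey or from any particular product.

import random

# Hypothetical production extracts, represented here as in-memory rows.
customers = [{"customer_id": i, "name": f"Customer {i}"} for i in range(1, 10001)]
orders = [{"order_id": i, "customer_id": random.randint(1, 10000), "total": 99.0}
          for i in range(1, 50001)]

def subset(customers, orders, sample_size=500, seed=42):
    """Sample a fraction of customers, then keep only the orders that
    reference them, so foreign-key relationships still hold in the subset."""
    rng = random.Random(seed)                     # fixed seed makes the subset repeatable
    sampled = rng.sample(customers, sample_size)  # the driving table
    keep_ids = {c["customer_id"] for c in sampled}
    related = [o for o in orders if o["customer_id"] in keep_ids]
    return sampled, related

test_customers, test_orders = subset(customers, orders)
print(f"{len(test_customers)} customers and {len(test_orders)} orders in the test subset")

A real test data management tool would of course work against the databases themselves and honour every relationship automatically; the point of the sketch is simply that a smaller, consistent slice of production data is far cheaper to store and test against.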

Following this stage, organisations must ensure that the process conceals private data within test data sets, both to comply with data privacy regulations and to eliminate the risk of data breaches. With a better test data management process, companies can accelerate the testing of high-quality applications while lowering its cost. At the same time, they avoid the loss of goodwill, costly penalties and regulatory non-compliance that stem from data breaches.
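The masking step can be illustrated in the same spirit. The sketch below replaces direct identifiers with deterministic pseudonyms generated from a keyed hash, so related records still join on the same masked value while the original names, email addresses and card numbers never reach the test environment. The field names and the key handling are assumptions made for this example only.

import hashlib
import hmac

# In practice the key would be held outside the test environment; this is a placeholder.
MASKING_KEY = b"replace-with-a-secret-held-outside-test"

def pseudonym(value, length=12):
    """Keyed hash of the original value, truncated to a short token.
    The same input always yields the same token, which preserves joins."""
    digest = hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:length]

def mask_customer(record):
    """Return a copy of a customer record with private fields concealed."""
    masked = dict(record)
    masked["name"] = "cust_" + pseudonym(record["name"])
    masked["email"] = pseudonym(record["email"]) + "@example.test"
    # Keep only the last four digits of the card number, a common masking convention.
    masked["card_number"] = "#### #### #### " + record["card_number"][-4:]
    return masked

live = {"name": "Jane Doe", "email": "jane.doe@example.com",
        "card_number": "4111111111114242"}
print(mask_customer(live))

Commercial masking tools go considerably further, preserving formats, distributions and cross-system consistency, but even this simple substitution shows how test data can stay realistic without exposing the underlying customer.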

In March 2009, Joseph Feiman, research VP and Gartner Fellow, endorsed data masking, stating: “Data masking raises enterprises' security and privacy assurance against insiders' abuses and helps enterprises to be compliant with data-centric regulations… [It] is an integral part of software life cycle (SLC) processes.” [3]

As businesses strive for organic growth in the toughest recession for 100 years, they need to ensure they are looking after the information they already possess. Testing is always going to be an integral part of a company’s development – the survey above shows just how often it has to take place.

CIOs need to establish a firm data protection strategy that covers the production environment as well as the use of live data in testing and application development, and the assessment and implementation of masking and subsetting techniques needs to be an integral part of it.

Cutting corners leaves organisations everywhere vulnerable to a major data leak, an event that could do irreparable damage to a company’s data and its reputation.


[1] Data Security in Development & Testing, Micro Focus and the Ponemon Institute, August 2009

[2] Cost of a Data Breach Study, Ponemon Institute, February 2009

[3] Data Masking Best Practices, Gartner, March 2009

Peter Mollins is director of enterprise application management software company Micro Focus