Turning compliance into a mandate for data governance

IT and business technology professionals in financial services should use the increasingly intrusive compliance demands of governments and regulators to tackle age-old data management problems.

The insurance industry, for example, is facing Solvency II, a fundamental review of the capital adequacy regime for the European insurance industry.

The aim is to establish a revised set of EU-wide capital requirements and risk management standards that will replace the current solvency requirements. Rather than chafe against the regulations that will result, perhaps we should welcome them.

The new regulations should sound the death knell for spreadsheet sprawl and for inaccurate, duplicate and poorly integrated data. After all, as many IT professionals have long realised, data is a key strategic asset and should be supervised and invested in accordingly. It seems this view is now shared by the regulator, as the FSA’s Head of Insurance, Julian Adams, made plain at the FSA’s recent Solvency II conference:

"Solvency II will bring about very significant changes to the reporting regime for all firms, whether they are using a model or not. From a regulatory perspective this is a considerable enhancement in the quality of the data we will receive, and in the way it is submitted to us and shared amongst supervisors across Europe."

The above quote demonstrates the change in approach we are seeing from the FSA. High quality data is important to capital adequacy compliance because a capital model, even a sophisticated one, is only as good as the data it analyses. This situation isn’t just an insurance issue; banks, too, must ensure they are feeding capital models with accurate data. However, the regulatory approach has changed since Basel II came into force for banks.

In many of its communications about Solvency II, the FSA has made it plain that it will not only inspect the results of model calculations and the way the model is designed, but will also require evidence of data governance - that is, how data arrives in the model and how the insurer can prove that data has been validated.

Take, for example, the first FSA analysis of UK insurers’ internal capital adequacy models. The IMAP Thematic Review, published in February this year, states: “We witnessed little challenge or discussion on data quality at board level. We expect issues and reporting on data governance to find a regular place within board and committee discussions. Firms need to ensure that adequate and up-to-date quality management information is produced. It is important that the board has the necessary skills to ask probing questions.”

As a veteran of the data management industry, I have witnessed organisations across different sectors grapple with the challenge of ensuring high-quality business data, but never before have I seen such a direct regulatory mandate for board-level supervision of data. This is an opportunity, not a threat. Data and IT professionals within insurance companies have been presented with the foundation for a business case to tackle the long-standing data problems the industry has faced. Data is now a boardroom issue, and that necessitates investment, KPIs and ongoing governance.

So how do insurers go about the task of guaranteeing that their business is operating with accurate, complete and appropriate data? In our experience of helping both large and small insurers - life and non-life - the basic steps are relatively similar. The overall objective is to develop a framework for data governance that is part of business-as-usual operations. This means working to define business rules that govern the content of data and enforce data management policies. It means having a clear view of what a data record should contain, with definitions of sources and attributes - a data dictionary, if you will.
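To make this concrete, below is a minimal sketch of what such a data dictionary and its business rules might look like in code. The field names, sources and thresholds are hypothetical, chosen purely for illustration; nothing here prescribes a particular product or platform.

    from dataclasses import dataclass
    from typing import Any, Callable

    # A hypothetical data dictionary entry: what an attribute should contain,
    # which system it originates from, and the business rule that governs it.
    @dataclass
    class Attribute:
        name: str
        source: str                  # originating system
        description: str
        rule: Callable[[Any], bool]  # returns True if the value is valid

    DATA_DICTIONARY = [
        Attribute("policy_id", "policy administration system",
                  "Unique policy reference",
                  rule=lambda v: isinstance(v, str) and v.strip() != ""),
        Attribute("retirement_age", "pension origination system",
                  "Age at which the policyholder plans to retire",
                  rule=lambda v: isinstance(v, int) and 50 <= v <= 75),
    ]

    def validate(record: dict) -> list[str]:
        """Apply every business rule to a record and return the failures."""
        failures = []
        for attr in DATA_DICTIONARY:
            value = record.get(attr.name)
            if value is None or not attr.rule(value):
                failures.append(f"{attr.name} (source: {attr.source})")
        return failures

    print(validate({"policy_id": "P-1001", "retirement_age": 63}))  # []
    print(validate({"policy_id": "P-1002"}))  # ['retirement_age (source: pension origination system)']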

Finally, the framework must be based on an appreciation of what accurate, complete and appropriate mean for the business, and it should have a mechanism to report on those three metrics over time. If you can establish these elements, your firm will be in a good position to provide a set of key data indicators to the board and regulator that show how data is being governed.
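As an illustration of the reporting side, the sketch below computes the three metrics for one batch of records and could be scheduled periodically to build the time series a board dashboard needs. It reuses the hypothetical rules above; what counts as “appropriate” is firm-specific, so that check is only a labelled placeholder.

    import datetime

    def key_data_indicators(records: list[dict]) -> dict:
        """Compute one run of the three metrics for the retirement_age field.

        Completeness: share of records where the field is present at all.
        Accuracy: share of present values passing the business rule (50-75).
        Appropriateness: firm-specific in practice; as a placeholder we treat
        a suspiciously common default value as inappropriate.
        """
        total = len(records)
        present = [r for r in records if r.get("retirement_age") is not None]
        accurate = [r for r in present if 50 <= r["retirement_age"] <= 75]
        appropriate = [r for r in accurate if r["retirement_age"] != 65]  # hypothetical staleness check

        return {
            "run_date": datetime.date.today().isoformat(),
            "completeness": len(present) / total if total else 0.0,
            "accuracy": len(accurate) / len(present) if present else 0.0,
            "appropriateness": len(appropriate) / len(accurate) if accurate else 0.0,
        }

    print(key_data_indicators([
        {"retirement_age": 63}, {"retirement_age": 65}, {"retirement_age": 48}, {},
    ]))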

As with any large data management programme, one of the most significant hurdles is ensuring the organisation is able to work towards united goals when individual lines of business or stakeholders often don’t immediately benefit from the initiative. Let’s take a simple insurance example: the retirement age field within policyholder records. It is important for an insurer to understand when its policyholders plan to retire, as this affects when a pension turns into an annuity, which has knock-on investment and capital adequacy effects, and this data would certainly be fed into capital models. In addition, the retirement age field can be used to trigger annuity awareness marketing campaigns several years before a customer retires. Simple errors in this field are common: values are often missing altogether, or the field is populated with an antiquated value. One insurer we worked with found that all customers working with one of its independent financial advisers were due to retire at 63.

This problem was only identified by conducting data profiling - analysis of the insurer’s data to locate and pinpoint anomalies, patterns and errors. The plan for fixing this seemingly simple data quality issue was organisation-wide: the pension origination system was modified to make retirement age a required field; call centre staff were retrained on the importance of the field and to re-enquire about changes to retirement age when contacting customers; the IFA groups were required to update the field in their annual customer meetings; and ongoing data monitoring was put in place, with KPIs and a dashboard to report on progress over time.
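A basic version of that profiling step is easy to sketch. The pandas snippet below (the column names and data are invented for illustration) flags the three error patterns described above: missing values, out-of-range “antiquated” values, and an adviser whose entire book shares a single retirement age - the pattern that exposed the anomaly at 63.

    import pandas as pd

    # A hypothetical extract of policyholder records: the adviser firm and
    # the retirement_age value that would be fed into the capital model.
    df = pd.DataFrame({
        "adviser":        ["IFA-A", "IFA-A", "IFA-A", "IFA-B", "IFA-B", "IFA-C"],
        "retirement_age": [63,      63,      63,      65,      None,    48],
    })

    # 1. Completeness: records missing the field altogether.
    missing = df["retirement_age"].isna().sum()
    print(f"missing retirement_age: {missing} of {len(df)}")

    # 2. Range check: implausible values (antiquated defaults or keying errors).
    out_of_range = df[(df["retirement_age"] < 50) | (df["retirement_age"] > 75)]
    print("out-of-range values:\n", out_of_range)

    # 3. Distribution by adviser: a firm whose every customer "retires" at the
    #    same age suggests a bulk default rather than genuine data.
    suspicious = (
        df.dropna(subset=["retirement_age"])
          .groupby("adviser")["retirement_age"]
          .agg(["nunique", "count"])
          .query("nunique == 1 and count > 1")
    )
    print("advisers with a single uniform retirement age:\n", suspicious)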

This example demonstrates the depth of some seemingly basic data quality problems. From a Solvency II standpoint, insurers need to have established the data monitoring and governance procedures that allow them to report on the accuracy, completeness and appropriateness of data from sources such as the pension origination system. This approach is the key to ensuring the board can understand the state of the company’s data asset, and that capital adequacy modelling can be conducted with confidence.

by Colin Rickard, Business Development Director EMEA, DataFlux