How Financial Services Businesses Can Build an Effective Data Quality Approach

Many financial services organisations today fail to implement effective data quality and risk management policies. Generally, they validate and cleanse the data they receive before distributing it more widely, and their overriding day-to-day focus is on ensuring downstream systems do not receive erroneous data. By Boyke Baboelal, Director – Data Services at Asset Control.

  • Monday, 18th March 2019 | Posted by Phil Alsop

That’s important, of course - but by prioritising reactive, ad hoc incident resolution in this way, organisations inevitably struggle to identify and address recurring data quality problems in a more structural way. Their analysis of data quality issues is commonly very limited, which means data quality itself is not demonstrable and reporting is difficult.

To remedy these problems, firms need the ability to carry out continuous analysis targeted at understanding their data quality and reporting on it over time. Our experience indicates that very few organisations across the industry are doing this today, and that’s a significant concern. After all, however much data cleansing an organisation does, if it fails to track what was done historically it will struggle to establish how often specific data items suffered gaps, completeness or accuracy issues, or to understand where those issues are most common. Financial markets are in constant flux and can be fickle, so the rules that screen data need to be reassessed periodically.
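
As a rough illustration of what that tracking can look like, the sketch below records individual validation outcomes over time and counts recurring issues per data item; the record structure, identifiers and issue categories are hypothetical rather than a reference to any particular platform.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical record of one validation outcome for one data item on one day.
@dataclass(frozen=True)
class ValidationResult:
    as_of: date            # business date the data relates to
    item: str              # e.g. an instrument identifier plus field name
    issue: Optional[str]   # "gap", "completeness", "accuracy", or None if clean

# Illustrative history; in practice this would be persisted by the data
# management platform rather than held in memory.
history = [
    ValidationResult(date(2019, 3, 1), "EQ:XYZ/close_price", "gap"),
    ValidationResult(date(2019, 3, 4), "EQ:XYZ/close_price", "gap"),
    ValidationResult(date(2019, 3, 4), "FX:EURUSD/spot", None),
    ValidationResult(date(2019, 3, 5), "EQ:XYZ/close_price", "accuracy"),
]

# Counting issues per (item, issue type) makes recurring, structural problems
# stand out from one-off incidents.
recurring = Counter((r.item, r.issue) for r in history if r.issue is not None)
for (item, issue), count in recurring.most_common():
    print(f"{item}: {count} x {issue}")
```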

Focusing data quality efforts exclusively on day-to-day cleansing also leaves organisations struggling to understand how often data quality mistakes are made, or how frequently quick bulk validations replace more thorough analysis. For many firms, that continued focus disguises the fact that they don’t have a clear understanding of data quality, let alone how to measure it or put an overarching data quality policy in place. When firefighting comes at the expense of properly understanding the underlying quality drivers, that’s a serious concern.

After all, in an industry where regulation on due process and fit-for-purpose data has grown increasingly prescriptive, the risks of failing to implement a data quality policy and data risk management processes can be significant.

Putting a Framework in Place

To address this, organisations need to implement a data quality framework. Indeed, the latest regulations and guidelines across the financial services sector, from Solvency II to FRTB, increasingly require them to establish one.

That means identifying what the critical data elements are, what the risks and likely errors or gaps in that data are, and what data flows and controls are in place. Only a few organisations have so far implemented such a framework. They may have previously put stringent IT controls in place, but these have tended to focus on processes rather than the issue of data quality itself.

By using a data quality framework, organisations can outline a policy that establishes a clear definition of data quality and the objectives of the approach. In addition, a framework can document the data governance approach, including not just processes and procedures but also responsibilities and data ownership.
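
As an illustration only, the inventory such a framework calls for can start as a simple structured register of critical data elements, each with an owner, a source, its known risks and the controls applied; the element names, owners and controls below are entirely hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical register entry for one critical data element, capturing the
# ownership, risks and controls a data quality framework asks firms to document.
@dataclass
class CriticalDataElement:
    name: str                                          # e.g. "instrument_close_price"
    owner: str                                         # accountable data owner
    source: str                                        # vendor or internal feed
    risks: List[str] = field(default_factory=list)     # known failure modes
    controls: List[str] = field(default_factory=list)  # checks applied to the element

register = [
    CriticalDataElement(
        name="instrument_close_price",
        owner="Market Data Operations",
        source="vendor feed",
        risks=["stale price", "missing price around market holidays"],
        controls=["large-move validation", "completeness check against expected universe"],
    ),
    CriticalDataElement(
        name="issuer_credit_rating",
        owner="Reference Data Team",
        source="internal golden copy",
        risks=["late rating updates"],
        controls=[],
    ),
]

# Elements documented without any control are the obvious first gaps to close.
uncontrolled = [cde.name for cde in register if not cde.controls]
print("Critical data elements without controls:", uncontrolled or "none")
```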

The framework will also help organisations establish the dimensions of data quality: that data should be accurate, complete, timely and appropriate, for example. For all these areas, key performance indicators (KPIs) need to be implemented to allow the organisation to measure what data quality means in each case. Key risk indicators (KRIs) need to be put in place and monitored to ensure the organisation knows where its risks are and that it has effective controls to deal with them. KPIs and KRIs should be shared with all stakeholders for periodic evaluation.
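
To make the measurement side concrete, the hedged sketch below computes two such KPIs, completeness and timeliness, for a single day's delivery and flags a KRI breach against illustrative tolerances; the item names, cut-off time and thresholds are assumptions, not prescribed values.

```python
from datetime import datetime

# Illustrative inputs: the items expected for the day and the arrival time of
# each item actually received.
expected_items = {"EQ:ABC", "EQ:XYZ", "FX:EURUSD", "FX:GBPUSD"}
received = {
    "EQ:ABC": datetime(2019, 3, 18, 17, 55),
    "FX:EURUSD": datetime(2019, 3, 18, 18, 20),
    "FX:GBPUSD": datetime(2019, 3, 18, 17, 40),
}
deadline = datetime(2019, 3, 18, 18, 0)  # assumed SLA cut-off

# Completeness KPI: share of expected items actually received.
completeness = len(received) / len(expected_items)

# Timeliness KPI: share of received items that arrived before the cut-off.
on_time = sum(1 for arrived in received.values() if arrived <= deadline)
timeliness = on_time / len(received) if received else 0.0

print(f"Completeness: {completeness:.0%}")  # 75% in this example
print(f"Timeliness:   {timeliness:.0%}")    # 67% in this example

# A KRI can then flag when a KPI drops below an agreed tolerance (illustrative
# thresholds), prompting escalation to the data owner.
if completeness < 0.99 or timeliness < 0.95:
    print("KRI breach: escalate to the data owner")
```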

How Data Quality Intelligence Plays a Part

A data quality framework will inevitably be focused on the operational aspects of an organisation’s data quality efforts. To take data quality up a level, businesses can employ a data quality intelligence approach, which enables a much broader level of insight, analysis, reporting and alerting.

This will in turn enable the organisation to capture and store historical information about data quality, including how often an item was modified and how often data was erroneously flagged, both good indicators of the level of errors and of the quality of the validation rules. More broadly, it will deliver analysis capabilities for these exceptions and any data issues arising, analysis capabilities for testing the effectiveness of key data controls, and reporting capabilities for data quality KPIs, vendor and internal data source performance, control effectiveness and SLAs.
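
As a sketch of one such metric, the hypothetical exception log below is used to estimate, per validation rule, how often flagged items were genuinely corrected versus confirmed unchanged, a simple proxy for both the error level and the quality of the rule.

```python
from collections import defaultdict

# Hypothetical exception log: each entry records which rule fired and whether
# the flagged value was ultimately corrected (a genuine issue) or confirmed
# unchanged (a false positive).
exceptions = [
    {"rule": "price_move_gt_10pct", "corrected": True},
    {"rule": "price_move_gt_10pct", "corrected": False},
    {"rule": "price_move_gt_10pct", "corrected": False},
    {"rule": "missing_rating", "corrected": True},
]

stats = defaultdict(lambda: {"fired": 0, "corrected": 0})
for exc in exceptions:
    stats[exc["rule"]]["fired"] += 1
    stats[exc["rule"]]["corrected"] += int(exc["corrected"])

# Rules that fire often but rarely lead to corrections are candidates for
# recalibration; reviewing these rates periodically keeps validation rules
# aligned with how the market is actually behaving.
for rule, s in stats.items():
    false_positive_rate = 1 - s["corrected"] / s["fired"]
    print(f"{rule}: fired {s['fired']} times, "
          f"false-positive rate {false_positive_rate:.0%}")
```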

In short, data quality intelligence effectively forms a layer on top of the operational data quality functionality provided by the framework. It helps financial services firms visualise what that framework has achieved, making sure that all data controls are effective and that the organisation is meeting its KPIs and KRIs. Rather than acting as an operational tool, it is a business intelligence solution, providing key insight into how the organisation is performing against its data quality goals and targets. CEOs and chief risk officers (CROs) can benefit from this capability, as can compliance and operational risk departments.

While the data quality framework helps manage the operational aspects of an organisation’s data quality efforts, data quality intelligence gives key decision-makers and other stakeholders insight into that approach, helping them measure its success and clearly demonstrate that the organisation is compliant with its own data quality policies and with relevant industry regulations.

Ultimately, there are many benefits that stem from this approach. It improves data quality in general, of course. Beyond that, it helps financial services organisations demonstrate the accuracy, completeness and timeliness of their data, which in turn helps them meet relevant regulatory requirements and assess compliance with their own data quality objectives. It’s clearly time for firms across the sector to bring their data quality processes up to speed.