Time for firms to prepare for tougher data quality management guidelines

As the Basel Committee on Banking Supervision (BCBS) and national regulators across the world become increasingly prescriptive about firms’ data requirements, and more willing to challenge them, the need for firms to undertake infrastructure and technology enhancements to upgrade their data management systems and processes intensifies.

In the Eurozone, the ECB is clearly setting out, through the Targeted Review of Internal Models (TRIM), its expectations on the data quality management, governance and control frameworks that firms should have in place for credit risk data.

TRIM first introduced the ECB’s expectations on modelling methodology and assumptions, as well as data quality-related principles, in February 2017. Focusing on the processes and systems surrounding Internal Ratings Based (IRB) models, these guidelines set out the data-related standards that firms should follow throughout their risk modelling processes.

The ECB’s on-site inspections during TRIM have revealed that most banks’ data processes do not conform to regulatory expectations, and in some cases the major gaps identified have triggered punitive capital add-ons from the ECB. This prompted an ECB consultation on 7 September 2018, which broadly kept the principles on data quality unchanged but increased the rigour of the data quality requirements with respect to systems, processes and governance. In this blog we highlight the main implications of the ECB’s consultation for data processes.

While TRIM primarily involves the firms under the ECB’s direct supervision, these data-related requirements are relevant to all firms, as they broadly echo the Basel Committee’s Principles for effective risk data aggregation and risk reporting (BCBS 239) and have been drafted in close cooperation with national regulators.

Firms should take five broad implications into account when revising their legacy data management processes.

First, they should have data governance policies, a control framework and data processing procedures in place to ensure the accuracy, completeness and appropriateness of the data used in their models or current exposure data. These should be based on a global data quality framework that has been approved by the board and distributed across the firm.

Second, firms should deploy robust, well-documented and adequately tested IT systems, with clearly defined implementation testing processes. Their data quality controls should focus on the final dataset used for probability of default (PD), loss given default (LGD) and conversion factor (CF) modelling, ensuring a systematic approach to data quality measurement and appropriate monitoring and reporting along the IRB data chain. In particular, they should document in adequate detail the information extracted from and loaded into specific systems and applications, so that the process for constructing the modelling reference datasets is not left insufficiently described.
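To illustrate what a systematic approach to data quality measurement could look like in practice, the sketch below runs basic completeness, validity and uniqueness checks on a PD/LGD/CF modelling dataset. It is a minimal illustration only: the column names, thresholds and the use of pandas are assumptions for the example, not part of the ECB’s guidance, and real implementations will depend on each firm’s data architecture.

```python
import pandas as pd

# Hypothetical column names for an IRB modelling reference dataset;
# real datasets, fields and thresholds will differ by firm.
REQUIRED_COLUMNS = ["exposure_id", "pd_estimate", "lgd_estimate", "ccf_estimate"]

def run_data_quality_checks(df: pd.DataFrame) -> dict:
    """Run basic completeness, validity and uniqueness checks on a modelling dataset."""
    results = {}

    # Completeness: no required column may be missing or contain nulls.
    results["missing_columns"] = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    results["null_counts"] = {
        c: int(df[c].isna().sum()) for c in REQUIRED_COLUMNS if c in df.columns
    }

    # Validity: risk parameters must lie within plausible ranges.
    if "pd_estimate" in df.columns:
        results["pd_out_of_range"] = int(
            ((df["pd_estimate"] < 0) | (df["pd_estimate"] > 1)).sum()
        )
    if "lgd_estimate" in df.columns:
        results["lgd_out_of_range"] = int(
            ((df["lgd_estimate"] < 0) | (df["lgd_estimate"] > 1)).sum()
        )

    # Uniqueness: each exposure should appear only once in the final dataset.
    if "exposure_id" in df.columns:
        results["duplicate_exposures"] = int(df["exposure_id"].duplicated().sum())

    return results
```

The point of such checks is not the specific rules but that they are applied systematically to the final modelling dataset, with results fed into regular monitoring and reporting along the IRB data chain.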

Third, they should have procedures, such as data quality dashboards, in place for assessing compliance with the relevant data quality standards. They should have adequate standards for data reconciliation across and within systems, including between accounting and IRB data. This should also include documentation on the IT infrastructure directly or indirectly related to IRB, the existing data quality framework and other related processes.
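As one illustration of the kind of reconciliation such procedures might automate, the sketch below compares exposure totals per counterparty between an accounting extract and an IRB extract and flags breaks above a tolerance. The column names, table layout and tolerance are hypothetical assumptions for the example.

```python
import pandas as pd

def reconcile_exposures(accounting: pd.DataFrame,
                        irb: pd.DataFrame,
                        tolerance: float = 0.01) -> pd.DataFrame:
    """Compare exposure totals per counterparty between two systems and
    return the counterparties whose difference exceeds the relative tolerance."""
    acc_totals = accounting.groupby("counterparty_id")["exposure_amount"].sum()
    irb_totals = irb.groupby("counterparty_id")["exposure_amount"].sum()

    # Align both views on the union of counterparties; missing entries count as zero.
    merged = pd.concat({"accounting": acc_totals, "irb": irb_totals}, axis=1).fillna(0.0)

    merged["difference"] = merged["irb"] - merged["accounting"]
    merged["break_flag"] = merged["difference"].abs() > tolerance * merged["accounting"].abs()

    return merged[merged["break_flag"]]
```

Feeding the output of such reconciliations into a dashboard, with owners and remediation deadlines attached, is one way of evidencing compliance with the firm’s data quality standards.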

Fourth, firms should avoid a siloed approach to the management of data quality across source systems, informational systems and modelling datasets. Manual inputs should be fully depicted within the system flowcharts and integrated data dictionaries, allowing a third party to gain a full understanding of the existing data framework. In addition, firms should ensure that their data quality management activities are independent of data processing activities, and that these activities are periodically assessed by the internal audit function or another independent auditing unit.
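An integrated data dictionary can take many forms. The sketch below shows one possible minimal structure in which manual inputs are flagged explicitly, so a third party can see where human intervention enters the data flow. The fields, system names and example entry are illustrative assumptions, not prescribed by the ECB.

```python
from dataclasses import dataclass, field

@dataclass
class DataDictionaryEntry:
    """Minimal data dictionary record; the fields shown are illustrative only."""
    name: str
    description: str
    source_system: str
    data_type: str
    manual_input: bool = False          # flag manual adjustments explicitly
    transformations: list = field(default_factory=list)

# Example entry documenting a manually adjusted field, so that the point of
# human intervention is visible alongside the automated transformations.
collateral_value = DataDictionaryEntry(
    name="collateral_value",
    description="Market value of pledged collateral used in LGD estimation",
    source_system="COLLATERAL_MGMT",    # hypothetical system name
    data_type="decimal(18,2)",
    manual_input=True,
    transformations=["FX conversion to EUR", "haircut applied per policy"],
)
```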

Finally, firms should have a full audit trail for data quality checks and controls to ensure the traceability of key variables from the modelling and current exposure data sets to the source systems and existence of updated and unique data dictionaries. In particular, firms should be able to provide evidence for the traceability of the production of the PD/LGD/CF databases, particularly in between the source systems, data warehouses and/or datamarts.
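One simple way to support this kind of traceability is to record, for every data movement between source systems, data warehouses and datamarts, a timestamped lineage entry together with a fingerprint of the dataset. The sketch below is an assumed, minimal illustration of that idea; the step names and systems are hypothetical.

```python
import hashlib
import json
from datetime import datetime, timezone

def dataset_fingerprint(records: list[dict]) -> str:
    """Hash a canonical JSON representation of the data so that any later
    change to the dataset is detectable from the audit trail."""
    canonical = json.dumps(records, sort_keys=True, default=str)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def log_lineage_step(audit_trail: list, step: str, source: str,
                     target: str, records: list[dict]) -> None:
    """Append one traceability record for a data movement or transformation."""
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "step": step,
        "source": source,
        "target": target,
        "row_count": len(records),
        "fingerprint": dataset_fingerprint(records),
    })

# Hypothetical usage across the IRB data chain.
audit_trail: list = []
extracted = [{"exposure_id": "E1", "pd_estimate": 0.02}]
log_lineage_step(audit_trail, "extract", "LOAN_SYSTEM", "staging_area", extracted)
log_lineage_step(audit_trail, "load", "staging_area", "PD_DATAMART", extracted)
```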

Careful alignment with these regulatory expectations on data sourcing, governance and reporting may reduce costs in the long term by avoiding duplicated effort, time and expense. Firms that take early action to improve their data processing, governance and control frameworks are likely to gain a competitive edge.

Luís Filipe Barbosa | Partner, PwC Portugal

Mete Feridun | Manager, Financial Services Risk and Regulation
