Data quality controls in benchmarks sector – review

Multi-firm reviews | Published: 28/07/2025 | Last updated: 28/07/2025

We're setting out findings from our review of how benchmark administrators manage data risks. We're also sharing wider lessons for the sector and our next steps.

1. Who this applies to

This multi-firm review will be of interest to:

  • Benchmark administrators.
  • Price reporting agencies.
  • Data suppliers.
     

2. Why we did this work

In our recent portfolio letter, we outlined concerns with firms’ data quality controls. Through previous supervisory work, we had found cases where incidents had occurred due to weaknesses in firms’ data ingestion (collecting and importing data) and monitoring controls.

Benchmark administrators (BMAs) should be confident that their control frameworks adequately reflect the risks from data sources. We reported we had seen instances where firms’ data controls did not appear to have kept pace with the development of their products.

This review aimed to understand how firms meet their obligations under FCA Principles 2 and 3 (‘due skill, care and diligence’ and ‘adequate risk management systems’), as well as the relevant requirements under the UK Benchmarks Regulation.

Earlier in 2025, we issued a short survey to 10 firms of varying sizes with different structures, business models and product ranges. We asked questions covering 5 key themes:

  • Supplier onboarding.
  • Data quality oversight.
  • Resilience and incident response.
  • Governance & assurance.
  • Emerging risk awareness.

We met firms to clarify our understanding and seek further detail. 

3. What we found

3.1. Overall assessment

Although there were some good arrangements in evidence at each firm, the overall picture was of varied practices which didn’t consistently support a robust control environment.

For each of the 5 themes covered by our questionnaire, we outline below the factors which appeared helpful in managing the associated risks.

3.2. Data supplier oversight

Data suppliers have a major impact on the accuracy and resilience of the benchmarks that BMAs produce. We wanted to understand how BMAs assessed the risk of a supplier when it was onboarded and how BMAs ensured they identified any changes to the supplier’s risk profile.

The significance of any single supplier may change over time. This can be because the supplier itself has changed, or because the way its data is used in benchmarks has changed. Inadequate oversight of data suppliers is a risk to the integrity, continuity and compliance of benchmarks. Proactive governance and ongoing data supplier risk management contribute to an effective benchmark controls framework.

We considered whether BMAs’ processes for onboarding their various input data suppliers reflected each supplier’s different level of risk. Good oversight saw the supplier onboarding framework clearly documented, consistently applied and regularly monitored at the appropriate level within the firm.

Scheduled data supplier risk assessments supported efficiency and dynamic risk management. This approach meant ongoing due diligence could be aligned to any changes in the supplier’s risk profile.

Well-designed management information (MI) and key risk indicators (KRI), with metrics set at the right level, demonstrated appropriate oversight of supplier relationships. Good MI and KRI design improved the maturity of firms’ risk management frameworks by enabling them to, for example, dynamically adjust supplier oversight.

Firms appeared better able to identify the impact of a data error when they could easily trace the connections between data suppliers and the benchmarks reliant on them. This data lineage capability also enabled firms to better monitor their data suppliers’ performance. We observed that, while there was a front-end cost in developing traceability, it enabled simpler investigations when problems arose. Firms with good mapping capability also appeared better placed to pinpoint root causes when a benchmark calculation issue was data-related.
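The lineage capability described above can be illustrated with a minimal sketch. This is not any firm's actual system; the supplier and benchmark names, and the mapping itself, are hypothetical. The point is that a single maintained mapping supports both directions of investigation: impact analysis when a supplier reports an error, and root-cause triage when a benchmark looks wrong.

```python
# Hypothetical sketch of data lineage between suppliers and benchmarks.
# All supplier and benchmark names are illustrative only.

from collections import defaultdict

# Forward map: which benchmarks consume each supplier's data.
SUPPLIER_TO_BENCHMARKS = {
    "vendor_a": {"equity_index_1", "equity_index_2"},
    "vendor_b": {"equity_index_2", "fx_benchmark_1"},
    "vendor_c": {"fx_benchmark_1"},
}


def impacted_benchmarks(supplier: str) -> set:
    """Benchmarks potentially affected by an error at one supplier."""
    return SUPPLIER_TO_BENCHMARKS.get(supplier, set())


def candidate_suppliers(benchmark: str) -> set:
    """Reverse lookup: suppliers to investigate when a benchmark looks wrong."""
    reverse = defaultdict(set)
    for supplier, benchmarks in SUPPLIER_TO_BENCHMARKS.items():
        for b in benchmarks:
            reverse[b].add(supplier)
    return reverse[benchmark]
```

Whether this mapping lives in a document or a system matters less than it being kept current; the front-end cost noted above is in maintaining the mapping, not in querying it.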

3.3. Data quality oversight

Effective risk management was best supported when controls were proportionate to the risk posed by the data being ingested. We saw different models in place across the firms we met. Some firms had a standardised set of controls across all data types; others had different layers of controls which they applied for specific types of data. Either of these options might be appropriate for a given firm.

A one-size-fits-all process was sometimes represented by firms as offering consistency and operational efficiency. A more tailored approach to oversight appeared to work better for firms which administered benchmarks across different asset classes. These firms had considered distinct data characteristics, market structures and their respective risks when designing their controls. In practice, this meant that data quality risks were addressed from different angles, including the nature, source and ultimate use of the data.
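One way to picture a tailored, proportionate control model is as layers of checks keyed to the risk tier of the data source. The sketch below is a hypothetical illustration (the check names, tiers and thresholds are assumptions, not anything we observed at a specific firm): low-risk data gets a basic presence check, while higher-risk data accumulates range and day-on-day movement checks on top.

```python
# Hypothetical sketch of tiered data-quality checks, where the set of
# controls applied is proportionate to the risk tier of the data source.

def check_present(value):
    """Basic control: a value was actually supplied."""
    return value is not None


def check_in_range(value, low=0.0, high=1e6):
    """Plausibility control: value falls within an expected band."""
    return value is not None and low <= value <= high


def check_day_on_day_move(value, previous, max_move=0.10):
    """Volatility control: flag moves larger than 10% versus the prior value."""
    if value is None or previous in (None, 0):
        return False
    return abs(value - previous) / abs(previous) <= max_move


CHECKS_BY_TIER = {
    "low":    [check_present],
    "medium": [check_present, check_in_range],
    "high":   [check_present, check_in_range, check_day_on_day_move],
}


def validate(value, previous, tier):
    """Run every check for the tier; return the names of failed checks."""
    failures = []
    for check in CHECKS_BY_TIER[tier]:
        if check is check_day_on_day_move:
            ok = check(value, previous)
        else:
            ok = check(value)
        if not ok:
            failures.append(check.__name__)
    return failures
```

A standardised model would collapse `CHECKS_BY_TIER` to a single list; the trade-off discussed above is between that consistency and the sharper coverage a tiered design gives across different asset classes.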

Oversight was most effective in identifying and addressing error themes when firms had suitably detailed MI, kept it under review over time and reviewed it at the right level within the firm. Where MI design appeared unchanged for extended periods, it seemed less certain the MI would be capable of identifying evolving risks or enabling the right decision-making.

The most effective MI was appropriately detailed, providing insight and utility. This in turn supported robust governance, accountability and informed decision-making, while reducing dependence on key personnel for interpretation.

Based on the information given to us, we were not always able to understand where decisions were being taken as a result of the oversight in place. In some cases, actions were being taken due to early identification of potential risks, while in others it was less clear that the oversight arrangements had led to specific changes or improvements.

In some cases, the apparent gap seemed to be linked to inconsistent record-keeping. Firms’ actions in response to oversight controls weren’t always well documented, with decisions being made outside of formal meetings. In other cases, we found less evidence of follow-up actions. One positive indicator of oversight quality was where firms could show they had used data to engage with vendors and so reduce operational risks.

3.4. Resilience and incident response

We looked at BMAs’ ability to respond to errors in data supplied and recover from interruptions to data provision. Enhanced resilience reduces risk to benchmark users as well as lowering operational risk to the BMA itself. As part of their obligations under our Principles to operate their businesses with due skill, care and diligence, we expect firms to maintain their operational robustness and incident response capability. This is so that they can maintain the continuity of their benchmarks and can respond effectively to disruptions.

Resilience appeared less effective when contingency plans for data availability and integrity were less comprehensive or unlikely to be scalable. We saw that some BMAs had thought about how relevant external stakeholders understood the contingency plans in place, and how these might be enacted. This clarity and predictability should reduce challenge when these plans are implemented. The best contingency plans were clear, unambiguous, scalable and had been validated.

Having a clear line of sight between the data received from providers and the benchmarks where their data was used helped firms respond to incidents. Firms with broader product ranges often felt it was unsustainable to maintain data mapping documents. Instead, they had put in place systems which enabled traceability between data and benchmarks. A reliable lineage capability, whether system-based or documentary, should enable firms to quickly identify the impact of errors at the necessary pace and scale. This capability would also strengthen firms’ operational resilience.

3.5. Governance and assurance

We recognise that BMAs have different governance arrangements. However, we expect firms’ management bodies to effectively oversee strategic decision-making, risk management and regulatory compliance. Having a strong governance and assurance framework supports the accuracy, integrity and reliability of benchmarks. This also helps demonstrate a firm’s commitment to high standards of accountability and market integrity.

Some firms shared thoughtfully designed MI which appeared likely to enable good decision-making. There was a clear link between the underlying data and the issues which the committee or management body were being asked to consider.

Some firms recognised that the appropriate level of detail and context to be provided in MI packs might vary over time. For those reasons, data escalated to governance bodies should be periodically tested to ensure that it continues to allow effective risk identification and decision-making.

Where we found evidence of good governance, a key enabler was a clear and documented escalation process. This included prescribed thresholds, appropriate accountability and criteria that seemed to have been refined to address evolving risks. When firms accurately recorded escalations and decision-making, they could have greater confidence in their risk oversight. This record-keeping also allowed them to show evidence that their governance arrangements were effective.
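The escalation process described above, prescribed thresholds, named accountability and recorded decisions, can be sketched in a few lines. This is a hypothetical illustration under assumed names and thresholds, not a prescribed design: each key risk indicator carries amber and red thresholds and an accountable owner, and every assessment is logged so the firm can later evidence its oversight.

```python
# Hypothetical sketch of a documented escalation process: each KRI has
# prescribed amber/red thresholds and a named accountable owner, and
# every escalation decision is recorded.

from dataclasses import dataclass, field


@dataclass
class KRI:
    name: str
    amber: float   # at or above this, escalate to operational management
    red: float     # at or above this, escalate to the governance body
    owner: str


@dataclass
class EscalationLog:
    entries: list = field(default_factory=list)

    def assess(self, kri: KRI, value: float) -> str:
        """Compare a KRI reading against its thresholds and record the outcome."""
        if value >= kri.red:
            level = "red"
        elif value >= kri.amber:
            level = "amber"
        else:
            level = "green"
        # Recording the decision is what lets the firm evidence that its
        # governance arrangements are effective.
        self.entries.append((kri.name, value, level, kri.owner))
        return level
```

The record-keeping step is deliberately unconditional: logging green outcomes as well as escalations is what distinguishes evidenced oversight from decisions made outside formal meetings.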

Assurance arrangements across the firms varied greatly, often appearing fragmented and disjointed. In some firms, the assurance around technical controls appeared weak, with risk oversight difficult to measure beyond the first line. Firms with more mature risk management had a risk-based data quality control framework, underpinned by independent second-line monitoring, periodic third-line validation and appropriate escalation mechanisms.

3.6. Emerging risk awareness

In recent years, the external risk environment has changed significantly. Some of these changes, for example in technology, give firms opportunities to improve their operational resilience and the quality of their product offerings. Other developments, including increased market volatility and changes in market dynamics, create challenges to firms’ operations or may undermine their resilience. As part of this review, we wanted to see how BMAs undertook proactive risk management.

We didn’t consistently see evidence that firms’ control frameworks, including policies and processes, were evolving to address new and emerging risks. This included risks from within the business (eg growth, new products) and from outside the business (eg market developments and relatively novel technologies, such as generative AI).

For risks emerging from within the business, appropriately designed and calibrated MI appeared to make it quicker to identify new issues and risks. Periodic reviews of risk management frameworks appeared to leave firms better prepared for emerging issues. Some firms had strengthened their risk awareness with independent input from third-line functions, external stakeholders or independent non-executive directors. A further good indicator of risk management was where firms undertook purposeful horizon-scanning to identify external risks and opportunities.

4. What we expect from firms

Different types of data may require different controls. The diverse nature of firms’ ingested data, sources and collection practices can complicate the design of controls. However, as products become more complex and product ranges grow, firms need to ensure their control frameworks remain appropriate and proportionate to the risks that arise.

Gaps in control frameworks increase the risk of inaccurate benchmarks being produced, with the potential for harm to end-users. Weaknesses in controls create vulnerabilities for the firms themselves, including reputational and commercial damage. From our supervisory work, we have seen operational harm to BMAs due to increased numbers of errors and the resource lost in investigating and resolving them. Enhancing data controls can improve firms’ resilience and operational efficiency.

Firms’ senior managers may want to consider how these findings may apply to their businesses and adopt any relevant examples of good practice.  
 

5. Next steps

We will be carrying out further work later in 2025 and in 2026 on other risks set out in the portfolio letter, including benchmark controls and corporate governance.

Addressing any weaknesses in data quality should help firms gain confidence in, and better demonstrate, the effectiveness of their arrangements in these additional areas, as well as strengthening their overall operational resilience.

If you have any questions about this review, please contact [email protected].