Consumer Panel research highlights concerns about the use of personal data, algorithms and AI in financial services.
The Financial Services Consumer Panel has today published new research which found evidence suggesting that consumers with protected characteristics are experiencing bias due to the way financial firms use personal data and algorithms.
The Panel found anecdotal evidence and widespread concern that bias is introduced into financial systems and processes - intentionally or unintentionally - and that this can lead to consumer harm. For example, the research found that consumers were experiencing unfair bias relating to their ethnicity in access to products, product pricing and the service they received. You can read the full report here.
There is a notable absence of concrete evidence that algorithmic decision-making is the direct cause of consumer harm. The research suggests this is because firms’ use of personal data and algorithms is complex and opaque, making it difficult for anyone to understand:
- how consumers’ data is being used ‘behind the scenes’
- the consequences such decisions can have on consumers’ lives.
The Panel believes that this lack of evidence should itself be a significant cause for concern. The ethical use of algorithms and AI is a priority for the Panel, and effective governance to mitigate the risk of consumer harm is currently a topic of lively debate among consumer organisations, regulators and government.
The Panel’s recommendations to the FCA
The report notes the critical role of regulation in addressing the issues and calls for greater oversight of firms’ use of personal data, algorithms and AI. The FCA is currently consulting on the safe and responsible adoption of AI in financial services. As part of this work, the Panel calls on the FCA to take further action to better understand how firms are using consumers’ personal data and how this may affect consumer outcomes.
Based on our research findings, the Panel recommends that the FCA:
- Conducts analysis to assess whether consumer harm is occurring as a result of (intentional or unintentional) bias in firms’ collection and use of data and, if so, to identify the nature and scale of that harm. The FCA is the only organisation in a position to carry out this work, as it has the unique ability to request access to relevant data and information from firms.
- Provides, based on the findings of this analysis, guidance for firms on the ethical use of data including examples of good and poor practice.
- Insists that firms understand, and are able to demonstrate, what personal information is collected and why, how consumers’ personal data is used in their algorithms and AI, and how this drives decisions and outputs that affect consumer outcomes.
- Requires firms to disclose to the FCA, and make available to consumers, which data fields are used when making decisions that impact consumers. This will increase transparency and accountability around firms’ use of data and empower consumers, and organisations that represent them, to challenge it.
- Ensures that firms have a named individual accountable for their use of algorithms, decision engines and AI, either through expansion of the roles detailed under the SM&CR or some other mechanism.
- Embeds the consideration of algorithms and AI - and their potential to affect consumer outcomes and cause harm - as a cross-cutting theme relevant to all workstreams.
The Panel works closely with the FCA to raise awareness of potential harm and to promote consumer interests. We will continue to advocate for adequate consumer protection in the use of personal data, algorithms and AI.
In March 2023, the Financial Services Consumer Panel commissioned Thinks Insight & Strategy to look for evidence on whether the use of personal data and algorithms by financial services providers was leading to unfair, biased or discriminatory outcomes for consumers with protected characteristics.
This followed a report, ‘Discriminatory pricing. Exploring the ‘ethnicity penalty’ in the insurance market’, published by Citizens Advice in March 2022, which found that people of colour were paying £250 a year more for their car insurance than white people. The Panel wanted to explore whether there was evidence of similar discrimination and harm in other financial services markets.
The research included a review of published literature from the UK, Australia, Canada, the USA and Europe to explore examples of bias and best practice. It also included in-depth interviews with eight leading experts in the field of AI and consumer protection.