The increasing importance of technological change creates new challenges for regulation. In the latest of our series of articles on future market dynamics, we consider market power in the age of Big Tech and how consumers’ data can be used for good, but also as a tool for abuse.
Big Tech in finance
Big Tech firms are increasingly collaborating with financial institutions and Fintech companies to provide a wide array of new and enhanced financial services.
These range from the simple and well-established, such as Samsung’s recently announced entry into the payment card market in collaboration with Fintech SoFi and Mastercard, to the innovative and potentially disruptive, such as Facebook’s ongoing proposals for a digital currency.
Such innovation is hardly surprising: processing information, developing insights, and matching people with complementary demands are at the heart of the financial sector. They are also at the heart of the business models of Big Tech firms in the so-called “platform economy”. In the future, we can expect more, and larger-scale, entry into financial services markets by platforms such as the Big Tech players.
Big Tech firms are perhaps uniquely placed both to spot and to implement new financial sector opportunities because they often have knowledge of the life-events, big and small, that trigger financial services demands. They also benefit from substantial network and scale effects, which they obtain through hosting and supplying services on their platform that are valued by consumers and promote interconnectivity with other consumers.
Consumers can benefit hugely from the convenience of obtaining multiple financial services on a single platform. But this convenience may also reduce the incentives to shop around for more competitively priced or appropriate financial services. This could result in a wide range of regulatory challenges, many of them familiar in themselves, but applied to very new contexts – ones which the FCA Handbook never previously envisaged.
New ways of buying may need new ways of regulating
The platform economy today is often characterised by “competition for attention” in which the aim is to attract and retain consumers on a platform.
Many such platforms provide consumers with free access to services in return for being exposed to targeted advertising or listings. Platforms earn profits on the back of the products sold through them. No one would walk into a high-street bank branch for entertainment, and yet in the platform world, our entertainment desires are satisfied in the same place, often at the same time, as offers are made of financial services.
Yet many financial products and services are complex, some sold as add-ons to other products, and it can be difficult for the consumer to judge whether a particular product is suitable – especially since many financial products are purchased infrequently. There is little room for learning and consumer mistakes can be costly.
It is these fundamental characteristics of financial products which provide the underlying logic for a great deal of the FCA’s analogue-era rule book. The new platform economy may change how we buy and choose but it does not alter the common difficulties of being a consumer of financial products.
But this does not mean the FCA’s Handbook is well set up for regulating a platform economy. We will need to ask whether existing FCA rules on how products are sold are appropriate for the sale of financial products and services by Big Tech platforms (for example, with so many financial products sold as add-ons to other purchases).
We will also need to ensure that some Big Tech firms do not seek to circumvent regulation through "regulation shopping" – setting up in jurisdictions with lower regulatory hurdles – or designing products outside of the FCA’s perimeter.
Disrupting the disruptors
None of this is to deny the enormous benefit and greater competition that might come from platform-enabled sales. The FCA may indeed have a role in easing the way for Big Tech companies to enter financial services markets by ensuring our regulations do not pose disproportionate barriers. For example, Know Your Customer regulations might need to be re-examined in the light of technological advances. The regulator may even be able to use its convening and coordination powers to help set common standards, as has already started under Open Banking for the exchange of consumers’ transaction data.
However, the FCA needs to look beyond Big Tech, because today’s disruptors might quite easily become tomorrow’s dominant incumbents. As such, they may create barriers to firms that would disrupt the disruptors – the next generation seeking to compete with Big Tech platforms.
Three barriers in particular will need careful consideration in future.
First, access to data. Many Big Tech firms have vast, real-time treasure-troves of information on consumer preferences and behaviour.
But lack of access to this data can often be a barrier to entry for firms looking to compete, either directly with the services offered by the platform operator or indirectly through innovating with complementary financial and other services.
Open access to data, greater personal data mobility and open standards for systems could all potentially play a role in reducing barriers to entry and promoting innovation. (There are of course challenges because of the importance and value of privacy. This is a vital aspect of this debate and we will consider it in greater depth in our next article).
Second, network effects. The value of the platform often arises in large part due to the presence of others on the platform – friends on a social network, service providers and customers on a search engine, etc.
Network effects often benefit consumers directly – it is exactly why networks arise in the first place. But they may also harm consumers in the long term, because effective network-on-network competition is rare. Networks have high fixed costs of creation, and consumers will often find “multi-homing” – having a presence on substitute networks – an unnecessary burden. Both of these factors make network competition typically weak or even entirely absent.
Open standards and architectures can be a solution to network monopolies, as they have long been in telecoms, for example. With Big Tech platforms, the FCA might find value in mandating standards for data sharing, as it has started to do with Open Banking. Moreover, it could extend the "sandbox" concept, requiring established platforms to provide innovators with an environment in which they can test new ideas safely.
Third, creating competitive imbalance with other products. A Big Tech platform might seek to exploit customer inertia by creating a retail environment in which they make their own products more salient or more convenient to access than independent products on the platform – a technique known as "self-preferencing".
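The self-preferencing described above can be illustrated with a minimal sketch. Everything here is hypothetical – the product data, the ranking function and the size of the boost are invented for illustration, not drawn from any real platform:

```python
# Hypothetical illustration of "self-preferencing": a platform ranks
# listings mostly on relevance to the consumer, but quietly boosts its
# own products. All names and numbers are invented for this sketch.

products = [
    {"name": "IndependentCard", "relevance": 0.9, "platform_owned": False},
    {"name": "PlatformCard",    "relevance": 0.7, "platform_owned": True},
]

def ranking_score(product: dict) -> float:
    """Score a listing on relevance, plus a hidden boost for own-brand products."""
    boost = 0.3 if product["platform_owned"] else 0.0
    return product["relevance"] + boost

listing = sorted(products, key=ranking_score, reverse=True)
# The platform's own product tops the list despite being less relevant.
print([p["name"] for p in listing])  # ['PlatformCard', 'IndependentCard']
```

The consumer sees only the final ordering, which is why this behaviour is hard to detect from the outside: the more relevant independent product is displaced without any visible change to the interface.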
The FCA has developed sophisticated rules about selling, informing and advising, based originally on analogue contexts – these will need to be repurposed for an environment where the position on a list, or the way consumer reviews are represented, may start to blur the line between sales and advice.
New forms of market dominance in financial services
But it isn’t just these apparently familiar types of market power that might emerge. The digital age also creates entirely new forms of potential anti-competitive behaviour, two of which merit special mention.
First, some technology firms have been accused of “killer mergers” – of buying early-stage companies that might develop into strong competitors.
Second, algorithmic pricing and product selection may come to pit one AI against another, with the possibility that they will discover mutually beneficial anti-competitive strategies. And if a machine learner breaks competition law, who is responsible?
These cases demonstrate that the FCA’s objective of ensuring effective competition may face new challenges, and therefore require new ways of working and of cooperating with other agencies.
A need for joined-up solutions
The regulatory response to platform market power is being actively considered in many parts of the economy, and the debates and conclusions will have relevance to a regulatory response in the financial sector.
One of the key recommendations of the 2019 Furman Review was that digital platforms deemed to have "strategic market status" should be subject to additional regulation, including a code of conduct.
Part of that code would ensure that firms selling products and services that competed directly with the platforms’ own services should be given access to the platform on fair and non-discriminatory terms. In addition, the report recommended the promotion of systems based on open standards.
An extreme version of an open-standards based approach would be to enforce vertical separation of the platform from the sale of complementary services – splitting the digital distributor from the financial product provider. So, for example, perhaps Facebook could offer users competing digital currencies and wallets, but not any proprietary ones.
The regulation of Big Tech platforms is, of course, a global and cross-sector issue, and the FCA will need to work closely with other economic regulators across the UK and the world to develop appropriate solutions.
Data as the “new oil”?
The rise of Big Tech has been strongly facilitated by the way these firms have been able to use and understand their customers’ data. In the digital economy data plays a number of vital roles – it is at once a commodity for sale, a corporate asset, and a source of potential market power. Some have called data the “new oil”. Others have likened our data trails to radioactive waste.
But above all, data is often personal.
That brings us to the question of how data is used, how that use may affect the design of financial products, and how it may transform the relationship between the financial service provider and the consumer.
Data use can bring benefits but poses risks
Firms are increasingly gaining access to a wealth of data on the preferences and behaviours of individual consumers. This may be a boon, but it also poses risks.
Data on consumers originates from a diverse range of sources including browser searches, phones, GPS, payment systems and sensors. And during the Covid-19 pandemic, data on individuals’ movements has been vital in attempting to ‘track and trace’ the spread of the virus.
While much raw data is plentiful and increasingly commoditised, value is added by aggregating that data in unique ways that allow firms to offer innovative and bespoke products to individual consumers.
As well as increasing choice, the use of data can benefit consumers by increasing competition and market access. Access to personal data enables new business models to develop based on personalised prices.
But, in doing so, it can disadvantage others: for example, firms could use personal data to charge higher prices to consumers they identify as being less financially aware, or less able or willing to shop around for a better deal.
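A toy model makes the mechanism concrete. This is an illustrative sketch only – the "shop-around score", the base price and the markup schedule are all assumptions invented here, not any firm's actual pricing algorithm:

```python
# Toy model of personalised pricing based on an inferred propensity to
# shop around. All names and figures are hypothetical illustrations.

BASE_PRICE = 100.0  # assumed competitive price for the product

def personalised_price(shop_around_score: float) -> float:
    """Quote a price based on how likely the consumer is to compare deals.

    shop_around_score is an inferred propensity between 0 (never compares
    deals) and 1 (always compares deals). A firm exploiting inertia might
    mark prices up as the score falls.
    """
    markup = 0.25 * (1.0 - shop_around_score)  # up to 25% for inert consumers
    return round(BASE_PRICE * (1.0 + markup), 2)

# An engaged consumer is quoted close to the competitive price...
print(personalised_price(0.9))  # 102.5
# ...while a consumer inferred to be inert is quoted materially more.
print(personalised_price(0.1))  # 122.5
```

Both consumers see only their own quote, so neither has any way of knowing that the difference is driven by the data the firm holds about them.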
The use of personalised data by firms raises important issues relating to consumer privacy and to the potential of firms to exploit personal data to identify and target vulnerable consumers, or even more subtly, consumers at key moments of vulnerability.
Personalised pricing – a force for good or just unfair?
The increasingly detailed data profiles of individuals will allow products to be designed and priced to suit consumers’ characteristics. This personalised pricing can lead to better consumer outcomes, but it can also give rise to outcomes some would consider unfair.
The harm from this type of pricing and form of competition could be amplified into generalised distrust where consumers mistakenly believe, or are misled into believing, that the price they are charged is a fair one. This can lead consumers to fail to shop around and switch even when it is in their economic interest to do so. Additionally, consumers who discover that their price is personalised based on the data a firm holds may perceive that they have not been treated fairly, which can often lead to strong and lasting emotional responses of anger and grievance.
The traditional regulatory response to this has been to encourage more consumers to become aware and engaged through information disclosures. But these forms of intervention have not always worked, particularly with more inactive or vulnerable consumers. Regulators will therefore increasingly look at other regulatory tools to amend the nature of competition so that the penalty for consumers who do not shop around regularly is not too great. One example of this has been the recent and more radical intervention by the FCA in General Insurance markets. At the same time, of course, this approach would be likely to reduce the benefits to the most savvy and energetic consumers.
In short there may be a trade-off between the impact of personalised pricing on competition overall and its effect on different consumer groups.
This raises important questions of what exactly we mean by fairness in business models: is it simply a matter of overall economic benefit, or does the impact on specific groups and individuals weigh in the balance? And if so, which groups? The most vulnerable? The most vocal? The most visible? These are questions that the FCA will need to consider.
Respecting privacy in a world of data sharing
While consumers can benefit from data sharing – “track and trace” during Covid-19 provides a high-profile current example – the erosion of privacy is a potential form of harm. Its relevance in financial services is likely to grow.
Privacy may be valued in itself – sometimes, we simply do not want others to know something - or there may be hard consequences of privacy being breached. Views on privacy are highly context specific. Someone may be happy to share personal health data as part of the fight against the Covid-19 pandemic, but may not want that information to be used against them when purchasing insurance.
However, the interests of platforms and consumers may often conflict on data gathering and sharing. For example, consumers may share too much data because they lack awareness about how data is collected and used. Or they may share too little data because they do not trust firms sufficiently to use it in ways that benefit them.
Guarding privacy, or failing to, may have substantial impacts on business models in financial services, many of which depend for their bread and butter on the assessment of risk, and therefore the processing of information.
Insurance is one obvious case. However, increased personalised data is also allowing new business models to develop. For example, insurance firms are offering lower premiums to drivers who share data on how much or how safely they drive by using telematics installed in the vehicle. By reducing the frequency of bad outcomes, this development is an overall good to the economy.
However, it will raise insurance costs for the worst drivers who do not adapt their driving style.
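The telematics example above can be sketched as a simple premium adjustment. The variable names and the discount schedule are illustrative assumptions, not any insurer's real model:

```python
# Hypothetical sketch of telematics-based motor insurance pricing: the
# premium is adjusted around a base rate using a 0-1 safety score derived
# from in-vehicle telematics. All figures are invented for illustration.

def telematics_premium(base_premium: float, safe_driving_score: float) -> float:
    """Adjust a base premium using a 0-1 driving-safety score.

    Safer drivers (high score) earn a discount; the riskiest drivers
    (low score) pay a loading instead - mirroring how sharing driving
    data can cut premiums for some consumers while raising them for others.
    """
    adjustment = 0.3 * (safe_driving_score - 0.5)  # +/- 15% around the base
    return round(base_premium * (1.0 - adjustment), 2)

print(telematics_premium(500.0, 0.9))  # safe driver pays 440.0
print(telematics_premium(500.0, 0.2))  # risky driver pays 545.0
```

The distributional point in the text falls straight out of the arithmetic: the same data-sharing scheme that rewards the safe driver necessarily loads the premium of the driver who does not adapt.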
If sharing data can be both a benefit and a risk to consumers, what might be the appropriate response from regulation? This question will need to be addressed in many areas of society, not just financial services. A whole new social contract around data consent and use will need to develop.
This is a live issue as we seek to use health data to understand and control the spread of Covid-19. But, because data privacy issues are so context specific, the social contract may look very different in the financial sector. Privacy and data use are under the regulatory purview of the ICO, and the FCA will need to work closely with it to develop principles and rules that are appropriate to the financial sector.
Ensuring data is used for good and not for mis-selling
The availability of previously private data is transforming the sales process of financial products, again raising old risks in new forms.
Firms have always sought to increase demand for their products by using techniques of persuasion. These can range from something as straightforward as listing a particular investment fund on the first page of a SIPP pension platform provider, to the most pernicious high-pressure sales techniques and scams.
The FCA has well-established tools to protect consumers from firms using psychological techniques to make purchases that they later regret. These include cooling-off periods, information disclosures, requirements on firms to treat consumers fairly, and fines for breaching mis-selling rules.
However, the personalisation of data adds another dimension to this problem as it may give firms the ability to identify those consumers most vulnerable to being sold an unsuitable product. It also opens up the possibility that firms could use data trails – for example seeking out social media posts - to identify when an individual consumer is at their most vulnerable.
This raises two important considerations as to whether our current approach to mis-selling fully captures the harm associated with these types of techniques.
First, the FCA will need to develop ways of assessing contextual vulnerability and understanding the algorithms firms use. Would it be acceptable, for example, if a lead-generation algorithm for high-cost loans scanned for evidence of online purchases of certain products?
Second, are there wider social harms beyond the easily quantified ones of financial loss? For example, critics of “Surveillance Capitalism”, such as Professor Shoshana Zuboff, argue that these new sales techniques undermine our autonomy as human beings.
Regulators may need to factor such wider social effects into decisions if harm comes to be widely seen as being more fundamental than financial loss. This would not be straightforward, as many of the benefits associated with the sharing of data in the digital economy only occur because consumers are prepared to cede control over their data and to give up a portion of their autonomy.
A digital-age regulator
Few developments have changed consumers’ lives as significantly or rapidly as the rise of Big Tech and the new emphasis on the value of data. The pace of change is showing little sign of slowing and is likely to have far-reaching consequences.
The regulatory challenges are likely to be manifold and in many respects we are still in the early days of the digital era. Regulators will need to be as agile in the tools they deploy to promote desired outcomes as the disruptors are in supplying their innovations. This will, of course, require transformation, innovation, and the leveraging of new technologies in the task of regulation itself: producing regulation for the digital age also means being a digital-age regulator.
Next, in Future Market Dynamics – Part 3, we will continue our examination of data and consider its role as a public and private good. We will also turn to other dynamics affecting consumers – the risks and opportunities they face and the issues of intermediation and trust.
Subscribe to Insight below and you will receive an alert as soon as we publish Part 3. Or follow us on Twitter @fcainsight.