Removing our biases: some behavioural tools for the workplace

23 October 2019

Some of the behavioural biases we are susceptible to are amplified and altered when we work as part of an organisation, which can lead to costly mistakes. Behavioural science can help us anticipate our mistakes and design them out of our working lives.

Think about how many words start with the letter k compared to the number of words with k as the third letter. Which is more common?

If you’re like most people, you will have gone for k as a first letter. But you’d be wrong. There are about twice as many words with k as the third letter. We find it much easier to bring to mind words with k at the beginning (keep, king, kind) than as the third letter (bake, take, ankle), because they are more readily available in our memory.

This is known as the availability heuristic – our tendency to treat whatever comes to mind most easily as more common, or more likely to be true, than it really is.

Over-optimism is another pervasive bias. Around 80% of drivers consider themselves to be above average for driving safety – a statistical impossibility.

You might think that working as part of a team or organisation and exposure to a diverse range of views and experiences could help rid us of biases like these. For example, what comes to mind most easily for me may differ from what comes to mind for others in my team, and other people may be less optimistic about my success than I am.

But quite often, the opposite happens – groups can amplify these errors, resulting in poor decisions and costly mistakes.

‘Groupthink’ is a widely recognised problem, whereby various underlying behavioural biases and social pressures lead us to follow the views and beliefs of others.

The Asch experiment famously demonstrated the power of the group. A group of individuals were each asked to make a simple assessment: which of several lines on a piece of paper was the longest? But only one of the individuals was a real subject; all the others were actors, who deliberately gave the same, obviously wrong, answer. Over repeated tests, a high proportion of the real subjects repeated the wrong answer given by everyone else. The simple but vital lesson was that people often prefer to conform to a group, even when that group is very clearly wrong.

Effective organisational decision-making is not about agreement. It requires employees to challenge one another, share views even if they are contrary to the group’s, and to point out risks and potential errors. It also requires us to be objective when planning and making strategic decisions – something which organisational psychologists have shown we are particularly bad at.

The FCA’s own Capability Framework for employees and its Leadership Programme ‘At Our Best’, run in partnership with the Oxford Saïd Business School, address many of these issues. The FCA also runs a programme for its managers and technical specialists, delivered by training group The Mind Gym, which places a strong emphasis on using behavioural science to address personal biases. Additionally, the FCA’s own Behavioural Economics and Design Unit works with teams across the FCA to implement behaviourally inspired interventions that improve organisational decision-making.

Previous Insight articles have explained that diverse teams don’t necessarily lead to diverse voices and a recent article from the Banking Standards Board (BSB) gave ideas for encouraging a speak-up culture, which would reduce the risk of groupthink. The FCA has also held a number of ‘culture sprints’, where our culture team brought together financial services firms, behavioural scientists and practitioners, to develop practical ways to deal with some important behavioural challenges, such as how to develop a ‘speak up, listen up’ culture.

Below, we have pulled together some techniques inspired by behavioural science to unlock our diverse points of view and plan more effectively. They do not require central intervention to work – any of us can use them to engender change within our own team.

Challenge and be challenged

When we foster a culture of challenge, we reduce the risk of groupthink and improve the quality of our decisions. But how can we encourage more challenge in our teams?

Chair, don’t anchor

One mechanism that causes groupthink is the tendency for leaders, or the most outspoken people in the room, to state their opinion first, causing others to ‘anchor’ to their view. Thus, the first view voiced overly influences the conversation by suggesting what makes an ‘acceptable’ point.

The overall effect is a reduction in the range of views expressed. If you’re chairing a meeting, hold back your opinions until everyone else has given theirs, and encourage outspoken people to do the same. Try a ‘speaking pen’ – group members can only speak while holding the pen – or give each person five straws, one of which they must surrender each time they speak. These tools give clear visual cues about who may be dominating and who may be holding back, and so provide informal prompts for who can speak and when.

If you want to be sure you’re getting people’s unfettered views, you can also poll them privately or anonymously. Anonymity removes the social pressure to censor views. This can be particularly useful for brainstorming, where you want to generate as many ideas as possible.

Challenge by design

It’s difficult to elicit genuine challenge to our ideas or methods, but some mechanisms can counteract people’s tendency to hold back their critiques in the workplace. A key technique is not merely to invite challenge, but to require it, by insisting that people come up with negatives.

One such technique is the pre-mortem. In a pre-mortem, the team is asked to assume that a project has already failed and to come up with possible reasons why. This turns the typical critiquing session on its head and can help overcome the usual workplace tendencies towards groupthink and overconfidence. By assuming failure and putting the focus on ‘discovering’ the reasons, it gives everyone licence to challenge, even the more cautious sceptics who might otherwise not speak up. It can also bring to light a wider range of potential issues: by assuming that the project has failed, we introduce a sense of certainty, which makes it easier to generate concrete explanations. The idea goes back a long way. Research in 1989 showed that people come up with more potential explanations for an outcome when told it has definitely occurred than when asked to imagine that it merely might occur.

Microsoft employs a team of hackers whose role is to spend all day attacking Windows to find faults before the real attackers do. This is red-teaming, an approach often used in the security and IT fields to assess vulnerabilities of plans or systems. Organisations increasingly use red-teaming in less extreme settings as a valuable tool for generating challenge.

The FCA has used red-teaming to tackle particularly complex pieces of analysis or policy, creating a team whose sole purpose was to challenge the project.

All you need to create a red team is two or three trusted, competent colleagues who are independent from your work but knowledgeable about your approach. They are given the task of peer reviewing your project and constructing the strongest possible case against it. Again, this not only gives the team licence to be openly critical – it positively incentivises them to come up with imaginative and wide-ranging critiques.

Plan around your biases

Strategic decisions – agreeing to a proposal, budgeting for a project, knowing when to halt one – are difficult, and made more so by our own behavioural biases. The mechanisms below offer a more objective way of approaching them.

Forced objectivity

We tend to drastically underestimate the time a project will take and how much it will cost – from writing a paper to delivering an infrastructure project. The initial budget set for the construction of London’s Olympic Stadium was £280 million. By the time the Olympics were underway, £429 million had been spent. The final budget, including turning it into a suitable space for football and athletics, was over £700 million.

Large-scale projects often fall victim to overruns, commonly due to the behavioural phenomenon known as the planning fallacy. Overconfidence can lead us to believe that our forecasts are more accurate and precise than they really are. And, as with the words beginning with k, the availability heuristic can lead us to base planning decisions on what springs most easily to mind rather than on the most appropriate comparison. Overconfidence in particular is often exacerbated in the workplace, where negativity and caution will not get you noticed, but having an ambitious plan or positive approach will.

It is possible to introduce forced objectivity. ‘Reference class forecasting’ – basing estimates on the actual outcomes of a class of similar past projects, rather than on our own bottom-up projections – leads to much more accurate estimates of project costs and times, and is particularly suitable for large, strategically important projects.
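As a minimal sketch of the idea (the function name and all figures below are hypothetical illustrations, not FCA methodology), a reference class forecast replaces our own estimate with one anchored in the distribution of past outcomes:

```python
# Minimal sketch of reference class forecasting. All names and figures
# are hypothetical illustrations.

def reference_class_forecast(own_estimate, past_overrun_ratios, percentile=0.8):
    """Adjust a bottom-up estimate using the distribution of
    actual/estimated cost ratios from similar past projects, so the
    forecast would have covered `percentile` of the reference class."""
    ratios = sorted(past_overrun_ratios)
    # Nearest-rank percentile of the observed overrun ratios.
    idx = min(len(ratios) - 1, int(percentile * len(ratios)))
    return own_estimate * ratios[idx]

# Hypothetical reference class: comparable projects cost 1.0x-1.9x
# their original estimates.
past = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.9]
forecast = reference_class_forecast(280, past)  # uplift a 280m estimate
```

The chosen percentile reflects risk appetite: a higher percentile buys more protection against overrun at the cost of a larger headline budget.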

We are also prone to escalating our commitment to a failing project because of the costs already incurred in terms of effort, time and money. This is known as the sunk cost fallacy, and it tends to become more pronounced as more time passes and the investments made are larger.

‘Implementation intentions’ are pre-commitments to take a given course of action if a certain outcome occurs. They have been found to be particularly effective in encouraging people to follow an intended course of action. On a personal level, they can be used to prevent day-to-day distractions. For example, if we set the rule, 'whenever I find myself looking at Facebook, I’ll turn my phone off immediately', we commit ourselves in advance to behaviours we want to follow.

Implementation intentions can be even more effective if we publicly commit to taking the chosen course of action. Studies have shown that teams can use implementation intentions to reduce further investment in a failing project – and so avoid succumbing to the sunk cost fallacy – by setting rules such as, “we will review the project goals if the costs increase above our original budgeted amount.”
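The point of such a rule is that the trigger is decided in advance, before any money is sunk. A hypothetical sketch (the names and figures are illustrative, not from the source) of that rule written down as an explicit if-then check:

```python
# Hypothetical sketch: a team's implementation intention encoded as an
# explicit if-then rule, checked whenever project figures are updated.

BUDGETED_COST = 280  # original budgeted amount (illustrative figure)

def review_triggered(actual_cost, budgeted_cost=BUDGETED_COST):
    """Pre-committed rule: 'we will review the project goals if the
    costs increase above our original budgeted amount.'"""
    return actual_cost > budgeted_cost

# The review fires automatically once costs pass the threshold, rather
# than relying on someone to argue for it after money is already sunk.
review_triggered(429)  # True: costs now exceed the original budget
```

Making the trigger mechanical removes the discretion that the sunk cost fallacy would otherwise exploit.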

Small changes we can all try now

Behavioural insights are typically used to influence the behaviour of individuals, by re-designing processes or policies in ways that help people make better decisions.

Organisations and the teams within them are susceptible to many of the same behavioural biases that we see in individuals and some distinct biases also result from the interaction between individuals, such as the tendency for ‘groupthink’.

Organisations are increasingly recognising the effects of these group-level behavioural biases on their work. There is little value in having a diverse workforce if employees aren’t speaking up in a way that introduces diverse views, and there are real risks to organisational reputation and effectiveness if new ideas go unchallenged.

We have suggested just a few practical tools for tackling the most pernicious organisational behavioural biases. These needn’t come from the top of the organisation, but can be put into practice by any of us.

In fact, we challenge you.
