In 2005, Elaine Bromiley was admitted to hospital for a routine sinus operation. During the operation, medics had unexpected problems trying to insert a laryngeal mask airway.
This led to a ‘can’t intubate, can’t ventilate’ emergency - a situation that requires rapid action to restore oxygenation. Despite this, the anaesthetist and surgeon, both experienced in their field and capable of performing the required tracheotomy, continued to try to intubate the patient. But Elaine Bromiley never woke up. Her husband, Martin Bromiley, blames her death on ‘a breakdown of leadership, of awareness, of prioritisations, of decision-making, of communication and of assertiveness.’ An independent report found theatre staff had brought the necessary equipment to perform a tracheotomy and, when interviewed, were surprised it was not done.
Using compelling evidence from his experience in managing human factors in aviation disasters, Martin Bromiley has since dedicated his career to improving patient safety. One major element of his manifesto is to develop cultures that systematically support people to exchange views, listen and respond to the concerns of others.
The importance of culture
There is growing evidence in medicine, and particularly in surgery, that a culture of blame, fear and hierarchy causes harm not just to patients but also to staff. Everyone in the operating room, regardless of seniority, needs to feel confident to speak up to avert life threatening situations. And the fallibility of everyone, even the most experienced experts, means that we need to learn from minor mistakes - which sometimes have major consequences.
We see the same errors, time and time again, in other industries. Martin Bromiley’s words echo disturbingly across numerous crises blamed on cultural failings. ‘There is a blind spot among senior leaders at Wells Fargo, as well as deterrents to speaking up among the rank and file.’ ‘The failure to speak up about safety and other problems is not unique to car companies,’ according to Amy Edmondson, speaking about the General Motors ignition defect crisis. And according to the inquiry into Mid-Staffs, there was a ‘failure of different teams within the hospital, as well as in the wider community, to communicate and share their concerns.’
In all cases, there were people who knew what was happening, but who didn’t speak up.
Speaking up creates the most effective teams
It is now well understood that, more than an individual’s personality and beliefs, context - including organisational culture - most affects how likely employees are to remain ethical and to speak up when they see wrongdoing.
Harvard academic Amy Edmondson characterises the willingness of people to express an opinion in the workplace as ‘psychological safety’. This describes an environment where senior leaders are committed to learning from mistakes, and where no one is afraid to raise concerns or fears negative consequences to their ego or career if they do. Put another way, people feel they can take an ‘interpersonal risk’ - raising ideas, challenges or issues without being embarrassed, rejected or punished.
Speaking up does not come naturally. Most people don’t want to harm their image, especially at work and in front of the people who evaluate them. ‘Impression management’ of this kind is personally rewarding for two reasons. First, on a practical level - promotions and other rewards often depend on what your boss thinks of you. Second, on a psychological level - we have a deep preference for other people’s approval. One way of managing interpersonal risk is to avoid speaking up, or to focus narrowly on following protocol. Environments with low psychological safety exacerbate our fundamental desire for approval and make cultural failures more likely.
Conversely, in environments with high psychological safety, it is harder for unethical behaviour to gain a foothold. This matters not only morally: research shows that firms with high psychological safety have higher-performing teams, more engaged workforces, happier employees and more business success.
For example, the ever-data-driven Google set out to discover the secret of effective teams. Code-named Project Aristotle - a nod to Aristotle’s observation that ‘the whole is greater than the sum of its parts’ - the project determined measures of team effectiveness using qualitative and quantitative data.
The project used more than 35 statistical models on hundreds of variables. It found that effectiveness is less about who is on the team (factors like personality traits, emotional intelligence, IQ) and more about how the team works together. Psychological safety was found to be the most important factor - echoing what academics have known since the 1960s.
Accept mistakes and learn from them
In medicine, one solution to a culture of poor patient safety is the use of checklists. Pioneered by Peter Pronovost and developed and popularised by Atul Gawande in his book The Checklist Manifesto, checklists provide simple reminders for things that everyone knows they should do, but which in practice are often missed. For example, a checklist can make sure that everyone in the operating theatre has introduced themselves.
Pronovost and Gawande’s research shows a checklist improves communication and makes it more likely staff will raise concerns. As they are usually responsible for administering them, checklists also empower nurses to remind doctors when something is missed or to speak up when procedure is not being followed. And their impact can be huge. Eighteen months after several US hospitals introduced checklists, doctors saved an estimated $175m and more than 1,500 lives.
Checklists can be very effective at drawing attention to small tasks - like handwashing or confirming consumers’ wishes. These are the type of behaviours that, unchecked, can lead to big impacts such as patient infection or systemic mis-selling. However, most scholars agree that they need to be supplemented by personal accountability and moral responsibility to avoid a box-ticking culture.
Embedding a culture of learning and openness can start small, such as encouraging teams to learn from day-to-day mishaps. Ikea founder Ingvar Kamprad was notoriously contemptuous of ‘mediocre people’ who spend their time proving that they are not wrong. He described making mistakes as ‘the privilege of the active’.
Creating healthy practices of confronting uncomfortable truths makes it more likely that people will speak up. It can be easier (and better) to flag small risks or concerns early than to deal with the consequences later.
Psychological safety in financial firms
Are financial services firms less psychologically safe than firms in other markets? In short, we don’t know as the data doesn’t exist. But there are four potential barriers to creating psychological safety – the first two of which are particularly relevant to financial services.
First, the risk of losing your hefty financial bonus (not to mention status and future employability) by speaking up is a potentially powerful deterrent.
Second, the cost of your silence is not always clear. Unlike with a patient lying on the table in front of you, the consequences of wrongdoing or error are often not apparent until much later. Findings from social psychology tell us that the more distant something is from us - whether in time, space or emotion - the more abstractly we think about it and the weaker our reactions to it.
Third, speaking out is not the norm and goes against the grain psychologically.
Finally, do organisations want to listen? Organisations need to be receptive to uncomfortable truths and willing to change from the bottom up, as well as the top down. As the Greek philosopher Epictetus said, we have two ears and one mouth so that we can listen twice as much as we speak.

Additional contribution by Business Psychologist, Nicole Brigandi.