I came across this video of the wonderful Tim Minchin, a great storyteller, who described confirmation bias in a way that really resonated with me [side note: is this calling out my own confirmation bias?]. You can find it here: https://www.youtube.com/watch?v=yoEezZD71sc
Confirmation bias is a conundrum for those of us who work in safety. It is one of a long list of biases we have as humans: the tendency to search for clues or information that confirm or support our beliefs, our values and what we think happened. It can affect all aspects of healthcare, from clinical decision making to diagnosis. Over the centuries it has shaped the choice of treatments. If a patient recovers when given a particular treatment, we conclude that the treatment must have been successful, rather than looking for alternative explanations, such as the disease simply having run its natural course.
In safety it shapes our view of cause and effect, our view of risks past and future, and our view of what actually happened in any accident or incident. And as Minchin says, these beliefs can be so entrenched that it takes a huge amount to shift us away from them. You know a belief is entrenched when it persists even after evidence has been presented that shows it to be false, and we keep looking for other evidence that will confirm what we already believe.
Confirmation bias can cause systematic errors in our understanding of why things did not go as planned, or might not go as planned in the future. We may ignore contributory and causal factors while we focus on the ones we want to see. We may prematurely decide we have found the problem or cause and simply stop looking. I like the fact that some people refer to it as 'my side bias': the 'in my opinion' view, often used to dismiss others. Even though you might have an opinion, my opinion is the right one, so I am not going to listen to or give credence to your views.
Confirmation bias is there when we investigate why something happened or didn't happen. We tend to search for information that supports our view and to use our prior knowledge to back it up. A simple example is trying to understand why someone falls. If in the past we have found that people who fall do so because they were dizzy or light-headed, then every time someone falls we look for evidence that they were dizzy or light-headed, and this confirms our theory. This can also be described as anchoring bias, where we give weight and reliance to our initial understanding and then don't move from it despite new information: we have jumped to a conclusion and got stuck there.
Other linked biases include:
*availability bias, where we consider a diagnosis because it is familiar, common, recent or memorable
*diagnostic momentum, where once we have diagnosed the problem we run with it, reducing our ability to consider other alternatives and in fact ceasing to search for any other findings or signals
*framing bias, where we are influenced by how the information is presented, who presents it and whether we trust it
What can we do about this?
We can:
*recognise that this could happen
*actively search for information that disproves our beliefs or views
*look at the opposite side of the argument
*help create the conditions that facilitate our ability to think
*improve the information we collect
*promote diverse conversations and inter- and intra-professional collaboration to verify assumptions, interpretations and conclusions
*build a psychologically safe environment that also fosters a restorative just and learning culture