“The single greatest impediment to error prevention is that we punish people for making mistakes” Dr Lucian Leape – 12 October 1997
“People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems will continue.” Don Norman, author of The Design of Everyday Things
I believe that there are things we should all agree on:
- The best people can make the worst mistakes
- Systems will never be perfect
- Humans will never be perfect
The current culture in healthcare is one in which human error coupled with harm to a patient usually results in social condemnation and disciplinary action. Yet it is well established that disciplining employees in response to honest mistakes does little to improve the overall safety of the system, and few people are willing to come forward and admit an error when they face the full force of a punitive system.
So what is a Just Culture? It is a culture that is open, fair and focused on learning; combined with the design of safe systems and the management of behavioural choices, it creates an effective safety culture.
One of the myths about a Just Culture is that it is blame free. It is not a blame-free system in which any conduct can be reported with impunity. There need be no loss of accountability – accountability is simply different: it requires an employee to raise their hand in the interests of safety. Failing to report your error, and thereby preventing the system from learning, is the greatest problem of all, and some actions do warrant disciplinary or enforcement action. The key question is: where do you draw the disciplinary line? To answer that, we all need to understand the differences between human error, risky behaviour and recklessness.
The big three:
- Human error: inadvertent action; inadvertently doing other than what should have been done; a slip, lapse or mistake
- Risky behaviour: choices that increase risk, where risk is not recognised or is mistakenly believed to be justified – includes violations and negligence
- Reckless behaviour: a behavioural choice; intentional acts; conscious disregard of a substantial and unjustifiable risk
Human error is when an individual should have done something other than what they did and, in the course of that action, inadvertently caused or could have caused an undesirable outcome: picking up the wrong keys, forgetting your ID, miscalculating a medication dose, missing a turnoff from the motorway, picking up strawberry yoghurt instead of raspberry. We make errors every day, generally with minimal consequences. In healthcare we make similar types of errors, but with the potential for dire consequences, and for a number of reasons we are expected not to make them. We are expected to be somehow ‘above’ human fallibility. We need to understand that individuals do not intend the mistake, the error or the undesirable outcome, even when the consequences are potentially life threatening.

I often get asked about the individual who makes repeated errors. The individual may be in a job, or performing a specific task, that is very prone to error: drug labels and equipment layouts that lack standardisation and are poorly designed will lead individuals to make repetitive errors. The source may also lie with the individual, who is stressed, distracted or unfocused and so more prone to error; in fact, those who have erred are more likely to err again because of the stress caused by the first error. In these cases it may be appropriate to remove the individual from the current task, but this must not be seen as a punishment.
Response required – console, support, learn, and be kind
Risky behaviour, or a violation, is when individuals or groups act in a way that departs from the expected rules [procedures, policies, standard operating procedures, guidelines, standards] which require or prohibit a set of behaviours. There are unintentional rule violations and intentional rule violations. Unintentional violations usually mean the individual was not aware of the rule or did not understand it. Intentional violations occur when an individual chooses to knowingly breach a rule while performing a task; this does not necessarily mean they were risk taking – the violation may be situational, circumstantial or patient centred. Are all intentional violations bad? There will always be circumstances where the rule does not fit the situation. A healthcare provider may feel it necessary to violate a policy to save a patient: a cardiac arrest in the car park may mean that some infection control rules are not followed. We should judge individuals based on whether they knew the risks they were taking increased the potential for harm. Normalised behaviour is when intentional violations of rules and procedures occur every day. This behaviour develops over time, often without the workforce’s knowledge. Much can be learned by understanding why certain violations become the norm.
Response required – coach, learn, train where needed
Recklessness is a crime, demonstrating in law greater intent than mere negligence. For example, imagine you are driving and you see a car ahead both speeding and weaving in and out of lanes. The driver is violating traffic rules and taking a risk which could cause an accident, and it is highly likely that they know the risk they are taking. The recent air accident in the French Alps is undoubtedly an example of recklessness.
Response required – discipline and sanction
There are ways in which we can support a Just Culture: create a resilient system that continually revises its approach to work in an effort to prevent or minimise failure; be constantly aware of the potential for failure; and help people make decisions and direct (limited) resources to minimise the risk of harm, knowing the system is compromised because it includes sometimes faulty equipment, imperfect processes and fallible human beings. The factors that help system safety are:
- Use human factors and design for safety
- Create barriers to prevent failure
- Enable recovery to capture failures before they become critical
- Limit the effects of failure
Factors that help people be safer:
- Information – the right information at the right time to the right people
- Equipment/Tools – standardise equipment and tools as far as possible to minimise user error, especially for transient and new staff
- Design/Configuration – design out the chances of error
- Job/Task – be very clear with instructions for jobs and tasks, and don’t take on jobs or tasks you don’t know how to do
- Qualifications/Skills – the right qualifications and skills for the relevant tasks
- Perception of Risk – understanding your own attitudes to risk
- Individual Factors – understanding things like stress, mood, distraction and interruptions
- Environment/Facilities – designing the environment to maximise safety
- Communication – the right ways to communicate so that you are heard and understood e.g. at handover, with patients, with colleagues, in an emergency
The Just Culture is also about ensuring that you have a learning culture. When things go wrong and you want to learn from errors and incidents, understanding the differences between human error, risky behaviour and reckless behaviour is vital, as is understanding the natural biases we are all subject to, in particular outcome bias and hindsight bias.

Outcome bias is when the same “behaviour produce[s] more ethical condemnation when it happen[s] to produce bad rather than good outcome, even if the outcome is determined by chance.” For example, if a healthcare professional makes an error that causes no harm, we consider them lucky; if another person makes the same error and a patient is injured, we consider them blameworthy and disciplinary action may follow. The more severe the outcome, the more blameworthy the person becomes. This is a flawed system, based on the notion that we can totally control our outcomes. Interestingly, outcome bias has influenced our legal system: a drunk driver suffers far greater consequences for killing someone than for merely damaging property. The driver’s intent is the same and only the outcome differs, yet society has shaped the legal system around the severity of the outcome. What is worrying here is that the reckless individual who injures no one sometimes receives a less punitive sanction than the merely erring individual who caused an injury.

Hindsight bias – ‘why did you do it like that?’, ‘I would never have done that’, the ‘knew-it-all-along’ effect – arises after an event has occurred and sees the event as having been predictable, despite there having been little or no objective basis for predicting it. It can cause memory distortion, where the recollection and reconstruction of what happened lead to false conclusions. So always remember that when we look back at why things went wrong we may never know the full truth; our view will be shaped by our own biases, perspectives and knowledge, so we may only ever see one aspect of it.
Three things you can do tomorrow:
1. Review your current disciplinary policy
Ensure that when incidents are investigated there is an understanding of human factors and the Just Culture. Your key aim is to ensure that learning from events outweighs the deterrent effect of punishment, and that your staff feel able to speak out, raise concerns and report incidents.
2. Conduct a culture survey of your staff
Ask them questions that elicit their views on whether they feel they can speak out and raise concerns, and whether they believe they will be consoled rather than punished if they make a mistake. Feed the information back on a unit-by-unit basis, not just for the whole organisation.
3. Review your incident reporting system
If your incident reports are mainly about:
- problems with processes and equipment – you have a low reporting culture – these are easy to report without backlash on individuals
- individuals reporting on other individuals – you have a low reporting culture – it is easy to point the finger at others
- individuals reporting their own mistakes – you have a good reporting culture – individuals will act against their own self-interest and report so that others can learn
- individuals reporting their own violations – you have an outstanding reporting culture – staff trust that violations will be treated as something to learn from, not automatically as matters for discipline
In summary
- Human error is unintentional, accidental and unplanned; there is no intent to err – therefore there should be no discipline, just consolation
- At-risk behaviour is usually where an employee had no reason to know that he or she was creating a risk – there should be no discipline, just coaching
- If the employee knew of and consciously disregarded the risk they were taking, this is reckless and intentional behaviour – they must be disciplined and may even be considered to have acted criminally
References
David Marx, Whack a Mole (book)
Sidney Dekker, Just Culture: Balancing Safety and Accountability (book)
Eric Hollnagel, David Woods and Nancy Leveson, Resilience Engineering: Concepts and Precepts (book)