It was an absolute honour and joy to deliver the James Reason Lecture for 2016 at the Patient Safety Congress on what was the 68th Birthday of the NHS and in the very city in which the first NHS patient was treated.
Over the last 25 years Jim has helped rewrite critical assumptions about human error and patient safety. He is an undisputed master and has a unique way of helping us understand what can and should be done. All of us in safety have trailed behind Jim and owe him a great debt of gratitude. So when I was asked to do this talk I could think of nothing other than reminding us all about Jim and his ideas, from Human Error through to A Life in Error – and I also thought I would revisit my roots. The talk covered the past, present and future; from good practice to today’s challenges and then to hope for the future. The following is the transcript of my speech.
I started the second phase of my career in clinical risk and patient safety in 1994. Believe me when I say I had absolutely no idea what I was doing. And Jim Reason came to the rescue. My bible was his groundbreaking book, Human Error.
In writing about his life in error, Jim says he has discovered two key principles:
- Learn as much as possible about the details of how people work
- And most importantly, never be judgemental
I learnt these things one July morning in 1997. One of our patients, a 12 year old boy called Richie William, had been wrongly injected into his spine with a chemotherapy drug called vincristine. My heart flipped at the sound of the urgency and fear in the voice that told me this. I wasn’t even sure if that was a problem – at that point I had never even heard of vincristine, let alone understood its impact. Five minutes later I was sitting in a room full of oncologists who explained to me that giving vincristine into the spine was usually fatal. They tried to explain the whole process of chemotherapy cycles, using complex scientific language that they understood completely and of which I understood only every tenth word. This is one of the problems with healthcare investigations: normal practice may be understood by just a few, and different people and different specialities can have a language all of their own.
So we did just as Jim says; we learnt as much as possible.
It soon became clear that the anaesthetist who had administered the drug knew nothing about the fact that Richie had been readmitted with these worrying signs. Even now I can feel the dread when we walked over to find him. I wanted to be anywhere else but there. I knew fundamentally that this was not the time to be judgemental; this was the time to be kind, to use every skill I had learnt as a nurse in breaking bad news. I shall never forget his face when it dawned on him what had happened.
This young junior doctor’s life, and the lives of many others in the hospital, were never going to be the same again. Richie’s life was taken away from him; he was irreplaceable, and his mum and family were devastated. ‘You have killed my child’, she screamed desperately.
So to do justice for Richie and for everyone involved we vowed to find out as much as possible about what had gone wrong and to learn the lessons for the future.
The first thing you can do is learn to listen
As an investigator you work through facts, you talk to vast numbers of people, you gather statements, you trawl through case notes and observation charts and you review protocols and guidelines; so
- Listen to what people are saying without judgement
- Listen to those that work there every day to find out what their lives are like
- Listen to help you piece the bits of the jigsaw together so that they start to resemble a picture of sorts
The second thing you can do is resist the pressure to find a simple explanation.
Too often investigations try to create a simple linear story rather than reflect the complex reality of multiple, interacting factors and events. We found out so much detail that I can even picture Richie’s day.
I can see him arriving early for his chemotherapy. I can picture him talking to the anaesthetist who was kneeling, gently explaining what would happen that day. I can almost smell the theatre room, glossy silver machines, shiny clean floors, and people in scrubs, hats, and masks. I can imagine Richie on the theatre trolley and the junior doctor sitting at his back with a tray of drugs by his side.
I can see all of this without ever having been there, because we created these pictures in our heads and in our words for others to see; we created them to help us understand why this happened. We also carried out a number of reconstructions, and we created an identical set of the drugs, their packaging and labelling as irrefutable evidence of what the junior doctor was faced with.
The third thing you can do – as Jim so rightly says – don’t be judgemental.
Try to move beyond your own natural bias. Richie was a young 12 year old boy who died in our care – as human beings we are intrinsically motivated to find someone to blame. We are also likely to immediately judge the actions on that day as careless or incompetent. We knew there would be people who would find it hard to comprehend a decision not to blame or punish the people involved. We knew that we ourselves would judge the past events as somehow more foreseeable than they were, and that what we now knew would colour our perceptions of how and why things occurred.
But these were not bad people
So we purposefully put these things to one side and looked through the holes in the system. Our internal investigation was undertaken at the same time as the police investigation and, not long after, two of the junior doctors – the anaesthetist and a doctor from the oncology and haematology team – were charged with manslaughter.
The result of our collective efforts was a report that we submitted to Richie’s mum, the police and the Department of Health.
It was never simply about one person’s actions
What we found was an insight into the very workings of a busy and complex children’s hospital. We learnt that healthcare is never neat or linear; it is possible to do tasks and activities in healthcare in multiple different orders and multiple ways. What we found was that, as Carl Macrae says, by noticing, understanding and learning about the small moments in Richie’s life that day we could piece together the reasons why he was eventually injected with the wrong drug via the wrong route.
The report could have been summed up in one paragraph – Richie died because of a combination of small changes to everyday events, last minute deviations from the normal routine, distractions, unclear communication, plus the poor design of packaging, mixed together with human error.
The report, we were told, eventually helped clear the two doctors of the manslaughter charge. In the words of the judge, the internal investigation described a chapter of accidents and misunderstandings; a catalogue of chance events and failings at the hospital, rather than gross negligence by the doctors.
What I do know is that we would never have conducted the type of incident investigation we did if it wasn’t for Jim.
Jim’s work helped us look at the world differently.
What has happened since?
During our investigation we also found out that exactly the same incident had happened across the globe, and by then at least 12 times in British hospitals. Clearly learning from one incident to another was virtually non-existent. This is an example of exactly what Sir Liam Donaldson meant when he wanted to create a national reporting and learning system. His vision was one of staff identifying a problem one day and sharing it right across the NHS the next – to enable system-wide learning. Instead we have incident reporting systems that are drowning. What we didn’t know at the time of our investigation was that there would be another incident three years later at Queen’s Medical Centre in Nottingham: the now well-known case of Wayne Jowett.
Was there anything that we could have done that would have prevented Wayne from dying too?
Since 2001 policies have been put in place to reduce the chances of this happening by ensuring that the two drugs are never given by the same person, in the same place, or even on the same day. We also know that there is a design change that can be made to the connections of both the intrathecal syringe and the intravenous cannula, so that one can never fit the other. This final change has been an enormous uphill task and credit is due to the persistence of a few, such as Brian Toft and David Cousins. But we are still not quite there yet in terms of total compliance. Implementation research says that embedding new practice takes on average 17 to 18 years – since Richie we are at year 19 and counting.
What do we learn?
It takes enormous skill to conduct an investigation well. However, too often these roles are relegated to the back office, relegated to counting and data collection. They are seen as tedious and bureaucratic. The people responsible for clinical risk are expected to collect incidents, investigate, implement change, share learning, audit and review implementation while at the same time engaging with and providing information and support to distressed patients and their families and the staff that care for them. They are constrained by timelines; completing the investigation becomes the aim rather than the learning, and the final report is seen as the end product, not the start. Investigators in the NHS do not have an army of personnel to examine the incident reports they are faced with –
They are more likely to be confronted with an army of inspectors instead
The function has shifted from finding out stuff and fixing it to meeting the needs of external inspection and scrutiny. The skills and tools needed are consistently underused and their potential under-realised and, dare I say it, with the ever-increasing emphasis on quality improvement we are in danger of losing our way and relegating these key components of safety to being simple administrative tasks. This is in a context where Jim has warned us that people are potentially blinkered by the pursuit of the wrong kind of excellence – usually productivity and financial indicators and the imperative need to achieve these targets.
Jim sums it up in his own unique way;
Those whose business it is to manage system safety will generally have their eyes firmly fixed on the wrong ball
If you have a role in clinical risk and patient safety, like me you have learnt to do your job on the job. Your role is unpredictable and huge. You are often unable to use the work of experts like Jim because you are constantly being reactive, tackling one incident at a time, and the dream you have of capturing and learning from near misses is just that – a dream. You don’t have the time to do justice to what is needed.
You make recommendations you know are weak – suggestions of training and protocols – because the stronger ones involve changing whole systems and you don’t really have the mandate or the authority to do that. You don’t have enough support because those who in turn are expected to support you don’t have the time to do so.
An enormous number of tools and techniques already exist; you really don’t need to go searching for new ones, but simply look back at the work of Jim and others – they tell you most of what you need to know. But how do you get the time to study these? Where do you go for your own development? Where do you go for your own learning?
We found out how all of this feels for clinicians and for relatives of patients who have been harmed when, in January of this year, a group of us ran an event for some of the members of the expert advisory group who had been tasked with advising on the setting up of the new Healthcare Safety Investigation Branch. The summary of the event was that the investigation process is viewed with fear and dislike by both parties, and that it takes a huge amount of effort and compassion not to be judgemental.
We heard that the pain and trauma, the lack of trust in the process and the absence of support are exactly the same whether you are a patient, a family member or a member of staff. And this is also what we heard:
- I have never seen a space like this where clinicians and patients can speak openly in the same room and listen to each other
- What we seek is forgiveness and compassion – not a witch hunt
- The reality is that when you make a mistake, you will have done the same thing a thousand times without an error happening, but this time it did. I worry every day I may do it again
- The biggest challenge is trust from all sides
All we want is a conversation one human being to another.
The Public Administration Select Committee met in 2014 and published their findings in relation to investigating clinical incidents in the NHS. They said:
Patients and NHS staff deserve to have clinical incidents investigated immediately at a local level, so that facts and evidence are established early, without the need to find blame, and regardless of whether a complaint has been raised.
This requires strengthened investigative capacity locally in most of the NHS, supported by a new, single, independent and accountable investigative body to provide national leadership, to serve as a resource of skills and expertise for the conduct of patient safety incident investigations, and to act as a catalyst to promote a just and open culture across the whole health system
I could only have dreamt of such a thing as a young risk manager.
Let’s look at that again as a framework for investigations:
- Find the facts and evidence early
- Without the need to find blame
- Strengthen investigative capacity locally
- Support people with national leadership
- Provide a resource of skills and expertise
- Act as a catalyst to promote a just and open culture
There is much more that was said, but they also recommended:
• Human Factors and incident analysis as part of the training of all healthcare professionals
• The development of a body of professionally qualified investigative staff with formal examinations and qualifications
The Expert Advisory Group set up to help design this new body built on this in their final report. They emphasised the need for independence and learning, for involving patients, families and staff as active participants, and for supporting all of them compassionately and respectfully. The group agreed that investigations must be led by experts and produce detailed reports that explain the causes of safety issues and incidents, and issue recommendations for improving patient safety across the system. They supported the creation of a just safety culture, openness and transparency, and suggested the establishment of a Just Culture Task Force.
The recommendations made by both the Public Administration Select Committee and the Expert Advisory Group apply as much to local investigation functions as they do to the new proposed branch.
We are potentially on the cusp of something great.
If these things actually happen we could transform the way we learn and improve patient safety.
I know that there have been some concerns about the new branch but I think it’s time to put those concerns aside and get behind both Keith Conradi, who will head up the branch, and its ideals.
It would also be worth reviewing these recommendations and considering what they mean for you and your organisation. Can you get ahead of the curve?
I have three things I would add:
Firstly: Instil pride back in this important aspect of patient safety and value those that are working in this area
Our local investigators should not be seen as the enemy. I was hugely struck by a comment by Professor Graham Braithwaite of Cranfield University when he gave evidence to the parliamentary review of clinical incidents – he said:
Air Accidents Investigation Branch inspectors, usually airline pilots, are regarded in the industry as credible and trustworthy: it is a role that is seen by many as the pinnacle of their careers
The pinnacle of their careers – how fantastic is that?
When I trained as a nurse and then went on to specialise in PICU I was carefully taken from the role of novice through to expert with an in-depth training programme, supervision, mentorship, shadowing, coaching, and tons of experiential learning. Most roles in healthcare – clinical, managerial and admin – have all of this, together with a recognised body that they can be affiliated with, a bespoke training programme and role models to learn from. In clinical risk, patient safety and incident investigation, none of this happens.
Every healthcare facility should invest in creating expert investigators. There is a massive opportunity that most organisations are missing. If you employ individuals who are skilled in systems thinking, human factors, cognitive interviewing, behavioural change, mediation, facilitation and delivering difficult news – people who can also gain the trust of all around them, the patient and their family and the staff involved, and help create the right conversations ‘from one human being to another’…
• You will finally be able to start addressing the risks in your organisation
• You will finally be able to start to reduce the chances of error
• You will finally be able to reduce avoidable harm and improve patient safety
Secondly: Care for the people caught up in incidents
Every healthcare facility should provide an after-event duty of care to all: a function that supports patients and their relatives AND staff when things go wrong. Early this year I came across a radio interview with Bob Ebeling. Bob was one of the engineers working on the space shuttle Challenger 30 years ago; he has since died.
This was his story.
On 27 January 1986, Bob had joined four of his colleagues in trying to keep the space shuttle Challenger grounded. They argued for hours that the launch the next morning would be the coldest ever. Freezing temperatures, their data showed, stiffened the rubber O-rings that keep burning rocket fuel from leaking out of the joints in the shuttle’s boosters. For more than 30 years, Bob carried the guilt of the Challenger explosion. He was an engineer and he knew the shuttle couldn’t sustain the freezing temperatures. He warned his supervisors. He told his wife it was going to blow up. The next morning it did, just as he said it would, and seven astronauts died.
Since that tragic day, Bob blamed himself. He always wondered whether he could have done more. That day changed him. He became steeped in his own grief, despondent and withdrawn. He quit his job. Bob, aged 89, spoke this year on National Public Radio (NPR) in the US on the 30th anniversary of the Challenger explosion. If you listen to the recording you can hear his voice; it is low, quiet and heavy with sadness as he recalls the day and describes his three decades of guilt.
“I think that was one of the mistakes that God made,” he said. “He (God) shouldn’t have picked me for the job. But next time I talk to him, I’m gonna ask him, ‘Why me? You picked a loser.’”
The listeners were so moved by this that they sent hundreds of e-mails and letters of support to Bob. Engineering teachers said they would use him as an example of good ethical practice, and other professionals wrote that because of his example they are more vigilant in their jobs. Allan McDonald, who was Bob’s boss at the time, contacted him and told him that he had done everything he could have done to warn them: “The decision was a collective decision made by all of us. You should not torture yourself with any assumed blame.” And NASA issued a statement commending courageous people like Bob who, they said, “speak up so that our astronauts can safely carry out their missions.”
Let’s help all of those people who, like Bob, are living with the guilt and shame, and the people (staff and patients) who are profoundly and forever affected by these events. Let’s care for them; be kind to them, support them to come to terms with what went on, and let’s never be judgemental. Speaking from my own experience, the guilt never goes away but a kind word goes a very, very long way to help.
Thirdly: Create a culture where clinicians and patients can speak openly in the same room and listen to each other
In the words of Scott Morrish in his submission to the Public Administration Select Committee:
Incident investigation requires partnership and a shared commitment – we all want the same thing. The observations, memories, and knowledge exist, but are divided between patients and staff. Insight and clear thinking therefore, depends upon pooling that knowledge openly in order to allow a complete picture which can then be scrutinised in search of learning.
My aim today was to share with you the importance of, as Jim would say, learning as much as possible and never being judgemental.
There is such richness in the early thinking on human error and patient safety that it remains as important today as it was when first published. That thinking has laid the foundations for where we are today. Patient safety ranges from clinical risk management through to safety improvement. If we relegate clinical risk to the back office we are in danger of missing a vital piece of the safety jigsaw.
I truly hope that the next few steps we take in patient safety will unite all the different and equally important component parts so that we are making a much bigger stride together.
And finally…
Jim and I go back a long way, and I will never forget the words ‘Dr Woodward – now that has a nice ring to it’, when he, as my doctorate mentor, stood up at the end of my viva and uttered them for the first time.
Thank you Jim – I hope I made you proud.