Chapter 9 The implementation challenge

The approach of relying on passive diffusion of information to inform health professionals about safer practices is doomed to failure in a global environment in which well over two million articles on clinical issues are published annually.

This is a quote from my doctoral thesis in 2008, and it still applies today.

I am biased, but to me implementation is the most important thing we need to get right; more important than improvement and more important than innovation. We can create all sorts of new ideas, products and policies, but if they are not implemented they are all a waste of time, effort and resources.

In healthcare we are drowning in ways in which we could improve: numerous interventions and solutions, lots of research and guidance. We create standards, alerts, ‘must do’ notices and targets. There have even been repeated alerts published and disseminated on the same patient safety topics. This should tell us that the method of ‘telling people just to do it’ isn’t working.

In the early days of patient safety many of us fell into the trap of disseminating guidance and expecting change to happen simply as a result, but over time we have been forced to admit that things did not turn out as we had intended and planned.

Implementation is a complex process, not a one-off event.

Vincristine error

Studying vincristine errors provides a window into the complexity and difficulty of implementation. The following detail is mainly taken from the excellent summary by Noble and Donaldson (2010) of the 40-year quest to eliminate intrathecal vincristine errors.

In 2004, in the UK, a review was conducted to assess the level of implementation of guidance for the safe administration of vincristine, three years after the revised guidance had been issued and disseminated. Despite two very high profile deaths related to vincristine error in the UK, the review revealed only 50 per cent compliance. Even these tragic events had not produced sustained change across the system.

The specific risks of inadvertent administration of vincristine were clearly recognised from early experience in the 1960s. However, to date around 60 reported cases of intrathecal vincristine error are known to have occurred, with several others assumed to be unreported (Noble and Donaldson 2010). Since the first cases, articles have been written, alerts sent out and guidance disseminated, yet further deaths occur. This tells us a great deal about the effectiveness of the current approach to change.

The ‘nirvana’ of patient safety is to identify a design solution, one that makes it impossible for mistakes to happen. Implementation of the design change for vincristine has been an enormous uphill task, and we are still not there in terms of total compliance (Noble and Donaldson 2010).

Rather than wait for the design solution, alternative solutions have been sought.  In recent years, the use of a minibag to deliver vincristine has become increasingly discussed and advocated.  This is not a full physical design solution or forcing function but offers an interim measure.  It involves not using syringes to deliver vincristine at all and instead diluting the drug in a minibag, working on the premise that it would be virtually impossible to deliver this to a patient through a spinal needle.  An alert issued by the National Patient Safety Agency in 2008 stated that no incidents had been reported where a minibag had been used (NPSA 2008).  The World Health Organisation published an alert immediately following the death of a patient in Hong Kong in July 2007 and recommended using the minibag to prevent errors in the absence of the more formal design solution (WHO 2007).

In China, in 2007, two drugs normally given intrathecally (methotrexate and cytarabine hydrochloride) were contaminated with vincristine at the factory level (Noble and Donaldson 2010). At first, a few children in Shanghai and the Guangxi Zhuang Autonomous Region suffered neurological symptoms. Paralysis was common, and these incidents increased the world total from 58 to 251 overnight. Eventually the outbreak was believed to have led to 193 patients across China receiving intrathecal vincristine unintentionally. The Chinese government recalled the drugs and closed the plant.

As Noble and Donaldson report, unusual events like this do not get reported via incident reporting systems, and even the best design solution may not have prevented such an unpredictable and ‘upstream’ form of error. Yet this story keeps us attentive to the ongoing dangers in every aspect of healthcare, from factory to frontline, and adds another vulnerability to this long-running story of unsafe care and the complexities of implementing good practice.

Noble DJ, Donaldson LJ (2010) The quest to eliminate intrathecal vincristine errors: a 40 year journey. BMJ Qual Saf 19:323-326.

The checklist

The use of checklists is another interesting example of the difficulties of implementation. It remains astonishing to me today that the final WHO surgical checklist design had the approval of over fifty different countries from across the globe. For all of them to come together and agree the core components of what should and should not be included in a simple one-page checklist is astounding. It is important to remember this achievement in the light of the criticisms that have followed since.

Checklists in themselves do not prevent or reduce harm; it is the way in which they are used that can do this. Properly used, the checklist ensures that critical tasks are carried out and that the whole team is adequately prepared for a surgical operation. However, what we find is that rather than full implementation there is mere compliance, and huge variability in use.

People think that if you implement the main bits, patients will still receive the total benefit. For the checklist to work, all of its component parts need to be implemented, but in the main only around half of it is used. In some organisations components of the checklist were found to be missed out or incomplete; for example, the sign out and the post-operative debrief were particularly poorly carried out. The risk is that if you only implement some of the checklist, you may end up receiving none of the benefits.
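To make this all-or-nothing point concrete, here is a minimal sketch in Python. It is purely illustrative and my own construction, not anything from the WHO guidance, although the three phase names are the checklist’s real ones:

```python
# The WHO surgical checklist has three phases: sign in (before anaesthesia),
# time out (before skin incision) and sign out (before the patient leaves theatre).
PHASES = ("sign_in", "time_out", "sign_out")


def checklist_complete(phases_done: set[str]) -> bool:
    """All-or-nothing: a checklist only 'counts' if every phase was carried out."""
    return all(phase in phases_done for phase in PHASES)


# The pattern described above: sign in and time out done, sign out skipped.
print(checklist_complete({"sign_in", "time_out"}))              # False
print(checklist_complete({"sign_in", "time_out", "sign_out"}))  # True
```

Measured this way, ‘around half of it used’ counts as not used at all, which is exactly the risk described above.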

If you are in any doubt as to whether checklists can improve patient safety, all you need to do is read The Checklist Manifesto by Atul Gawande, published by Profile Books in 2010.

Chapter 10 will discuss what we can do differently to increase the success of implementing and embedding patient safety solutions and interventions.

 

Chapter 8 The impact on frontline workers

‘I will be more careful in the future’ is not a solution to improving patient safety.

Chapter 8 describes the impact on frontline workers when things go wrong.

A few years after my own personal experience of error, I carried out a retrospective investigation in order to finally understand why it happened. I now know that, rather than being solely down to individual performance, there were a number of small moments or incidents during my shift that led me to miscalculate the drug: the constant distractions, automaticity, routine practice as a norm not recognised to be unsafe, and being interrupted mid task, combined with my sleep deprivation, all contributed to the drug error.

My response was that I would try harder, that I would not make any more errors, but this perpetuates the myth of perfection.

We all need to accept the fact that people, processes and equipment will fail. By understanding this, organisations can focus on developing ways to prevent and plan for these failures. If instead we simply punish people when they make errors, we will not learn about the underlying conditions that may have caused the error. When staff are fearful of repercussions and blame, it hinders them from sharing their concerns and speaking up when things go wrong.

What we need is for all of us to talk to each other when things go wrong: about what happened and about what we think could be done differently. We need conversations that help people feel confident to speak out. We also need to learn to talk to patients and their families; to have conversations one human being to another.

If we had these conversations, people might learn more about what they can do to make their care safer, and who knows how many patients could be saved from harm.

Bob’s story

In this chapter I explore the story of Bob Ebeling, one of the engineers who worked on the space shuttle Challenger 30 years ago. Bob has since died, but his story is extremely moving. He spoke publicly for the first time in a radio interview to coincide with the 30-year anniversary of the disaster. He talked about how he was simply not listened to, and how, since that tragic day, he had blamed himself. He always wondered whether he could have done more. That day changed him: he became steeped in his own grief, despondent and withdrawn, and he quit his job. If you listen to the recording you can hear his voice; it is low, quiet and heavy with sadness as he recalls the day and describes his three decades of guilt.

While this story relates to space travel, there are strong parallels with the way healthcare practitioners have, across the centuries, tried to speak out and not been listened to. If Bob had been listened to, he might have been able to prevent the deaths of seven astronauts.

Those affected by error live with the guilt for the rest of their lives. They are profoundly and forever affected by these events. We have seen how people can experience guilt, anxiety, depression and more. They find themselves reliving the event days, weeks, months and sometimes years later. The devastation can lead to their lives unravelling. Evidence suggests that unsupported health workers may change their place of work or leave the profession altogether. Well-trained, caring, experienced nurses and doctors are moving on, either to another hospital or to another career entirely.

We need to care for people: be kind to them, support them to come to terms with what went on, and never be judgemental. If people could talk to each other about these experiences, it would allow others to come forward and share. Every healthcare facility should provide an ‘after event’ duty of care; a function that supports patients, their relatives and staff when things go wrong.

Kimberley’s story

Kimberley was, just like me, a nurse who worked in a PICU, although her story took place 20 years later. As a result of a ‘ten times the dose’ incident, one of the children in her care died. A doctor instructed Kimberley to administer 140 milligrams of calcium chloride to her patient, a nine-month-old infant. She worked out the dosage in her head because she had administered calcium chloride hundreds of times. She miscalculated and drew up ten times the dose prescribed.

I read Kimberley’s interview. She said she had messed up, that she had been talking to someone while drawing up the drug, and she told them that she would be more careful in the future. Almost immediately after the interview, Kimberley was escorted off the premises. I cannot imagine how that must have felt. Kimberley was not a bad person. She loved her job and she loved her patients. All of a sudden she was isolated from the job, her colleagues and the hospital where she had worked for over 24 years. We are told that she drove home panicking about what had happened. You can see she felt she was personally and solely culpable for this incident. She also gave us clues as to why the incident happened: distraction, automaticity, calculating in her head. ‘I will be more careful in the future’ is not a recommendation for sustained change.

Four days later the child died, and shortly after that Kimberley was fired. She struggled with the death of the patient and the loss of the career she loved. Sadly she never got over this incident, and seven months after she was walked off the premises she took her own life. In Kimberley’s case, one of the employees said they felt that people were afraid to admit their mistakes, based on what happened to her; they suggested that people didn’t admit mistakes because they were afraid of losing their jobs (Kliff 2016). It is vital, in the memory of healthcare workers like Kimberley, that healthcare providers take care of their staff: they owe a duty of care not only to patients and their families but also to the people who care for them.
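Kimberley’s story contains simple arithmetic: 140 milligrams were prescribed, and ten times that, 1,400 milligrams, was drawn up. It also shows why ‘I will be more careful’ cannot be the safeguard. Here is a minimal sketch in Python, purely illustrative, of the kind of automated cross-check a preparation or infusion system could apply; the 1.5x threshold is my own placeholder assumption, not a clinical rule.

```python
def check_prepared_dose(drug: str, prescribed_mg: float, drawn_up_mg: float) -> None:
    """Flag a prepared dose that is implausibly far from the prescription.

    The 1.5x threshold is a hypothetical placeholder for illustration only.
    """
    if drawn_up_mg > 1.5 * prescribed_mg:
        ratio = drawn_up_mg / prescribed_mg
        raise ValueError(
            f"{drug}: {drawn_up_mg:g} mg drawn up against {prescribed_mg:g} mg "
            f"prescribed ({ratio:.0f}x); independent second check required"
        )


try:
    # The ten-times error: 1,400 mg drawn up against the 140 mg prescribed.
    check_prepared_dose("calcium chloride", prescribed_mg=140, drawn_up_mg=1400)
except ValueError as err:
    print(err)  # the system, not individual carefulness, catches the discrepancy
```

The point is not this particular check but where the check lives: in the system, where distraction, automaticity and mental arithmetic cannot silently defeat it.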


Kliff S (2016) Fatal mistakes [online]. Available at: vox.com

 

 

Chapter 7 Relegated to the back office

Air Accidents Investigation Branch inspectors, usually airline pilots, are regarded in the industry as credible and trustworthy: it is a role that is seen by many as the pinnacle of their career

  • Professor Graham Braithwaite of Cranfield University (PASC 2015)

This is what is so badly needed: for the role of an incident investigator in the NHS to be seen by all as the pinnacle of their career. This puts pride back into the role and builds respect, from others and for themselves. It recognises the fact that incident investigation is not a target to be complied with or a bureaucratic exercise; it is a vital way of learning and requires a vast amount of knowledge and skill.

BUT

Incident investigators in the health service are often fairly junior, have had no formal training in incident investigation, and lack the necessary ‘power’ to make changes in their organisation. They are constrained by timelines; completing the investigation becomes the aim rather than the learning, and the final report is seen as the end product, not the start.

As we know, incident reporting systems are drowning in reports, and investigators in the NHS do not have an army to examine the huge number of incident reports they are faced with; they are more likely to be confronted with an army of inspectors instead.

The function of a risk manager or investigator has shifted from finding things out and fixing them to meeting the needs of external inspection and scrutiny.

The skills and tools needed are consistently underused and their potential underrealised. Dare I say it, with the ever-increasing emphasis on quality improvement we are in danger of losing our way: relegating these key components of safety to ‘the back office’ and treating them as simple administrative tasks.

If you want to learn more (apart from in the book), Black Box Thinking by Matthew Syed, published in 2015 by John Murray, offers some fantastic examples from healthcare and other high-risk industries, as well as lessons from sports science such as the ‘marginal gains’ concept used in particular in elite cycling.

The chapter was written just as the Healthcare Safety Investigation Branch was being set up. Since then it has appointed a lead (Keith Conradi) and a team of investigators, together with an advisory group. The branch formally launched on 1 April 2017 and has begun its first investigation.

I look forward with excitement and anticipation to seeing whether this branch will go a long way towards raising the importance of incident investigation and the role of investigators, so that they are truly seen as being at the pinnacle of their careers.

 

 

Chapter 6 Learning or counting

Chapter 6 explores our current approach to incident reporting.

One autumn day in 2015 I came across a webinar titled ‘Patient safety after 15 years: disappointments, successes, and what’s on the horizon’. Dr Kaveh Shojania set the context by providing an overview of the past fifteen years of the patient safety movement, from To Err is Human (1999) to the present day. During the webinar, Shojania was talking about incident reporting when he said:

‘incident reporting is the single biggest waste of time in the last fifteen years’

(Shojania 2015)

This really intrigued me, and I started to think about whether he was right. That question triggered much of the material in this chapter, which also shares the wonderful wisdom of people like Carl Macrae and Erik Hollnagel.

When something goes wrong it is unlikely to be unique; it is more likely to be something that has gone well time and time again before and that on this one occasion failed. It is also likely that, even though it failed on this occasion, it will go well many times again in the future.

(Hollnagel 2014)

There is no doubt that incident reporting systems are a vital tool in the risk toolbox. Reporting systems that allow fast and translatable intelligence to be shared across the globe, and thereby prevent error, are common in aviation and other high risk industries. Patients are dying in part because people have not shared the information so that changes could be put in place to prevent future errors.

BUT

Incident reporting systems, instead of being seen as sources of useful insight, are viewed as bureaucratic tools to performance-manage individuals, teams and entire organisations. The processes are too complicated; there is STILL a lack of feedback and visible action, the process does not appear to drive change, and there is a general feeling of ‘why bother’. It has remained a relatively passive process of individuals submitting reports to a central team who may or may not respond with feedback: information sharing rather than participative improvement.

Macrae, Shojania and others advocate a much more intelligent approach to incident reporting, using different mechanisms to capture the things that go wrong so that the mechanism fits the purpose. For example…

COUNTING – If all you want, or in fact need, to do is count the number of occurrences of something happening, then tables, graphs, simple databases and spreadsheets can do that.

NOTIFICATION – If you want to create a notification system that may be used to trigger further review, or that forms part of a triage process to prioritise actions, then all that is needed is a very simple form that asks for the minimum amount of information (see the sketch below).

STATUTORY REPORTING – For all statutory reporting, you create an approach that enables you to capture all of the statutorily required information.

LEARNING – You can then use the mechanisms above to decide which issues to invest time and effort in, delving in and investigating further. This could again take multiple forms: a thematic analysis, a forensic review of one case, or an investigation into an incident type.
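To show how lightweight the notification mechanism can be, here is a minimal sketch in Python. It is purely illustrative: the field names, the priority labels and the triage rule are my own assumptions, not anything prescribed by Macrae, Shojania or the Danish recommendations below. The point is how little a reporter should have to supply, and that the mechanism, not the reporter, does the sorting.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class Priority(Enum):
    """Illustrative triage outcomes; assumed labels, not an official taxonomy."""
    COUNT_ONLY = 1    # add to the tally, no further action
    INVESTIGATE = 2   # new or surprising: potential for learning


@dataclass
class Notification:
    """The minimum a reporter should have to supply."""
    what_happened: str   # one or two free-text sentences
    location: str        # ward, theatre or clinic
    reported_at: datetime = field(default_factory=datetime.now)


def triage(note: Notification, known_event_types: set[str]) -> Priority:
    """Crude illustrative rule: anything not seen before is flagged for learning."""
    if note.what_happened.strip().lower() not in known_event_types:
        return Priority.INVESTIGATE
    return Priority.COUNT_ONLY


# Usage: a two-field report, triaged immediately rather than after weeks of process.
report = Notification("ten times dose drawn up", "paediatric intensive care")
print(triage(report, known_event_types={"wrong patient wristband"}))  # Priority.INVESTIGATE
```

A form this small speaks directly to recommendation 2 below: reporting should take seconds, not 20-30 minutes.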

In January 2016, the Danish Society for Patient Safety brought together key stakeholders to create recommendations for the future of incident reporting (Rabøl et al 2016). They made eight recommendations, similar to those identified by Macrae in 2015:

  1. Only report incidents of importance, new or surprising events – events which have the potential for learning
  2. Reporting should be easy – currently it takes around 20-30 minutes to complete an incident report form, which then has to travel the length and breadth of the organisation’s governance systems before action can be taken
  3. Disciplinary action should be clearly separated from incident reporting, which should be all about learning
  4. The ‘levels’ of reporting should be optimised: local reporting to solve local issues, national reporting to solve national issues
  5. Learning must be shared across the system
  6. Incident reporting should be integrated with other components of safety (and quality)
  7. Reporting should be transparent
  8. The reporter must receive individual feedback about actions taken

Over the next two decades we must refocus our efforts and develop more sophisticated and intelligent ways of capturing error and avoidable harm. We need better ways of learning from this data and better ways to share that learning.  There is already a great deal of wisdom out there on how this could be done.

Hollnagel E (2014) Safety-I and safety-II: the past and future of safety management. Ashgate Publishing, Ltd, Farnham, England.

Macrae C (2015) The problem with incident reporting. BMJ Qual Saf. doi:10.1136/bmjqs-2015-004732

Rabøl LI, Gaardboe O, Hellebek A (2016) Incident reporting must result in local action. BMJ Qual Saf. doi:10.1136/bmjqs-2016-005971

Shojania KG (2015) Patient safety after 15 years: disappointments, successes, and what’s on the horizon [online webinar]. Available at: ken.caphc.org

 

 

Chapter 5 The right culture for safety

Welcome back, everyone. This is a taster for Chapter 5 of the book Rethinking Patient Safety. Also, while I am writing, a bit of an update for you all: the publishers have commissioned a second book, which will build on this one and focus on helping the reader implement the thinking in the first. Exciting times.

Back to the Chapter…

Chapter 5 is all about culture: a hard thing to really capture, but something that gets mentioned all the time in relation to patient safety; “if only we had the right culture”, “what is needed is the right culture”. As James Reason puts it, the 1990s became the culture decade, and I would add that it has continued until now.

In safety there are lots of debates about what we actually mean by a safety culture. Like me, Reason says that there is no agreed definition of a safety culture. Terms such as no-blame, low-blame, open and fair, learning and just are all used interchangeably. So what do we really mean, and what are we really seeking?

In essence it is best to break culture down into behaviours, attitudes and values. So what behaviours, attitudes and values support a safety culture? (How long is a piece of string?) The list could include:

  • paying attention to your actions in terms of whether what you are doing is safe or has the potential to be unsafe – this is often referred to as safety mindfulness
  • attitude to risk – we all have a different approach to risk; some of us are risk averse and some are more likely to behave in a risky way – humans can be both hazard and hero – a safety culture is one that provides a safety net or safeguards to prevent risky behaviour becoming unsafe or even reckless
  • attitude to error (or violations, incidents) – one of the most important aspects of a safety culture is the understanding of human error, risky behaviour and reckless behaviour: understanding the nuances and blurred boundaries of each of these, together with the right response to each one (this is often referred to as a just culture; the book goes into this in much more detail)

In addition to these more traditional behaviours or attitudes that impact on safety – we (at Sign up to Safety) have also added in kindness, trust, listening to people and not being judgemental.

We have also started to explore the whole issue of violations, non-compliance and deviations (from known policy, the rules or what is expected) and have started to understand that our current approach of seeing these as wholly negative is mistaken; in fact we should be looking at them in positive terms. The reason things go right most of the time is not that people behave as they are told to, but that people can adjust their work so that it matches the conditions. In fact people aid safety through their behaviours; they:

  • Adapt and adjust to actual demand and change their performance accordingly
  • Interpret policies and procedures and apply them to match conditions rather than follow them regardless of the conditions in front of them
  • Detect and correct when something goes wrong, or when it is about to go wrong, and intervene to prevent it

If there are behaviours, attitudes and values that we want to see ingrained in our everyday actions, then what are the behaviours, attitudes and values that we would like to see less of (in creating a safety culture)? We would start with:

  • Rudeness and incivility (including bullying or aggression) – these can silence people, prevent them from sharing what they know, and belittle their confidence and ability to perform, creating a downward spiral:
  • lack of confidence = lack of competence = increased criticism = lack of competence = lack of confidence

  • Grandstanding – people who think they know the answer before they have even heard the problem – this can lead people to take the wrong path or do the wrong thing despite a niggle that what they are doing might not be right – grandstanding also silences people, but importantly it doesn’t create a shared view of the issue or the ability to work together
  • a person starts to talk about their problem – they are interrupted almost instantly by someone with a firm point of view – they try to explain but are silenced by the other person talking over them – they give up and don’t get the help they need, or wrongly follow the solution the other person is offering

  • Not listening (just waiting your turn to talk) – people may not listen because they are rushed, stressed, anxious or tired – they may not listen because they have too much in their heads about where they have just come from and where they are going next. We have all been there. If you haven’t listened, it is also really hard to admit that to the person who has just poured their heart out to you or shared something really important about themselves, their work or their patients.
  • Not listening means we miss crucial pieces of information, we may not act on or respond to vital news, and we are not respecting the person on the other side of the conversation.

There are loads of tools and ideas to help people talk to each other (as a vital part of building a safety culture) on the Sign up to Safety website:

W: www.signuptosafety.nhs.uk | Twitter: @SignuptoSafety