Chapter 4 Systems approach to safety

All incentives should point in the same direction. In the end, culture will trump rules, standards and control strategies every single time, and achieving a vastly safer health service will depend far more on major cultural change than on a new regulatory regime – Don Berwick 2013

Healthcare is not one single system but a set of systems, ranging from the ultra-safe to the ultra-adaptive (Vincent and Amalberti 2016). Neither is it one culture – but that is for another chapter.

The design of the system helps or hinders the behaviour and performance of the individuals who work within it: the clinicians, the managers, the administrators and so on (Dekker and Leveson 2014). There are so many variables within healthcare that it is often a surprise that so many things go right:

  • Variation in training, individual experience, fatigue, boredom, depression and burnout
  • The diversity of patients – variation in age, severity of illness and co-morbidities
  • Working hours – types of shift patterns, length of working hours
  • Complicated technologies – equipment, tasks and procedures
  • Drugs – calculations, drug names that look and sound alike, prescriptions, computerized prescribing and alert systems
  • Design – poor packaging, poor labelling, design of equipment, design of environments (and age of buildings)
  • Length of stay – prolonged or multiple transfers
  • Poor communication, handover, transitions and information

Safety experts increasingly agree with the work of Rasmussen, who back in 1990 argued that a safer system is one that is able to adapt, and able to help the people who work within it adapt to their environment or to changes in that environment. The goal is not to limit human behaviour to a prescribed set of tasks but to design a system in which people can function safely and effectively while still being able to use their judgment and ingenuity.

This chapter introduces concepts such as risk resilience, resilience engineering, and Safety I and Safety II. These aim to create a system that continually revises its approach to work in an effort to prevent or minimise failure, that is constantly aware of the potential for failure, and that helps people make decisions and direct (limited) resources to minimise the risk of harm – knowing that the system is compromised because it includes sometimes faulty equipment, imperfect processes and fallible human beings. Hollnagel and colleagues (2006) describe three key elements of resilience:

  • Foresight – the ability to predict something bad happening
  • Coping – the ability to prevent something bad becoming worse
  • Recovery – the ability to recover from something bad once it has happened

More recently, the field of patient safety has been moving from the (potentially) simplistic approach of capturing incidents and studying when we get it wrong to learning from when we get it right. This is innovative in that it looks proactively rather than reactively and does not assume that erratic people degrade an otherwise safe system. The work of Adrian and Emma Plunkett – Learning from Excellence – is an outstanding example of putting these theories into practice.

James Reason talks about humans as heroes (Reason 2015). Resilience experts align with this and describe a “new view of human error”, which sees the humans in a system as a primary source of resilience in creating safety. Eric Hollnagel describes this in detail in the book that sets out the concepts of Safety I and Safety II (Hollnagel 2014). He describes the current state as Safety I: a state in which as few things as possible go wrong, and in which we seek to improve safety by measuring and then trying to reduce the number of things (systems and humans) that go wrong.

This approach is stretched to or beyond breaking point – and the world has changed.

In Safety I the basic assumption is that the system is safe and people make it unsafe.  It generally goes like this:

  • Something goes wrong – the default is to look at what the policy says should have happened
  • Most investigations start with an assumption that the ‘operator’ must have failed
  • Regulators, legal and clinical negligence systems and human resources departments use this approach, and anyone who deviated from the ‘normal’ policy is found to be at fault
  • We are fixated on clear and simple explanations of what ‘should be done’ – we try to prescribe tasks and actions in every detail
  • When there is a difference between policy and practice, this is used to explain why things went wrong
  • Yet everyone actually doing the work knows it is only possible to work by continually adjusting what they do to the situation at hand

In Safety II we are much more proactive, interested in how things go right and in how people continuously try to anticipate the potential for harm. Instead of humans being seen as a liability, in Safety II the system is considered unsafe and it is people who create safety; humans are seen as a necessary component of system flexibility and resilience.

Performance adjustments and variability are at the core of healthcare. This is a positive thing, but it is interpreted negatively as deviation, violation and non-compliance.

However, the reason that things go right most of the time is not that people behave as they are told to, but that they adjust their work so that it matches the conditions. So…

Value the humans in the system and treasure adaptability

  • People adapt and adjust to actual demand and change their performance accordingly
  • People interpret policies and procedures and apply them to match conditions
  • People can detect when something goes wrong, or is about to go wrong, and intervene to correct or prevent it

In Safety II the pursuit is one of building a system where everything goes right. It is also the ability to succeed under both expected and unexpected conditions, so that the number of intended actions and acceptable outcomes is as high as possible.

This leads the patient safety reviewer to find out how and why things go right, and to ask how we can see what goes right. Questions are the answer:

  • Do you ever adjust the activity to the situation?
  • How do you determine which way to proceed?
  • What do you do if something unexpected happens? e.g. an interruption, a new urgent task, a change of conditions?

Hollnagel goes on to say that adopting a Safety II perspective does not mean that everything must be done differently or that current methods must be replaced wholesale. His recommendation is that we look at what is being done in a different way. It is still necessary to investigate things that go wrong and to consider the risks, but to adopt a different mindset when doing so.

I also talk about human factors, standardisation and the way in which we have for far too long focused on problems in isolation, one harm at a time. Across the UK there are people who concentrate on falls, pressure ulcers, sepsis, acute kidney injury or VTE – the list goes on. These individuals are passionate about their particular area; they can be lone individuals or groups, either inside or outside the organisation, and they ‘lobby’ people for attention. What they tend to do is focus others around them on reducing or eliminating specific harms through a project-by-project approach. People are left confused; they don’t know which ‘interest’ or area of harm deserves more or less effort, time and resource. While there is a need for these passionate people focusing on their particular cause, we also need to ensure that everything is joined up – stopping these efforts from becoming competing priorities and maximising their successes through collaboration.

Patient safety requires a concentration on the factors that thread through all of these individual areas of harm: a common set of factors. These include communication, patient identification, patient observations, the sharing of information with each other and with patients, deficiencies in teamwork and team culture, and the way we design the system and the care pathway. These are the same cross-cutting factors that appear time and time again, and which have been found to explain the majority of the reasons why harm happens.

The concepts described in this chapter move us away from the predominant approach of the last two decades, which has led us to look at safety as a set of isolated incidents, to overlook the ways in which we get it right most of the time (and the opportunity to replicate that), and to leave disparate patient safety projects unconnected.

To the patient it is never just one incident or a nice neat line of activity, and there is often hope amongst the harm. We need to address patient safety in the same way.

References:

Hollnagel E, Woods DD, Leveson N (eds) (2006) Resilience engineering: concepts and precepts. Aldershot: Ashgate
Hollnagel E (2014) Safety-I and Safety-II: the past and future of safety management. Farnham: Ashgate
Rasmussen J (1990) Human error and the problem of causality in analysis of accidents. In: Broadbent DE, Reason J and Baddeley A (eds) Human Factors in Hazardous Situations, 1–12. Oxford: Clarendon Press
Reason J (2015) A life in error. Farnham: Ashgate
Vincent C, Amalberti R (2016) Safer healthcare: strategies for the real world. Springer Open

Chapter 3 A culture of learning

In the third chapter I explore ‘a culture of continuous learning’. I share two contrasting stories: one related to Johns Hopkins Hospital in Baltimore, US, and the other concerned with the care of Sam Morrish and his family in the NHS.

The mystery that motivates most of us who work in patient safety is why, after more than two decades’ worth of energy and resources devoted to patient safety, progress has been so slow.

The Johns Hopkins story is not a new one to those who work in patient safety – the stories of Josie King and Ellen Roche – but it is the contrast with the story of Sam Morrish that resonated with me: one organisation accepting that it had let down Josie and her family, versus a system that failed to respond to Sam’s needs as effectively as it should have, and then failed to respond to the needs of his family when he died. Instead of seeing this as ‘one of those things’, Johns Hopkins Hospital wanted to learn from Josie’s death and set about investigating exactly what had happened and why she died. Unprecedented at the time, the hospital involved the family. They went to the family’s home, apologised for her death and told them they would share everything they knew, as soon as they knew it. The family were updated every Friday morning, even when there was little to report.

The investigations into Josie’s death found that she had died of dehydration as a result of sepsis, a hospital-acquired infection. As with most patient safety incidents, there was much more to why Josie died than the care she received. Communications had failed and, importantly, Josie’s parents were not listened to: they repeatedly told staff that their daughter was thirsty, but these pleas were ignored.

In the same year, Ellen Roche died of lung failure after inhaling an irritant medication while participating in an asthma research study. These two deaths propelled the hospital towards what some consider the most significant culture change in its history (Nitkin and Broadhead 2016). Johns Hopkins began treating safety like a science, collecting data to find, test and deploy systemic improvements, and safety has become its top priority. Peter Pronovost, one of the clinicians who worked in the children’s unit where Josie died, viewed these events as a moral dilemma. The hospital had to make a choice: was it going to be open and admit its mistakes, or was it going to chalk them up as one of those things and move on? In his words… they decided to start talking.

Josie’s parents channelled their grief into action, creating the Josie King Foundation and the Josie King Hero Award for caregivers who create a culture of safety; the first award was given to Peter Pronovost. Sorrel King offers this advice to everyone involved in patient care: “Slow down and take your eyes off the computer. Look at the patient in the bed and listen. Listen to that mother who is saying something is wrong” (Nitkin and Broadhead 2016).

In our work at Sign up to Safety we work with a number of patients and their families. One of the things we hear time and time again is the request to be treated with care and kindness. When things have gone wrong, instead of closing up, they desperately want you to reach out, say sorry, tell them what is happening, what you know, and how you will do your very best to find out what happened. Crucially, listen to them – they have insights, their own side of the story, facts that no one else will have. Give them that voice and have a conversation between you and them, one human to another. Listen to the mums, the daughters, the husbands and the patients themselves. If they are worried or question an action, listen. Don’t be too proud and too late. Make patients and colleagues feel their voice is worth listening to.

This is what Sam’s parents Scott and Sue needed.

Sam died at the age of three from sepsis in December 2010. An investigation in 2014 found that, had Sam received appropriate care and treatment, he would have survived. However, this investigation failed to provide clear answers as to why Sam died. Sam’s parents asked the Ombudsman to undertake a second investigation to find out why the NHS was unable to give them the answers they deserved after the tragic death of their son. The second investigation found a failing system: Sam’s family should have been provided with clear and honest answers to their questions at the very outset (PHSO 2016).

As Sam’s parents say, they ‘trusted the NHS and trusted those that work in it to find out what had happened and why’. This should be at the very heart of any investigation. It includes informing the family from the outset, assigning someone to answer every single question they have, and following up with them at regular intervals, as they did at Johns Hopkins. It should never be left to patients or grieving families to drive the process for learning. Scott sums it all up with the following:

Patients who ask questions can be perceived as a threat, especially if those questions might draw attention to the NHS’s own part in clinical failure or untoward clinical incidents. This can even happen when, as in my case, the motivation was purely a need for understanding, and awareness (on my part) of the possibility of learning. I was grieving, but consistently articulating no interest in blame.

Patients and staff can find themselves isolated, in a no-man’s land, their concerns ignored, whilst everyone cedes responsibility to ‘processes’ and ‘systems’ without taking responsibility for either. When no one feels responsible, everyone feels powerless. Actions taken, decisions made and conclusions drawn, if inept, create more frustration, fear and sometimes irreconcilable differences. It also leads eventually to the despairing and fatalistic acceptance of avoidable harm; the notion that there is no need or point in investigation: there is no expectation of learning. It gives rise to a universal feeling of powerlessness where patients do not think it is worth the phenomenal risks involved in raising concerns, because staff seem (and feel) powerless to help; complaints handlers only find fault in systems, processes and services; and no one is deemed responsible for anything. When responsibility is ceded to large and complex processes and systems but no one takes responsibility for them, no one is accountable. This is not a figment of complainants’ imaginations.

The consequences are profound, insidious and pervasive to the point where they undermine the NHS by alienating patients from staff, and staff from each other, all the time eroding trust and respect in all directions. The same alienation and erosion of trust and respect occurs between commissioners and providers, and between both with their regulators. Nothing is learnt. Nothing changes.

It takes time, effort and determination, to understand all of this, but as the picture becomes more complete, it is no longer a mystery, or even a surprise that people we know to be kind, caring and good, can do or say apparently senseless things which, unintentionally, can cause additional suffering. The only way I can understand such paradoxical behaviour is in terms of culture, which can be paraphrased along the lines of “Blame the system. It is not our fault. It is the way we do things around here.”

References:

Nitkin K, Broadhead L (2016) No Room for Error [online] available at: http://www.hopkinsmedicine.org/news/articles/no-room-for-error

PASC (Public Administration Select Committee) (2015) Investigating clinical incidents in the NHS. Sixth Report of Session 2014–15 [online] available at: http://www.publications.parliament.uk/pa/cm201415/cmselect/cmpubadm/886/886.pdf

PHSO (Parliamentary and Health Service Ombudsman) (2016) Learning from mistakes: an investigation report by the Parliamentary and Health Service Ombudsman into how the NHS failed to properly investigate the death of a three-year-old child [online] available at: http://www.ombudsman.org.uk/__data/assets/pdf_file/0017/37034/Learning-from-mistakes-An-investigation-report-by-PHSO.pdf