Working Safely

‘It turns out that if you change how people talk, that changes how they think.’

Lera Boroditsky, Professor of Psychology, Stanford University

Over the last twenty years the subject of patient safety has grown, and we have made real progress in raising awareness of the issues and in quantifying the problem. However, we all know that this is not quite working as we imagined.

As we have mentioned in previous blogs, we are big fans of the work of Erik Hollnagel and his colleagues and their thinking on Safety-I and Safety-II.

To remind you…

Safety-I is where safety is defined as a condition in which the number of adverse outcomes (accidents, incidents, near misses) is as low as possible.

Safety-II is the ability to succeed under expected and unexpected conditions alike, so that the number of intended and acceptable outcomes (in other words, everyday activities) is as high as possible.

With this definition, safety (and patient safety) shifts from studying why things go wrong to studying why things go right. Hollnagel suggests this means studying and understanding everyday activities, the ‘actual events’ that show how a system functions. The purpose is no longer to ‘avoid that things go wrong’ but instead to ‘ensure that things go right’.

Hollnagel challenges us to think about our definitions and language when talking about ‘safety’: we should move away from these titles and easily boxed-in headings and instead talk about ‘working safely’. We could not agree more. This completely changes the mind-set. It moves patient safety from being something one person does, or a workshop, or a strategy, to being about everything we do, every action we take and every decision we make.

Helping people work safely means we help them adjust what they do to match the conditions of actual work, help them learn to identify and overcome the flaws in the system, and help them interpret and apply policies and procedures to match those conditions.

So in that respect, ‘patient safety’ should be renamed ‘working safely’ and defined as follows:

working safely (in relation to patient care) is ensuring that the number of intended and acceptable outcomes is as high as possible, with people adjusting what they do to match the conditions of actual work, learning to identify and overcome the flaws in the system, and interpreting and applying policies and procedures to match those conditions.

Those who work in patient safety should study what is working rather than what isn’t, and should study how people work, individually and collectively, and how the organisation functions when things are going well.


There are a few things we can do to help people work safely.

Connect people up to work together

We need to connect up the people who are working separately, in isolation, on particular problems. When people and their isolated projects come together, learning increases and, instead of improving one process at a time, they improve the aspects of care (and problems) that thread through all of these different harms.

“We have tended to focus on problems in isolation, one harm at a time, and our efforts have been simplistic and myopic. If we are to save more lives and significantly reduce patient harm, we need to adopt a holistic, systematic approach that extends across cultural, technological and procedural boundaries – one that is based on the evidence of what works.”

Professor the Lord Ara Darzi

Working on harms in isolation risks creating competition, where people do not know which ‘interest’ or area of harm deserves more or less effort, time and resource. These competing interests create competing priorities and confusion.

Instead, organisations could hold ‘cross-harm’ conversations where people talk about the common set of causal or contributory factors. These conversations could help people discover new ways of sharing information, design new pathways to pick up issues more quickly, and work out what could be standardised and what can’t.

Stop assuming that healthcare fits into a neat linear model

Erik Hollnagel suggests that simple linear accident models were appropriate for the work environments of the 1920s (when they were first conceived) but not for current work environments. He also argues that ‘composite’ linear models, such as Jim Reason’s ‘Swiss cheese’ model from the 1980s, were built for a different operating model than today’s.

The reality is that working in healthcare is muddled and unpredictable.  However, Liz Wiggins and Harriet Hunter talk about organisations as…

complex responsive processes with the focus on local, unpredictable interactions between people

They assert that multiple, non-linear relationships and interactions between people are taking place all of the time, and that what is really happening is different from what people think ought to be happening.

Working safely is not a neat, linear, step-by-step process underpinned by neat, linear protocols or procedures. That would rely on everyone agreeing on what is needed, on it being known by all and achievable, and on nothing unexpected happening along the way.

Zero harm is impossible

We have to accept that a system can never be ‘safe’; it can only be as safe as possible. Too often, people work in systems that are not well designed, or not designed to help them work safely. What we can do is drive down error and design systems to minimise its effects as much as we can.

Narrow the gap between ‘work as imagined and work as done’

There is an assumption that everything we do can be written down in procedures and guidelines and that people will simply follow them. Wiggins and Hunter suggest:

the key to a relational approach to change is paying attention to what is actually happening in practice as a result of people working with each other, rather than being enslaved to beliefs about what you think ought to happen and what is inscribed in the protocol

This is also the view of those who believe in Safety-II concepts and the science of Human Factors. Steven Shorrock and others talk eloquently about the issues associated with ‘work as imagined’ as opposed to ‘work as done’. Steven is the editor of a fantastic resource, Hindsight magazine, which devoted a whole issue to the subject (Hindsight 25).

Sidney Dekker says:

“Sure, we can imagine work in a particular way. We can believe that people will use the technologies we provide them in the way they were intended. Or that they will apply the procedure every time it is applicable. Or that the checklist will be used.

These assumptions (hopes, dreams, imaginings) are of course at quite a distance from how that work actually gets done on the front line, at the sharp end. Work gets done because of people’s effective informal understandings, their interpretations, their innovations and improvisations outside those rules.”

Extreme adherence to the plan, or to directives or rules, can be problematic. The way work is imagined by policy makers, board members, leaders, managers and planners is the way they want it to be done, not necessarily how it can be done. What these people need to do is:

“Pay attention to what is really happening, rather than what you think ought to be happening; be facilitative and focus on working with people, rather than ‘doing to’ them and see relationships as the source of insight, creativity and energy”

Liz Wiggins and Harriet Hunter (2016)

Sidney Dekker goes on to say:

To learn how work is actually done – as opposed to how we think it is done – our leaders need to take their time. They need to use their ears more than their mouths. They need to ask us what we need; not tell us what to do. Ultimately, to understand how work actually gets done, they need an open mind, and a big heart

At Sign up to Safety we would add:

In order to learn how work is actually done go and talk to people, sit down with them and listen with intent, listen to understand. Then think about what you can do to help people with their reality

Talking and listening to staff on the frontline provides a rich source of intelligence about what works well, especially from those who move around the system frequently, such as doctors in training.

This work helps people to become consciously competent. If you notice what you can do, you can also explore the gaps between your intentions to keep people safer and what can happen in some situations.

It also works both ways. Frontline staff often say things like ‘the people at the top don’t understand what we do’ or ‘I don’t understand why they can’t just fix things’. So the frontline also ‘imagine’ a world of leaders, board members and so on that is not quite how that work is actually ‘done’.

The conversation that will help narrow this gap has to be two-way.

What can we all do differently?

  • Change the language
  • Shift ‘patient safety’ from one person’s job to everyone’s job
  • Create a consensus of what we mean by a just culture and consistently embed it
  • Learn from when we get it right and replicate good practice
  • Re-design systems and mind-sets across every part of the NHS to help people adapt and adjust their performance safely. This means every part of your organisation – finance, procurement, operations, clinical areas and so on – thinking about working safely as part of what they do, creating an enabling environment that helps people work safely
  • Stop doing stuff (prioritise and focus the top-down interventions, directives, targets and alerts)
  • Focus on the cross-system factors that thread through the individual ‘harms’, such as observation, the factors that lead to deterioration, communication and design
  • Spend as much time on implementation as we do on innovation and improvement
  • Be kind to each other
  • Help people interact and develop relationships through talking to each other and listening

References:

  • Erik Hollnagel and René Amalberti (2001) The Emperor’s New Clothes, or whatever happened to human error? Keynote at the 4th International Workshop on Human Error, Safety and System Development, 11–12 June, Linköping, Sweden
  • Hindsight magazine – via https://www.skybrary.aero/index.php/Hindsight_25
  • Erik Hollnagel (2010) Safer Complex Industrial Environments. CRC Press, Boca Raton, FL
  • EUROCONTROL (2013) From Safety-I to Safety-II: A White Paper
  • Erik Hollnagel (2012) A Tale of Two Safeties – via www.resilienthealthcare.net
  • Erik Hollnagel (2013) Is safety a subject for science? Safety Science, Elsevier. http://dx.doi.org/10.1016/j.ssci.2013.07.025