Healthcare Safety Investigation Branch – First Report

The Healthcare Safety Investigation Branch has published its first report.  The investigation was triggered by a failure to correctly check the hip prosthesis for a 62-year-old man, Mr John Hampton.

The report is excellent, and with it the Healthcare Safety Investigation Branch has set the bar high for the quality of its work.

It is one of the best investigation reports I have ever read.  It is clear, detailed and easy to read and beautifully describes the multiple factors (human, system and cultural) that can lead to an error.  The following are a few things that really hit home for me.

The patient:

From what I can read, it appears that the investigators developed a good relationship with the patient and that he was involved in the investigation.  There is a particularly moving point about the ongoing effects on Mr Hampton himself and his family.  Not only do patients and families hold a crucial piece of the jigsaw; they should be involved, informed and supported throughout.  This should be the minimum.

All investigations should involve the patient and/or their family – they should never be silent participants in their own story.

Standards, guidelines, policies and procedures:

The report shows that standards, guidelines and procedures are not foolproof tools for error prevention.  As James Reason says, ‘there are not enough trees in the rainforest to write a set of procedures that will guarantee freedom from harm’.  The issue is not whether you have a standard, but whether people know about it, whether it is easy to read and understand, and whether it is short enough to help you take the correct steps without drowning you in detail.

Standards are worthless if they are not implementable.

Labelling and packaging:

We are reminded of the continuing problems with labelling and packaging (highlighted some time ago by the National Patient Safety Agency (NPSA) in its outstanding ‘design for patient safety’ series).   I urge you to read The Design of Everyday Things if you are interested in this subject.

This has been debated for far too long now, and it is crucial that manufacturers use the support of human factors experts, designers and behavioural insight experts to design their products to minimise the errors that can arise from poor packaging and labelling.

Understanding human factors:

It is so heartening to see a wealth of human factors thinking and knowledge threaded throughout the report.  Crucially, the investigation demonstrates the importance of understanding environmental conditions and individual and team factors – for example, recognising the role that fatigue plays in performance.  There is a fascinating discourse on teamwork and relationships and the different error rates between familiar and unfamiliar teams.  There is also an insight into our mindset in relation to error: for example, the consultant could recall being brought the wrong size before, but not the wrong manufacturer, so it ‘was not, therefore, an error experience had taught him to be alert to’.  So many things are applicable across all of healthcare – for example, the issue of noise in sections 5.3 and 5.7, and our cognitive biases in section 5.6.

We simply don’t pay enough attention to human factors in the way we design our systems, structures, and organisational processes.

Barriers to error:

The investigation clearly highlights the fallibility of human memory, the importance of the different types of memory aids and their effective use, and the importance of reading out loud all the details needed and/or carrying out independent checks.  I love the comment ‘what’s on the box may not be what’s in your head’, which tells us about the strange things our brains can do to convince us that we see things that are not there, or don’t see things that are.

The ‘solution’ we are always searching for is the complete inability to make a mistake: ‘designing out the chances of error’ through effective barriers that get in the way of making an error (checklists, verification processes).  These are really rare, and if we think we have found one we may have to think again.  We humans are very clever at finding ways round things that don’t fit naturally with what we are trying to achieve.

As it says in the report ‘team checking processes already in place have not reduced the number of wrong prosthesis incidents, suggesting that more of the same is not the solution’ and ‘double checking requires that one fallible person monitors the work of another imperfect person’. 

Therefore, if you are designing barriers, do as manufacturers should: involve human factors experts, designers and behavioural insight experts.

There may never be a neat answer:

There is also a key point that we don’t talk about often enough: ‘the nurse could not understand why she had gone to different cupboards that contained prostheses from different manufacturers’ – ‘the consultant… could not recall the details of the prosthesis verification during the operation as there was nothing out of the ordinary about it’.  I can think of numerous incidents I investigated where the people involved had no idea why they did what they did.  They could not offer a nice neat explanation.  They could not even recall the steps they took.

There is still a particular pressure to search desperately for a root cause, to find an answer.  We seem to find it impossible to say ‘we simply don’t know’.  Too often, not knowing is misinterpreted as ‘covering up’ or ‘hiding something’.

It is therefore great that this report clearly states ‘it cannot be known precisely what checks were undertaken or at what point the process failed’.  Hopefully this will give other investigators the courage to say this too.  It does not mean that we cannot learn enough to make recommendations, as this report has done so well.

Carrying out investigations:

The report also recognises the complexities of carrying out investigations, for example noting that the interviews took place six weeks after the incident and that ‘memory recall at this timeframe may be affected by many factors and so have limited accuracy’.  It would be interesting to know whether most incident investigations involve interviews some time after the incident, and how much of the truth we are actually finding out as a result.

The only way to truly understand what can be done differently is to study how work is normally done, including the actions and habits that have become commonplace (Steven Shorrock).  I particularly like the way in which staff went to observe work in other operating theatres, and in another country, to see what normal practice was elsewhere.  We all know the effect observation can have on behaviour, but they were clever in stating that they were observing ‘general theatre practice’.

What I like most about this is the way it demonstrates variation and that this incident was not the problem of one person, one team or even one organisation.

What’s missing:

The two things missing for me:

  1. The different slant that looking at all of this through a ‘Safety-II’ lens might bring.  The report goes some way towards explaining the everyday: what normally happens in the organisation under review, but also what happens elsewhere.  It alludes to what works well elsewhere and what could be replicated, but this could be a little stronger for me.
  2. The cultural issues – was it an environment where people could speak up?  I appreciate that the fact the team were ‘like a family’, and its potential inhibitory effect, was brought up, but what about issues of hierarchy, gender or status that may have inhibited people from challenging (even if these were not among the factors that led up to the error)?  Even if there wasn’t a problem, it might have been worth discussing more.

What to do with this report:

  • Read it, and then read it again – it has so many great learning points throughout
  • Give it to your staff to read and use the report as a tool in itself to start the conversation for briefings and debriefings in theatres (it is applicable elsewhere too)
  • Use it to show the importance of accessing human factors expertise where you can – not only in understanding how things work, how things could be safer and what you could do differently, but also in your investigations
  • The way the report is written is a lovely ‘good practice’ example for all local investigators to learn from


Norman, D. (2013) The Design of Everyday Things. London: The MIT Press