I have mentioned (many times) the excellent series that the BMJ Quality and Safety journal published, titled:
‘The problem with…’.
The journal published one on incident reporting: ‘The problem with incident reporting’, written by Carl Macrae. It is an outstanding addition to the debate, arguing that reporting and reporting systems are often misunderstood and misapplied, leaving us with confused and contradictory approaches that have seriously limited their potential impact. It still stands the test of time and is worth revisiting what he said. He sums it up for me when he says…
‘we collect too much and do too little’
Carl describes our current approach as a filing cabinet: a cabinet full of past incident reports that would require an army of personnel to do the reports justice and properly examine and analyse them in search of answers. Our focus has for too long been on collecting more rather than collecting better information. He provides a really helpful table that shows the difference between what we could have and what we actually have. For example:
- what we could have – ‘avoid swamping the reporting system to ensure thorough review of all reported incidents’
- what we actually have – ‘celebrate large quantities of incident reports and aim for ever increasing overall reporting rate’
The current definition for reporting is…
‘any unintended or unexpected incident that could have or did lead to harm’
Carl quite rightly points out that we got this wrong; it is far too broad and misses an opportunity to use reporting criteria to ‘shape attention and set priorities’. He then goes on to talk about the current emphasis on more reports and the assumption that ‘higher levels of overall reporting reflects a better safety culture’. He states that this is a blunt measure and that we are left with systems that contain little new information. This philosophy can pressure organisations to increase reporting simply for the sake of the numbers rather than using the reports for learning.
‘repeated reports of the same type of event suggest a strong culture of reporting but a poor culture of learning’
Some of the most worrying things about the current approach to incident reporting are that it is used as a threat, used to highlight operational issues and used as a proxy for measuring safety in an organisation; as Carl points out, this is a particularly poor way of measuring safety performance. Incident reporting systems have never captured all the things that go wrong on a day-to-day basis; they are biased towards what is easy to report and by the different attitudes that professions have towards reporting. They also capture all sorts of administrative issues that are not safety related and often record concerns that individuals have about how the organisation is run rather than about the safety of clinical care. While these may have an impact on safety, they end up drowning out the important information, truly hiding the needle in the haystack.

When these biases lead to the reporting of particular types of incidents but not others, there is a knock-on effect on prioritisation: action and activities may be directed at issues that are less important to address than some that have only a handful of reports to their name. At a national level, the numbers and types of incidents reported can then be used to shape patient safety policy, create patient safety alerts and inform other national interventions.
In terms of the quality of information, we now know that in any one event there are multiple truths and facts: one person has their version of the truth, the facts and the event, and another has a different version of the same incident. We all know that when telling stories about our own lives we sometimes miss things out or elaborate a fact to make a point. This natural behaviour distorts the truth very early on, so to see incident reports as telling the exact truth is a mistake. As Carl states…
‘early reports are often inaccurate and usually entirely wrong’
Additionally, we know that the data is really hard to analyse, particularly when it comes to free text; especially when, as Imperial College found, terms such as Clostridium difficile can be spelt in so many different ways!
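As a minimal sketch of what that cleaning step might look like (assuming Python, and using a purely invented reference list and variant spellings, not the actual Imperial College data or method), simple fuzzy matching can group spelling variants under one canonical term before any counting or categorising takes place:

```python
import difflib

# Hypothetical illustration only: normalise free-text spelling variants against
# a small reference vocabulary. The reference terms and variants are invented.
REFERENCE_TERMS = ["clostridium difficile", "pressure ulcer", "medication error"]

def normalise_term(raw: str, cutoff: float = 0.75) -> str:
    """Map a free-text term to the closest reference term, if it is close enough."""
    candidate = raw.strip().lower()
    matches = difflib.get_close_matches(candidate, REFERENCE_TERMS, n=1, cutoff=cutoff)
    return matches[0] if matches else candidate  # leave unrecognised terms untouched

for variant in ["Clostridium Difficile", "clostridium dificile", "clostridum difficile"]:
    print(f"{variant!r} -> {normalise_term(variant)!r}")
```

Each of the misspellings above maps back to the single canonical term, which is the kind of tidying that has to happen before free-text reports can be meaningfully compared or counted.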
Here are a number of suggestions:
- Ensure that you have a just and learning culture to support the whole process; otherwise you will only create fear
- Simplify the data collected where possible
- Never use them as a threat
- Be purposeful about what you want to collect and where; if you simply want to count the number of something, a table or spreadsheet will do
- Use incident reports as a trigger for further investigation (or not); do not assume they tell you the whole story
- Provide feedback and create a two-way conversation to draw on the collective intelligence of staff and build the full picture once something has been reported