I have just attended the 2015 Behavioural Insights conference in London. Just to be clear, I am new to this stuff: I am not a social psychologist, a health economist or an expert in behavioural insight. In fact, up until a few weeks ago I didn’t know that there was a Behavioural Insights Team (referred to as *BIT* by all), who have been around for the last five years and describe themselves as ‘a social purpose company which is partly owned by the Cabinet Office’. BIT have just completed an update report for 2013–2015 which explains what they do and what they have done over the last two years. This and other publications can be found at: http://www.behaviouralinsights.co.uk/publications
Once I have had a chance to read the report I will share some snippets via this blog.
In the meantime I thought I would share what I (with all my own personal bias and lack of understanding) learnt and how I think it could relate to patient safety. Part 1 describes my initial thoughts after the first morning.
Validation: so it turns out that we are not alone in the patient safety world with our implementation struggles – there was a lot of agreement that ‘turning theory into practice is difficult’.
Key themes: make it easy and provide the evidence (to show whether or not the change makes a difference). Also, don’t preach, don’t force yourself on people, be pragmatic, and foster a culture of testing before simply adopting.
- Clearly we must do the same with safer practices
Replication: this appeared to be a preoccupation for many – any findings, no matter how brilliant they appear to be, must then be tested by replicating the study elsewhere – but make sure you involve those who did the original work in order to be truly collaborative.
- This reminded me of our ongoing mantra related to the implementation of the surgical checklist and the view that it is more successful if ‘tested’ locally, adapted for the specific set of theatres and teams and only then embedded into practice when people feel it fits the ‘way they do things where they are’
Ethics: another preoccupation – is it ethical to *nudge* people to behave differently? When is it ethical, and when is it not?
- Again – when we introduce a ‘safer practice’ or focus on one harm at a time, we should also consider the ethics of working on one thing rather than another, and whether the safer practice may solve a problem for some but not all – are we behaving ethically?
Defaults: something ‘nudgers’ (is that a term?) talk about a lot… apparently we are influenced by default settings, or are simply too lazy to change them, and this can mean we are ‘nudged’ to behave in a particular way. Think about when you set up your new smartphone and are given a variety of options to accept or not – who doesn’t assume that the default settings must be the optimal ones for that particular phone? Or it simply feels too daunting to change them.
- So when we think about complex technical equipment in healthcare are the default settings supporting safety, minimising risk, preventing harm?
More to follow….