‘Errare humanum est’
Adapted by Alexander Pope[1] in 1711 into the famous quote: ‘To err is human, to forgive divine.’
Case: An MVA victim (later retrieved by GSA-HEMS) comes into a rural trauma unit with a head injury and facial fractures. They need intubating. There are three other victims from the same MVA on their way in. One of our specialists happens to be working an ED shift with an experienced GP anaesthetist doing a locum shift. The intubation is delegated to the GP anaesthetist so that the ED consultant is free to manage the other three victims on their arrival. The GP mistakenly intubates the oesophagus. There is no CO2 trace and no chest movement. On questioning, the GP anaesthetist suggests there is a problem with the capnography module on the monitor and asks for a colourimetric CO2 detector. After this device also reveals no EtCO2, the ED consultant removes the tube and correctly intubates the trachea. The patient had been appropriately pre-oxygenated and did not desaturate during the entire intubation episode.
Challenge: To acknowledge that we will all suffer from human error, and to think about ways to stop that error leading to patient harm.
Learning points: High-technology systems have many defensive layers to prevent adverse events: some are engineered (alarms, automatic shutdowns), some rely on people (medics, pilots) and others depend on procedures and administrative controls[2].
In medicine their function is to protect patients from becoming victims. Nearly all adverse events involve a combination of active failures and latent conditions.
Active failures are like mosquitoes: they can be swatted one by one, but they will keep coming. The best remedies are to create more effective defences and to drain the swamps in which they breed. The swamps, in this case, are the latent conditions. Understanding this concept leads to proactive rather than reactive risk management.
Human error can be viewed in two ways: the person approach and the system approach. The basic premise of the ‘system approach’ model of human error is that humans are fallible and errors are to be expected, even in the best organisations. Countermeasures are based on the assumption that though we cannot change the human condition, we can change the conditions under which humans work. The important issue is not who blundered, but how and why the defences failed[2].
The ‘person approach’ to error focuses on the unsafe acts – the errors – of people at the sharp end. People exhibit three error types:
Mistakes (planning stage) occur when the steps in the plan are adhered to but the plan itself is wrong. Our case – the GP anaesthetist performing the intubation without possessing the proper skills.
Lapses (storage stage) are associated with our memories: someone fails to do something because of a lapse in memory or attention, e.g. skipping a step on a checklist. Our case – the GP anaesthetist forgetting to place the suction catheter under the pillow and so being unable to clear the blood that is blocking their view of the airway.
Slips (execution stage) are generally observable actions that are not in accordance with the plan, e.g. a mis-keyed command; they are most often associated with the execution phase of cognition[3]. Our case – the GP anaesthetist not trusting the CO2 monitors to be correct.
There were many safety measures in place to prevent this patient from coming to harm, e.g. well-equipped ED bed spaces, trained staff, protocols, pre-oxygenation, alternative ways to check ventilation, different ways to monitor EtCO2, senior supervision, and questioning and re-checking of steps.
Human reliability specialists now widely uphold the idea that a productive strategy for managing human error should focus on controlling its consequences rather than striving to eliminate the error itself[3]. Effective error management also depends crucially on establishing a reporting culture[4].
Without a detailed analysis of mishaps, incidents and near misses, we have no way of uncovering recurrent error traps. The complete absence of such a reporting culture within the Soviet Union contributed crucially to the Chernobyl disaster[5].
Final thought: Limiting the incidence of dangerous errors will never be wholly effective, but resilient organisations create systems that can tolerate the occurrence of errors and contain any resulting damage. Comprehensive management programmes target the person, the team, the task, the workplace and the institution as a whole[6]. They expect individuals to make errors and train their workforce to recognise and recover from them. Their staff continually rehearse scenarios where there is potential for failure (any of this sound familiar?).
References:
1. Pope, A., Croker, J. W., et al. 1871. The Works of Alexander Pope. London: J. Murray.
2. Reason, J. 2000. “Human error: models and management.” BMJ 320(7237): 768-770.
3. Hollnagel, E. 1993. “The phenotype of erroneous actions.” International Journal of Man-Machine Studies 39(1): 1-32.
4. Reason, J. 1997. Managing the Risks of Organizational Accidents. Aldershot: Ashgate.
5. Medvedev, G. 1991. The Truth About Chernobyl. New York: Basic Books.
6. Reason, J. 1990. Human Error. New York: Cambridge University Press.
“Cuiusvis hominis est errare, nullius nisi insipientis in errore perseverare” (Cicero: ‘Any man can err, but only a fool persists in his error.’)