Description #
Discussion on human error and the different attitudes towards error management.
Learning Objectives #
At the end of the unit, students should have a basic understanding of the nature of human error and the different approaches used in error management. Students should be able to discuss the differences between the ‘person-based’ and ‘system-based’ approaches to error management.
The Nature of Human Error Lecture #
The Nature of Human Error #
There have been many studies on human error and many definitions of it. While these studies are important for gaining a deeper understanding of human error, they are in a sense purely academic exercises; for our purposes the real issue is patient safety. That said, a useful definition of error comes from Dr. James Reason, a professor of psychology who has written extensively on the subject; he describes error as circumstances in which planned actions fail to achieve the desired outcome.
Errors can be classified into three broad categories:
– Skill-based error: Error resulting from a lack of proficiency in, or knowledge of, a skill.
– Procedural error: Error resulting from a slip, lapse or mistake in the execution of a procedure or regulation; the intent is correct but the execution is flawed.
– Communication error: Error resulting from incorrect transmission or interpretation of information.
Human error should not be equated with incompetence, lack of motivation, sloppiness or negligence; in healthcare, just as in aviation, highly experienced people have committed serious errors.
In the context of healthcare, it is important to remember that:
– No healthcare provider comes to work intending to do harm to the patient.
– Due to the limitations of the human body and mind, everyone will eventually commit an error.
While this may seem obvious, it has not always been reflected in the way human error has been managed; past practice in aviation was to focus on error causality (causation and commission), i.e. to blame the person(s) who committed the error (the person-based approach). In other words, human error was someone’s fault:
‘…He didn’t…’
‘…She failed to…’
‘…They could have…’
‘…If only they had…’
‘…They should have…’
This simplistic, sometimes punitive approach can create a workplace climate where errors are covered up and opportunities for improvement are lost. A survey conducted by the University of Texas involving physicians and nurses reported that over half of the respondents found it difficult to discuss errors. Some of the barriers to discussion were listed as:
– Personal reputation
– Threat of malpractice lawsuits
– High expectations of the patient’s family or society
– Disciplinary action by licensing boards
– Threat to job security
– Expectations or egos of other team members
It was much the same in aviation: errors were equated with a lack of competency, and no pilot wants to be labeled as incompetent. Today the aviation industry uses a ‘system-based’ approach, with the following concepts playing a much larger role when examining human error:
– Human performance, like that of technological systems, is always variable.
– This variability can lead to unintended consequences, which are then called ‘errors’.
– Many factors contribute to an incident/accident.
– Increasing safety requires involvement at all levels of an organization.
The ‘Swiss Cheese Model’ #
Fig. 1
The Swiss Cheese Model, developed by Professor Reason, shows the role played by different levels of an organization or system in preventing accidents. Each layer of an organization/system has the ability to deflect threats, but each also has ‘holes’ that allow threats to pass. A threat can be defined as an event or error that occurs beyond the influence of the front-line healthcare worker. Threats can be anticipated (e.g. known risks during a routine procedure), unanticipated (e.g. a sudden change in a patient’s condition), or latent; latent threats are usually associated with non-operational factors such as legislation, administration and ergonomics.
Fig. 2
Active failures = at the ‘front line’ (e.g. direct contact with patients).
Latent conditions = ‘behind the scenes’ (indirect contact with patients), e.g. administration, facilities, ergonomics.
Healthcare workers who have direct contact with patients are the last line of defence when it comes to preventing, trapping and/or mitigating threats and errors. Regardless of the type of threat or error, patient safety depends on whether or not it can be detected and corrected before it results in an unsafe condition or outcome.
A better understanding of how factors such as fatigue, stress and workload influence situational awareness, decision-making, problem solving and teamwork is an important element of threat and error management. Ideally, training should recognize the inevitability of error and concentrate on the management of threats and errors rather than on error causality.