
Safety in the operating theatre – Part 2: Human error and organisational failure
  1. J Reason
  1. Department of Psychology, University of Manchester, Manchester M13 9PL, UK

    Abstract

    Over the past decade, anaesthetists and human factors specialists have worked together to find ways of minimising the human contribution to anaesthetic mishaps. As in the functionally similar fields of aviation, process control and military operations, it is found that errors are not confined to those at the “sharp end”. In common with other complex and well defended technologies, anaesthetic accidents usually result from the often unforeseeable combination of human and organisational failures in the presence of some weakness or gap in the system’s many barriers and safeguards. Psychological factors such as inattention, distraction and forgetfulness are the last and often the least manageable aspects of the accident sequence. Whereas individual unsafe acts are hard to predict and control, the organisational and contextual factors that give rise to them are present before the occurrence of an incident or accident. As such, they are prime candidates for treatment. Errors at the sharp end are symptomatic of both human fallibility and underlying organisational failings. Fallibility is here to stay. Organisational and local problems, in contrast, are both diagnosable and manageable.

    • human error
    • operating theatre
    • safety


     What do anaesthetists have in common with flight crews, air traffic controllers, nuclear power plant operators, fire chiefs and battle tank commanders? David Gaba,1 an anaesthetist and a pioneer in the human factors of anaesthesia, claims that the practice of anaesthetics has more basic similarities to these activities than to other branches of medicine, excepting perhaps the related fields of intensive and emergency care. These shared characteristics include the following:

    • Uncertain, dynamic environments.

    • Multiple sources of concurrent information (ie, many data streams).

    • Shifting, ill-defined, or competing goals.

    • The need to maintain an up-to-date “mental model” of what is often a rapidly changing situation.

    • Reliance upon indirect or inferred indications.

    • Ill-structured problems.

    • Actions having immediate and multiple consequences.

    • Moments of intense time stress interleaved with long periods of routine activity.

    • Sophisticated technologies with many redundancies.

    • Complex and sometimes confusing human-machine interfaces.

    • High stakes.

    • Multiple players with differing priorities.

    • A working environment highly influenced by group norms and organisational culture.

     One important difference between anaesthetics and these other activities, however, is the consequence of human failure. The release of radioactive material from a nuclear power plant, as at Chernobyl, or the destruction of large commercial aircraft, as at Tenerife, creates an immediate public demand for investigation and remediation. The accidental death of a single patient during surgery, or shortly afterwards, usually attracts little attention beyond the hospital concerned and the immediate family. It is no coincidence, therefore, that until recently human factors research has focused largely upon the high consequence fields of aviation, nuclear power generation and military operations. These were the domains with the resources and the political muscle to fund the work.

    Over the past decade there has been a growing awareness on the part of both anaesthetists and psychologists that what has been learned from these high profile, non-medical accident investigations and their associated research applies remarkably well to the study and prevention of anaesthetic mishaps. This paper takes these domain similarities as its starting point, and considers how knowledge of the human contribution to system breakdown, acquired in these other fields, can be usefully applied to anaesthetics.

    THE NATURE AND LIKELIHOOD OF ANAESTHETIC MISHAPS

     Surveys give somewhat differing values for the involvement of human error in anaesthetic incidents and accidents, but there is growing agreement that the figure lies between 70% and 80%.2,3 Data from the Australian Incident Monitoring Study,3 based upon 2000 anaesthetic incidents, identified the following as the 12 most commonly occurring contributing factors:

    • Misjudgement (16%)

    • Failure to check equipment (13%)

    • Fault of technique (13%)

    • Other human factors problems (13%)

    • Other equipment problem (13%)

    • Inattention (12%)

    • Haste (12%)

    • Inexperience (11%)

    • Communication problem (9%)

    • Inadequate preoperative assessment (7%)

    • Monitor problem (6%)

    • Inadequate preoperative preparation (4%)

    The prominence of equipment-related problems in this list is in keeping with earlier findings that 48% of anaesthetists use new equipment without reading the manual, and 60% do not follow the manufacturer’s check procedure.4 Mayor and Eaton reported that 30–41% of anaesthetists perform no checks at all.5

     As Gaba points out, estimates of the frequency of adverse outcomes related to anaesthetic care are very difficult to obtain.1 The available data suggest that deaths due at least in part to anaesthetic factors are of the order of 1 in 2000 cases, whereas deaths due solely to anaesthesia lie somewhere between 1 in 100 000 and 1 in 200 000 cases.

     Of more significance from a human factors viewpoint is the percentage of surgical cases in which, despite prior planning, some unanticipated problem will arise. Cooper and his co-workers estimated that 18% of cases will involve an unexpected problem requiring intervention by the anaesthetist, and 3–5% of cases will involve a serious unplanned event calling for substantial anaesthetic intervention.6 This rate of problem occurrence is much higher than pilots, for example, would expect, and indicates that one of the key features of an anaesthetist’s skill is knowing when and how to intervene to thwart an accident sequence. Of the three phases of surgical anaesthesia—induction, maintenance and emergence—45% of incidents occur during the maintenance phase.7 This suggests that patient monitoring problems, along with very high workload in the event of an emergency, can make excessively high demands upon the limited attentional resources of the anaesthetist.

    CLASSIFYING HUMAN FAILURES

    There is no one error taxonomy. Different error classifications serve different needs. In many domains of application, two kinds of categorisation are used together: a classification by consequences, and a classification by psychological origins. In the case of anaesthetics, a consequential classification would identify which aspect of the anaesthetist’s performance was less than adequate (e.g. not checking equipment, wrong intubation, missing critical signs, inappropriate dosage, misinterpreting rapidly changing physiological parameters, failing to recognise complications associated with congestive heart disease, carrying out an ill-advised intervention, etc). A psychological classification, on the other hand, would focus upon the mental antecedents of the error. It is this latter type of classification that will be considered here. Three distinctions are important.

    (1) Slips and lapses versus mistakes

    There are many ways of defining error.8 For our present purposes, we can say that an error is the failure of planned actions to achieve their desired goal. There are basically two ways in which this failure can occur:

    • The plan is adequate but the associated actions do not go as intended. These are failures of execution and are commonly termed slips and lapses. Slips relate to observable actions and are associated with attentional failures. Lapses are more internal events and relate to failures of memory.

    • The actions may go entirely as planned but the plan is inadequate to achieve its intended outcome. These are failures of intention, termed mistakes.

    All errors involve some kind of deviation. In the case of slips, lapses, and fumbles, actions deviate from the current intention. Here, the failure occurs at the level of execution. In the case of mistakes, the actions may go entirely as planned, but the plan itself deviates from some adequate path towards its intended goal. Here, the failure lies at a higher level: with the mental processes involved in planning, formulating intentions, judging and problem solving.

    Slips and lapses occur during the largely automatic performance of some routine task, usually in familiar surroundings. They are almost invariably associated with some form of attentional capture, either distraction from the immediate surroundings or preoccupation. They are also provoked by change, either in the current plan of action or in the immediate surroundings.9

    Mistakes can begin to occur once a problem has been detected. A problem is anything that requires a change or alteration of the plan. Mistakes can be further subdivided into two categories: rule-based mistakes and knowledge-based mistakes.

    Rule-based mistakes

    These occur in relation to familiar or trained-for problems. A large part of the anaesthetist’s expertise is made up of rules of thumb or heuristics of the kind: if X (local signs of a problem exist) then it is probably Y (a particular condition to be managed), or if X (local signs) then do Y (a particular intervention). Human beings are furious pattern matchers. We are extremely good at making rapid and largely automatic assessments of complex situations based upon matching features of the world to patterns stored in long term memory. But this process can go wrong in two ways. We can misapply a good rule (i.e. one that is frequently applicable) because we fail to notice the contraindications. Or we can apply a bad rule that has remained uncorrected in our stored repertoire of problem solutions.

    Knowledge-based mistakes

    These occur when the practitioner encounters a novel situation that lies outside the range of his or her stock of pre-packaged problem solving routines. Under these conditions, practitioners are forced to resort to slow, effortful, on-line reasoning. This process is extremely error prone for several reasons. First, our capacity for conscious thought is highly resource limited; we can only attend to and manipulate one or two discrete items at any one time. Second, we have to rely upon a mental model of the current situation that is nearly always incomplete and, in parts, incorrect. Third, we have a marked tendency in these circumstances to “fixate” upon a particular hunch or hypothesis and then select features of the world to support it, while neglecting contradictory evidence. This has been called “confirmation bias” or “cognitive lock-up” and has been frequently observed in nuclear power plant operators and others during attempts to recover from an emergency.10

    (2) Errors versus violations

    Violations are deviations from safe operating practices, procedures, standards or rules. Such deviations can either be deliberate or erroneous (e.g. speeding without being aware of either the speed or the restriction). However, we are mostly interested in deliberate violations, where the actions (though not the possible bad consequences) were intended. Deliberate violations differ from errors in a number of important ways.

    • Whereas errors arise primarily from informational problems (forgetting, inattention, incomplete knowledge, etc), violations are more generally associated with motivational problems (low morale, poor supervisory examples, perceived lack of concern, the failure to reward compliance and sanction non-compliance, etc).

    • Errors can be explained by what goes on in the mind of an individual, but violations occur in a regulated social context.

    • Errors can be reduced by improving the quality and delivery of the necessary information within the workplace. Violations generally require motivational and organisational remedies.

    (3) Active versus latent failures

     The distinction between active and latent failures owes a great deal to Mr Justice Sheen’s observations regarding the capsize of the Herald of Free Enterprise. In his inquiry report, he wrote:11 “At first sight the faults which led to this disaster were the … errors of omission on the part of the Master, the Chief Officer and the assistant bosun … But a full investigation into the circumstances of the disaster leads inexorably to the conclusion that the underlying or cardinal faults lay higher up in the Company … From top to bottom the body corporate was infected with the disease of sloppiness.”

    Here, the distinction between active and latent failures is made very clear. The active failures—the immediate causes of the capsize—were various errors on the part of the ship’s officers and crew. But, as the inquiry revealed, the Herald was a “sick” ship even before it sailed from Zeebrugge on 6 March 1987.

    Active failures are unsafe acts (errors and violations) committed by those at the “sharp end” of the system (e.g. anaesthetists, surgeons, nurses). They are the people whose actions can have immediate adverse consequences.

    Latent failures are created as the result of decisions taken at the higher echelons of the organisation. Their damaging consequences may lie dormant for a long time, only becoming evident when they combine with active failures and local triggering factors to breach the system’s many defences.

    Thus, the distinction between active and latent failures rests upon two considerations: first, the length of time before the failures have a bad outcome and second, where in the organisation the failures occur. Generally, active failures are committed by those in direct contact with the patient and latent failures occur within the organisational and management spheres.
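
     These three distinctions can be combined when classifying the contributing factors in an incident report. The sketch below is purely illustrative and assumes a hypothetical Python encoding of the taxonomy; the enumeration and field names simply recast the terms above as code and are not part of any published reporting scheme.

        # Illustrative only: a hypothetical encoding of the three distinctions
        # for an incident-reporting database. Names are assumptions, not a
        # published classification instrument.
        from dataclasses import dataclass
        from enum import Enum, auto

        class FailureType(Enum):              # slips and lapses versus mistakes
            SLIP = auto()                     # execution failure; observable action; attention
            LAPSE = auto()                    # execution failure; internal; memory
            RULE_BASED_MISTAKE = auto()       # inadequate plan; good rule misapplied or bad rule applied
            KNOWLEDGE_BASED_MISTAKE = auto()  # inadequate plan; novel problem, effortful on-line reasoning

        class IntentionalStatus(Enum):        # errors versus violations
            ERROR = auto()                    # unintended deviation; informational problem
            VIOLATION = auto()                # deliberate deviation from safe practice

        class SystemLevel(Enum):              # active versus latent failures
            ACTIVE = auto()                   # unsafe act at the sharp end; immediate effect
            LATENT = auto()                   # organisational decision; dormant until triggered

        @dataclass
        class ContributingFactor:
            description: str
            failure_type: FailureType
            status: IntentionalStatus
            level: SystemLevel

        # Example: a missed equipment check recorded as an active, unintended lapse.
        missed_check = ContributingFactor(
            description="Anaesthetic machine not checked before induction",
            failure_type=FailureType.LAPSE,
            status=IntentionalStatus.ERROR,
            level=SystemLevel.ACTIVE,
        )

     In practice such a record would also carry the consequential classification (which aspect of performance was less than adequate), since the two types of classification answer different questions.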

    SOME PROBLEMS WITH THE INTERPRETATION OF HUMAN ERROR DATA

    The statement that around 80% of anaesthetic incidents involve some type of human failure is potentially misleading. Taken at its face value, it suggests that anaesthetists have a major human error problem, though similar values are found in most other domains as well.12 A natural reaction to these high error numbers, both in anaesthesia and elsewhere, is to direct the majority of remedial measures towards the “sharp end” (i.e. those in direct contact with the patient): to “blame and train”, to write additional procedures and to engineer people more and more out of the loop with further automation and “intelligent” displays. But these person-specific measures have little impact on well trained, experienced and highly motivated professionals. Indeed, they can sow the seeds of future mishaps.13

    There are a number of reasons why people at the “sharp end” get blamed for bad outcomes.13,14

    First, they are obvious targets: their actions were usually those closest in time and space to the bad outcome.

     Second, human beings are prone to the illusion of free will. People, especially in Western cultures, place great value on the belief that they are the makers of their own fates. Naturally, they also attribute this autonomy to other people, who are likewise seen as being able to choose between right and wrong, between correct and erroneous actions. But no one chooses to make an error, nor are all the circumstances influencing performance under a person’s control.

    Third, it is extremely difficult to trace the causes of accidents back to their organisational roots. The natural tendency of most accident investigators is to stop the search as soon as they have identified some less than adequate performance on the part of those on the spot. They are not to be blamed for this, since the tools for carrying out such in-depth analyses are only now being fashioned. Moreover, the legal aspects of such inquiries are usually best satisfied by the identification of “responsible” individuals.

    Fourth, reviewers of past events are subject to hindsight bias.15 Knowledge of the outcome causes us to simplify the problems facing the practitioner, who was armed only with foresight. Mistakes are apparently easy to spot in retrospect but extremely difficult to detect at the time. In simulated nuclear power plant emergencies, mistakes were rarely spotted by their perpetrators. It usually takes someone else, with a fresh view of the situation, to detect a deviation from some adequate path.

    In a recent study, two groups of anaesthetists were asked to judge the quality of care described in a set of written cases.16 The description of the events was the same for both groups, but the outcomes seen by one group were bad while the outcomes for the other group were neutral. The judges consistently rated the performance in the bad outcome cases as substandard, whereas the identical care provided in the neutral outcome cases was assessed as being adequate. As Cook and Woods13 point out, “the judgement of whether or not a human error occurred is critically dependent on knowledge of the outcome, something that is impossible before the fact”.

    In 1620 Sir Francis Bacon observed that: “… the human mind is prone to suppose the existence of more order and regularity in the world than it finds”.17 One of the many ways of simplifying complex events is to assume a symmetry of magnitude between causes and consequences. When confronted with horrific man-made catastrophes like Bhopal and Chernobyl or the accidental death of a young healthy patient during minor surgery, it seems natural to look for some equally monstrous act of irresponsibility or incompetence as the primary cause. What we usually find, however, is the chance and largely unforeseeable concatenation of many different causal factors, none of them sufficient or even especially remarkable by themselves, but each necessary to bring about the outcome.

    As we shall see below, errors are not so much causes as consequences.13,21 The contributing errors, just as much as their bad outcomes, require an explanation. Errors are the product of a chain of causes in which the individual psychological factors (momentary inattention, forgetting, haste, etc) are the last and often the least manageable link.

    MODELLING THE AETIOLOGY OF ORGANISATIONAL ACCIDENTS

    The thesis to be presented in the remainder of this paper is that anaesthetic accidents, in common with accidents in other low-risk, high-hazard systems, are usually organisational accidents, i.e. multiple cause events whose origins can be traced to decisions taken some time before the accident. The Australian Incident Monitoring Study18 found that system-based or organisational factors were implicated in 90% of the incidents (or 97% if human factors are included).

     The technological advances of the last 20 years, particularly in regard to engineered safety features, have made many hazardous systems largely proof against single failures, either human or technical. Breaching all of their defences now requires the unlikely combination of several contributing factors.

    The aetiology of an organisational accident is shown in fig 1. A case study illustrating the ways in which these various organisational and human factors combine to create an anaesthetic accident is described in detail elsewhere.19 The direction of causality is from left to right.

     Figure 1  Stages in the development of an organisational accident.

    • The accident sequence begins with the negative consequences of organisational processes (i.e. decisions concerned with planning, scheduling, forecasting, designing, specifying, communicating, regulating, maintaining, etc).

    • The latent failures so created are transmitted along various organisational and departmental pathways to the workplace (e.g. the operating theatre or intensive care unit) where they create the local conditions (e.g. undermanning, fatigue, technical problems, high work load, poor communication, conflicting goals, inexperience, low morale, teamwork deficiencies, etc) that promote the commission of errors and violations.

    • Many of these unsafe acts are likely to be committed, but only very few of them will penetrate the defences to produce damaging consequences for a patient. The fact that engineered safety features, standards, controls, procedures and the like can be deficient due to latent as well as active failures is shown by the arrow connecting organisational processes directly to defences.
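
     The defences-in-depth logic behind this model can be illustrated with a toy calculation. The sketch below is not the model of fig 1; it simply assumes a handful of independent defensive layers, each with an invented probability of failing on a given case, and shows that while single-layer failures are common, a complete breach is rare but not impossible.

        # Toy illustration of defences in depth: an adverse outcome requires every
        # layer to fail on the same case. Layer names and probabilities are
        # invented for illustration and carry no clinical meaning.
        import math
        import random

        random.seed(1)

        layer_failure_prob = {
            "pre-operative check missed": 0.05,
            "equipment alarm fails": 0.02,
            "problem not detected by anaesthetist": 0.01,
            "team cross-check fails": 0.03,
        }

        # Assuming independence, a full breach needs every layer to fail at once.
        p_full_breach = math.prod(layer_failure_prob.values())
        print(f"Worst single-layer failure rate per case: {max(layer_failure_prob.values()):.0%}")
        print(f"Probability of all layers failing on one case: {p_full_breach:.1e}")

        # Monte Carlo check over a large number of simulated cases.
        cases = 5_000_000
        breaches = sum(
            all(random.random() < p for p in layer_failure_prob.values())
            for _ in range(cases)
        )
        print(f"Full breaches in {cases:,} simulated cases: {breaches}")

     In reality the contributing factors are rarely independent: latent failures can weaken several layers at once, which is why fig 1 shows an arrow running directly from organisational processes to the defences.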

     The model presents the people at the sharp end (the anaesthetists, surgeons and nurses) as the inheritors rather than the instigators of an accident sequence. This may make it seem as if the “blame” for accidents has simply been shifted from the sharp end to the system managers, but this is not the case, for the following reasons.

    • The attribution of blame, though often emotionally satisfying, hardly ever translates into effective countermeasures. Blame implies delinquency. Delinquency is normally dealt with by exhortations and sanctions. But these are wholly inappropriate if the individuals concerned did not choose to err in the first place.

    • High-level decisions are shaped by economic, political and financial constraints. Like designs, decisions are always a compromise. It is thus taken as axiomatic that all strategic decisions will carry some negative safety consequences for some part of the system. This is not to say that all such decisions are flawed, though some of them will be. But even those decisions judged at the time as being good ones will carry a potential downside for someone, somewhere in the system. Resources, for example, are rarely allocated evenly. There are nearly always losers. In judging uncertain futures, it is inevitable that some of the shots will be called wrongly. We cannot prevent the creation of latent failures, we can only make their adverse consequences visible before they combine with local triggers to breach the system’s defences.

    These organisational root causes are further complicated by the fact that the medical system as a whole involves many interdependent organisations: legislators, manufacturers, maintainers, administrators, medical defence and standards organisations, professional bodies, civil servants, medical institutions, primary carers, and so on. The model shown in fig 1 relates to a given hospital, but it must be appreciated that the reality is considerably more complex, with influences from other organisations impinging on the sequence at many different points.

    REMEDIAL IMPLICATIONS OF THE MODEL

    Unlike theories in the natural sciences whose value is assessed by the amount of experimental interest they provoke, theories in the safety sciences are judged by the much harsher criterion of practical utility. In what ways can the organisational accident model lead to safer anaesthetic practice?

    The key to effective safety management in any hazardous enterprise is to target what are invariably limited remedial resources at the most tractable problems: in short, to manage the manageable. In most organisations a disproportionate amount of these resources is directed at individual practitioners in an effort to prevent the recurrence of past errors through sanctions, exhortations, stricter procedures, tighter selection, additional training, improved certification, and the like. But these measures are only appropriate if the people who commit the active failures are especially error-prone, inexperienced, undermotivated and ill trained. This is rarely the case, either in anaesthesia or in the fields of aviation, process control and military command. A common feature of all of these domains is that the best people can sometimes make the worst mistakes.

    Central to the accident model presented earlier is the notion that the psychological antecedents of unsafe acts (i.e. what goes on in the head of the practitioner) are—beyond a certain point—extremely difficult to control. Distraction, momentary inattention, forgetting, losing the picture, preoccupation and fixation are entirely natural human reactions to the kind of working environment described at the beginning of this paper. What is remarkable is not that dangerous errors happen, but that they happen so rarely.

    Whereas active failures at the sharp end are unpredictable in their precise details and therefore hard to manage, latent failures existing within the work context and the institution at large are, by definition, present before the occurrence of any incident or mishap. For this reason, and because (in the terms of the model) they are the precursors of unsafe acts, they represent the most suitable cases for treatment. Unsafe acts are like mosquitoes. They can be swatted or sprayed, but they still keep coming. The only effective remedy is to drain the swamps in which they breed. In the case of anaesthetic practice, the nature and location of these swamps is both well known and universal. They include:

    • Teamwork and communication problems.1,18,20

    • Problems with the design, construction, maintenance and standardisation of equipment.21

    • Problems with drugs: labelling, purchase, stock control, delivery to and from storage, etc.22

    • Problems with the assessment and scheduling of patients.18,19

    • Problems with the planning and coordination of anaesthetists and their co-workers.1,19,20

    Measures to combat these problems are being implemented in a number of institutions. Less well understood, however, are the factors that determine an organisation’s general “safety health”.

    In medicine there is no single definitive measure of a person’s health. It is an emergent property inferred from a selection of physiological signs and lifestyle indicators. The same is also true for complex hazardous systems. Assessing an organisation’s current state of “safety health”, as in medicine, involves the regular and judicious sampling of a small subset of a potentially large number of indices.

    At the University of Manchester we have tried a variety of diagnostic approaches in our applied research, carried out mainly in the fields of oil exploration and production, railway operations and aircraft engineering. The individual labels for the assessed dimensions vary from industry to industry, but all of them have been guided by two principles. First, we try to include those organisational “pathogens” that have featured most conspicuously in well documented accidents (hardware defects, incompatible goals, poor operating procedures, undermanning, high workload, inadequate training, etc). Second, we seek to encompass a representative sampling of those core processes common to all technological organisations (designing, building, or specifying equipment, operating, maintaining, managing, communicating, goal setting, etc).

    In all three industries the measurements are summarised as bar graph profiles. Their purpose is to identify the two or three organisational factors most in need of remediation and to track changes over time. Instead of dwelling upon the last accident and trying to find local “fixes” for what was probably a unique occurrence, the attention of safety managers is now directed towards eliminating the worst of the current latent problems.
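
     As a rough illustration of how such profiles support “managing the manageable”, the sketch below takes hypothetical scores for a handful of organisational factors, ranks them, and reports the change since the previous review. The factor names and numbers are invented; this is not the Manchester diagnostic instrument itself.

        # Illustrative sketch: turning periodic assessments of organisational
        # factors into a ranked "safety health" profile. All names and scores
        # are invented for the purpose of the example.
        def worst_factors(profile, n=3):
            """Return the n factors with the highest (least healthy) scores."""
            return sorted(profile.items(), key=lambda item: item[1], reverse=True)[:n]

        # Hypothetical assessments for two review periods (0 = healthy, 10 = severe).
        previous = {
            "hardware defects": 3.0, "incompatible goals": 6.1, "procedures": 6.9,
            "undermanning": 8.0, "training": 4.4, "communication": 6.6,
        }
        current = {
            "hardware defects": 3.1, "incompatible goals": 7.4, "procedures": 5.8,
            "undermanning": 8.2, "training": 4.0, "communication": 6.5,
        }

        for name, score in worst_factors(current):
            print(f"{name:20s} {score:4.1f}  (change since last review: {score - previous[name]:+.1f})")

     The output directs attention to whichever latent factors currently score worst, rather than to the particular circumstances of the last accident.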

    Maintaining adequate safety health is thus comparable to a long term fitness programme in which the focus of remedial efforts switches from dimension to dimension as previously salient factors improve and new ones come into prominence. Like life, effective safety management is “one damn thing after another”.

    Acknowledgments

    As the many references to their work will testify, I owe a special debt of gratitude to Dr David Woods, Dr Richard Cook, Dr David Gaba, Professor Jan Davies and Professor Bill Runciman. They led the way. This paper merely followed.


    Footnotes

     * This is a reprint of a paper published in Current Anaesthesia and Critical Care 1995, Volume 6, pages 121–126.