Volume 2, No. 3 • Fall 1987

From the Literature: Accident Analysis Points Way to Greater Safety

M. Maxwell; A. DeAnda; David M. Gaba, M.D.

Editor’s note: In the APSF Newsletter, pertinent publications from the anesthesia patient safety literature will be summarized. Suggestions for future issues are welcome. For this very important, special, and complex paper, the first author was asked to provide a summary.

Gaba DM, Maxwell M, DeAnda A: Anesthetic mishaps: Breaking the chain of accident evolution. Anesthesiology 66:670-676, 1987.

This paper presents a new framework for looking at anesthetic mishaps. The approach comes from work on other high-risk industries published by the organizational theorist Dr. Charles Perrow (Normal Accidents, New York: Basic Books, Inc., 1984).

Accidents continue to occur in high-risk industries in spite of multiple technological and organizational fail-safe systems. Such mishaps are called “system accidents” in that they involve unanticipated interactions or multiple failures throughout the “system.” This concept appears to describe many anesthetic mishaps where a combination of circumstances leads to disaster.

Complexity and Coupling

The elements that make a system accident more likely are complexity of interactions and tight coupling between components or sub-systems. Complexity arises in anesthesia because of uncertainty about physiologic and disease processes in individual patients, and because of the limited means available to assess and monitor these processes. Tight coupling arises when a simple failure has immediate serious effects (e.g., drug swap), or when it directly induces failures in other systems (e.g., pneumothorax leading to hypoxia and cardiovascular compromise).

The failures, errors, and abnormalities which routinely occur during anesthesia may be called simple incidents, and they can arise from the patient’s diseases, from the surgery, or from the anesthetic or the anesthetist. The anesthetized, paralyzed, and mechanically ventilated patient is a tightly coupled system involving complex interactions of human physiology and anesthetic equipment. In this setting a simple incident can easily progress to a critical incident, previously described by Cooper et al. as one which could lead directly to an adverse patient outcome. If the critical incident is not detected and corrected, a substantive negative outcome is likely, thus yielding an accident.

Recovery from Incidents

Since simple incidents are common, patient safety often depends on the process of recovery from the incident. Spectacular failures in several high-risk industries remind us that successful intervention in the chain of accident evolution cannot be guaranteed. The recovery process involves complex problem solving skills which are just beginning to be investigated (see figure).

In the past, anesthetic mishaps have been largely attributed to a lack of vigilance. But at least one third of the critical incidents studied by Cooper’s group involved other mental errors despite perfectly adequate vigilance, and it is likely that combinations of such human failures are at the heart of most mishaps. Vigilance is thus a necessary but not sufficient condition for averting accidents. This has some bearing on recent trends in anesthetic practice, since it is not proven that detection of an incident by new monitors or “vigilance aids” will necessarily lead to successful recovery from that incident. The contribution of monitoring to patient safety depends on the ability of anesthesiologists to use the data correctly in a setting likely to be full of artifact or contradictory information.

Experience in other industries suggests that available data is not always used wisely. A critical factor is the operator’s mental map of the situation. Instrument readings are accepted based on their conformance to the current map. Abnormal readings may be rejected, even when true, due to mistrust of instruments that are prone to failure or artifacts, as was the case at Three Mile Island and in several commercial aviation accidents. Monitors or alarms may be purposefully turned off for the same reasons.

Pressure to Produce

An additional key factor present in most industrial accidents is the pressure to produce. The temptation in anesthesia to “cut corners” is great, and potential hazards may be ignored because of complacency induced by the usual safety of “routine” cases. Other industries have established formal procedures governing decision-making under pressure. NASA’s set of Flight Readiness Reviews, Launch Commit Criteria, and Launch Constraints should have prevented the Challenger disaster; it was the disregard of these established procedures that led to catastrophe. Anesthesiologists also usually do a pre-operative readiness review of patients before anesthesia, but firm, objective guidelines for when to proceed or to cancel or abort surgical procedures do not exist, and great pressure may be exerted to “bend the rules” to get the cases done. We suggest that a local consensus be sought among anesthesiologists, surgeons, internists, and pediatricians in each institution to establish or strengthen these ground rules.

False Security

Furthermore, the temptation to use “safety technology” as a means to speed or expand production must be avoided. A false sense of technological security has led to near misses in aviation, the launch of the Challenger in unprecedentedly cold weather, and the Chernobyl disaster. New instruments for anesthesia should be used first and foremost to enhance safety, and only very cautiously to allow techniques or procedures that might previously have been considered too hazardous.

There is no easy technological fix for maintaining or enhancing anesthesia patient safety. Safety involves much more than the vigilant application of modern monitoring equipment. Aviation was made acceptably safe only by a system that provided intensive research on fundamental aerodynamics and aircraft design, centralized air-traffic control, and a large regulatory bureaucracy. There are numerous elements in the “larger system” of anesthesia which may paradoxically provide negative incentives for safety, or even positive incentives for clearly unsafe actions. Perrow terms this an “error-inducing system.” The incentives and constraints of this larger system determine to what extent anesthesia patient safety can be improved.

In summary, anesthesia poses a risk to all patients. Though research and technology may reduce the uncertainties of administering anesthesia, incidents and errors are inevitable, and attention should focus as much on recovery from errors as on preventing them. The following are therefore recommended:

1) Improving the detection of simple incidents: newer monitors can detect life-threatening failures at an earlier stage, and the actual contribution of these devices should be clearly established. Alarms and displays can be improved, although the optimum design is not yet clear.

2) Helping anesthesiologists develop effective problem solving skills based on good mental maps of the case in progress. Simulators should be developed to enhance training in the handling of simple and critical incidents.

3) Defining the necessary back-up equipment and procedures for common mishaps and common surgical situations.

4) Cataloging and disseminating effective protocols for handling rapidly evolving incidents, as has already been done for malignant hyperthermia. Such recovery processes should be practiced using simulators.

Go/No-Go Decisions

5) Easing production pressures to allow increased attention to pre-operative checks, readiness review of patients, and proceed-versus-cancel decisions. Consensus among providers should be sought to avoid misunderstandings, which can only detract from patient safety.

Anesthesiologists are the most important link in the chain of safe anesthetic care. We have a responsibility to scrutinize our own abilities and limitations as carefully as we investigate those of our drugs and tools. We must define and implement the procedures and training which can be shown to optimize patient safety. Furthermore, we must steer the interacting sources of incentive and constraint towards a system that consistently promotes patient safety. The path to this goal has yet to be defined, but the public, the courts, and the regulatory and accrediting agencies expect no less. Our profession must take the initiative to see that it is so.

Abstracted by David M. Gaba, M.D., Assistant Professor of Anesthesia, Stanford University School of Medicine.