On Swiss Cheese and Patient Safety

By Bob Wachter | June 12, 2012

Professor James Reason is the intellectual father of the patient safety field. I remember reading his book Managing the Risks of Organizational Accidents in 1999 and having the same feeling that I had when I first donned eyeglasses: I saw my world anew, in sharper focus. Reason’s “Swiss cheese” model, in particular – which holds that most errors in complex organizations are caused not so much by the inevitable human mistakes but rather by the organization’s incomplete layers of protection, which allow the errors to pass through on their way to causing terrible harm – was an epiphany. It is the fundamental mental model for patient safety, as central to our field as the double helix is to genetics.

Last month, I returned to England to give a couple of talks, one at a conference called “Risky Business,” the other at the UK’s National Patient Safety Congress. The former brings together many of the leading thinkers in a variety of risk-heavy fields, including aviation, nuclear power, space travel, the financial system… and healthcare. At the latter, I was asked to give the 2012 James Reason Lecture, a singular honor.

Attending both conferences, I spent a lot of the week thinking about Swiss cheese. Two particular case studies I heard stand out – one in which a series of human and technological errors combined to kill 26 men in a friendly fire accident, the other in which highly skilled pilots managed to safely land a mammoth aircraft after one of its engines exploded. Both were instructive, in different ways.

At the Risky Business conference, Scott Snook described a tragically famous friendly fire accident: the 1994 incident in which two American F-15 fighters shot down two U.S. Army UH-60 Blackhawk helicopters, killing all 26 men aboard, on a crystal-clear day in the “no fly zone” in Northern Iraq. Snook is a blunt, funny, tough-looking, head-shaven, fast-talking New Jersey native. It’s not surprising to learn that he spent 22 years in the US Army, rising to the rank of Colonel before retiring in 2002. More surprising are his Harvard MBA, his Harvard PhD in Organizational Behavior and his present job: Senior Lecturer at Harvard Business School.

The test of a conceptual framework is how often it holds true, and in every large-scale accident I’ve studied (Chernobyl, BP, 9/11, Tenerife, Bhopal, and innumerable medical errors), the Swiss cheese model fits. Such errors virtually always involve competent, caring people in hard jobs, trying to do their best with imperfect data and under various pressures, felled by glitchy pieces of technology, poor communication, and really bad karma. And there’s nearly always a cultural mindset that contributes to the whole mess, one that had existed for years and been tolerated because, well, “that’s just how things work around here.”

In the friendly fire case (I’ll make a long story relatively short; Snook wrote a whole book about the incident), two Air Force F-15s were flying “lazy eights” in the sky above Northern Iraq, as they’d been doing each day for a couple of years without excitement. Both planes carried only a single pilot and an awful lot of weaponry. There had been reports of Iraqi saber rattling in the preceding weeks, so the pilots were a bit on edge.

That morning, they had received their pre-flight briefing; it included a printout of all the coalition planes expected in the zone that day. Helicopters, particularly those from another branch of the Armed Services, were often omitted from the list, and the two Blackhawks that day were no exception.

The Blackhawks were tasked to shuttle some Kurdish civilians and service personnel from the US, Britain, Turkey, and France to a tribal meeting in the town of Zakhu, in the hills of Northern Iraq, inside the no-fly zone. They took off from an Army base in Turkey, and, as always, flew low and fast; hugging the mountainous terrain was their best protection against being spotted by enemy radar.

Flying at 32,000 feet, high above both the F-15s and the Blackhawks, was an AWACS surveillance plane, a specially-outfitted Boeing 707 that directs traffic over an entire region. One of the computer screens on the AWACS malfunctioned that day, so two people who usually sat next to each other (one directing traffic coming into a given zone, the other the outgoing traffic) were placed a few rows away, where they could no longer tap on each other’s shoulders to discuss a confusing finding. The AWACS crewman on the Incoming monitor spotted the Blackhawks on his radar, but then they disappeared from the screen, probably because the helicopters had gone behind a mountain. He placed an electronic arrow on his screen to remind him of the choppers’ prior position, but such arrows were designed to disappear after a few minutes to avoid screen clutter.

The F-15s had four ways of guarding against friendly fire incidents. The first was the list of expected flights in their zone, strapped to the pilot’s thigh. Because that list omitted some flights – in particular, the Blackhawks – one slice of Swiss cheese was breached even before the fighter planes took off. The second protection was a technology known as IFF (“Identification Friend or Foe”). Every US or allied plane has a transponder that emits a signal telling others that it is on the same side. The F-15s’ rules of engagement required them to “paint” a potential target with their IFF probe; a friendly plane returns an electronic signal that says, “I’m on your team.”

But on April 14, 1994, it too proved to be more hole than cheese. The Blackhawk helicopters had their code set for the Turkey frequency when they took off, and neglected to switch to the Iraq frequency when they entered Kurdistan (the later investigation revealed that this happened frequently). This meant that when the F-15s pointed their IFF at the Blackhawks, the response back was “Foe.” It didn’t dawn on the F-15 pilots to try the other frequency for Turkey, which would have entailed turning their dial by one click. Layer Two was breached.
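In code terms, the Layer Two failure was a simple key mismatch. Here is a minimal sketch – with made-up codes and a deliberately simplified model of IFF, which in reality involves multiple modes and settings – of how a transponder left on the wrong zone setting answers “Foe” to a friendly interrogation:

```python
# Minimal sketch of the "Layer Two" IFF failure, using hypothetical codes.
# Real IFF involves several modes and settings; this models only the mismatch.

TURKEY_CODE = 42  # hypothetical code for the Turkey sector
IRAQ_CODE = 52    # hypothetical code for the no-fly zone over Northern Iraq

def interrogate(target_code: int, query_code: int) -> str:
    """Report 'Friend' only when the target's transponder matches the query."""
    return "Friend" if target_code == query_code else "Foe"

# The Blackhawks took off with their transponders set for Turkey and never
# switched after crossing into Kurdistan...
blackhawk_code = TURKEY_CODE

# ...so the F-15s' interrogation on the Iraq code came back "Foe".
print(interrogate(blackhawk_code, IRAQ_CODE))    # -> Foe

# One click of the dial - re-querying on the Turkey code - would have
# answered "Friend" and closed the hole in this layer.
print(interrogate(blackhawk_code, TURKEY_CODE))  # -> Friend
```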

The F-15 pilots radioed the AWACS to ask if there were any friendly helicopters in the area. By this time, the electronic arrow had disappeared from the Incoming crewman’s monitor; there appeared to be “no friendlies” in the zone. Inexplicably, the crewman who had seen the Blackhawks earlier did not speak up. No one is quite sure why he, and a few other AWACS personnel who had seen the Blackhawks’ signal, stayed quiet during the minute when they could have prevented tragedy with a single word. Perhaps each expected that someone else would speak up. In any case, they failed to call off the dogs. Layer Three.

The final protection was the requirement for a visual identification (“VID”, in service lingo) of a target before attack. The F-15s swooped down to a position 1000 feet above and 500 feet to the side of the Blackhawks. Russian-built Iraqi “Hind” choppers had threatening side-ordnance hanging off the main fuselage; Blackhawks generally did not. But these particular Blackhawks had been outfitted with two side-hanging fuel tanks that, from a few thousand feet, might have looked like missiles. Moreover, despite the pilots’ extensive VID training, IDing a chopper from that distance is like determining a mini-van’s make and model from five football fields away. And one wonders whether these pilots, who had been flying dull surveillance missions for years, had their adrenaline pumping, which led them to see what they wanted to see, a phenomenon known as confirmation bias.

All layers of Swiss cheese having been breached, the F-15 pilots both pulled their triggers. In a chilling bit of Top Gun swagger, as the second missile hit its target, Pilot 2 told Pilot 1, “Stick a fork in them, they’re done.” The pilots returned to base, getting huge atta-boys from their colleagues, and were waiting to be debriefed by their general when they looked up at a TV screen in the general’s waiting room. CNN was reporting that two Blackhawk helicopters were missing in Northern Iraq. One can only imagine how they felt at that moment.

While it would be easy to point any number of fingers, James Reason’s model teaches us that to prevent another friendly fire incident, the most important thing is to identify the holes in the Swiss cheese, shrinking them where possible and creating enough independent layers that the probability of their ever lining up again is as low as possible. (The investigators mostly did that, though they did court-martial one person, AWACS supervisor Capt. Jim Wang.)
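Reason’s prescription also has a simple arithmetic core: if the layers fail independently, the chance that every hole lines up is the product of the individual failure probabilities, so each added or tightened layer shrinks the risk multiplicatively. A minimal sketch, with illustrative probabilities of my own invention:

```python
from math import prod

# Illustrative (made-up) failure probabilities for each defensive layer:
# the pre-flight flight list, the IFF interrogation, the AWACS controllers,
# and the visual identification.
layer_failure_probs = [0.05, 0.02, 0.03, 0.10]

# If the layers fail independently, an accident requires every hole to
# line up at once, so the probabilities multiply.
p_accident = prod(layer_failure_probs)
print(f"P(all layers breached) = {p_accident:.1e}")  # -> 3.0e-06

# Halving any single hole halves the product; adding a fifth independent
# layer multiplies it down again. That is the model's practical prescription.
```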

Professor Reason also teaches us to mine cases in which things go right for their lessons. At the Risky Business conference, another speaker was Capt. David Evans, a Qantas training pilot who was on the flight deck when Engine #2 of a giant Airbus A380, carrying 440 passengers and 29 crew, exploded in November 2010.

Captain Evans walked right out of central casting – handsome and broad-shouldered, with a soothing Australian voice. As he calmly explained it, a few minutes after takeoff, a hydraulic tube ruptured, spilling slippery fluid inside the engine, which caused a turbine to spin too fast and ultimately to explode, sending engine fragments at “infinite energy” in all directions. In an extraordinary bit of luck, only one 75-pound fragment plowed into the fuselage; it missed the passenger compartment by a few feet. But the explosion made a mess of the wing, blowing out not only the left inboard engine but several other mission-critical systems like generators and fuel pumps.

He and the other pilots (there happened to be five on the plane that day) methodically went through their paces, ticking through checklists and ascertaining the extent of the damage; they flew for two hours before landing. While they used the technology when they could, they were also skeptical of it in the face of a catastrophic insult. For example, the computer system recognized that one wing was much lighter than the other (duh, it was missing a big chunk of a 25-ton engine) and signaled the pilots to move fuel from the heavier side to the lighter to keep the plane balanced. They wisely decided to ignore that recommendation.

On the other hand, the pilots programmed all their data into the plane’s landing app, which told them that – because they’d be coming in steeply, heavy with fuel, and with partly damaged brakes – they’d need the longest possible runway (they chose the main runway in Singapore), and the computer predicted that they’d be able to stop about 100 yards from the end of the 2-mile-long runway. The prediction was accurate, nearly to the foot, and no one was hurt.
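The actual landing performance computation belongs to the aircraft’s software and uses far more inputs than I can model here, but the core of any such estimate is ordinary kinematics. A rough sketch, with every number an assumption of mine rather than QF32 data:

```python
# Rough kinematic sketch of a landing-distance estimate.
# Every number below is an assumption for illustration, not QF32's actual data.

touchdown_speed_mps = 85.0   # assumed touchdown speed (~165 knots), in m/s
deceleration_mps2 = 1.2      # assumed net deceleration with partly damaged brakes
runway_length_m = 3200.0     # "2-mile-long runway," per the account above

# Constant-deceleration stopping distance: d = v^2 / (2a)
stopping_distance_m = touchdown_speed_mps ** 2 / (2 * deceleration_mps2)
margin_m = runway_length_m - stopping_distance_m

print(f"Estimated rollout: {stopping_distance_m:.0f} m")  # ~3010 m
print(f"Runway remaining:  {margin_m:.0f} m")             # ~190 m, a slim margin
```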

Professor Reason’s model helps us understand why things go right, and sometimes why they go wrong. By causing us to focus more on bad systems than bad people, the model has been responsible for much of the progress we’ve made in patient safety.

In my talk at the Patient Safety Congress, I offered the audience my own thoughts on the successes, failures, surprises and epiphanies in the decade or so since the safety movement began. I was thrilled that Professor Reason, now retired, came to the talk. In addition to highlighting the central role of the Swiss cheese model, I made the point that many people took the model as support for an unblinking acceptance of “systems thinking” and “no blame” as an apt response to every error. The effort to rebalance systems thinking with accountability, often attributed to David Marx’s “Just Culture” model, is sometimes regarded as a counterpoint to Reason’s teachings.

Yet nothing could be further from the truth. I reminded the audience that in 1997 – three years before Marx first wrote about the “Just Culture” – Reason had done the same. In Managing the Risks of Organizational Accidents, he wrote:

A ‘no-blame’ culture is neither feasible nor desirable. A small proportion of human unsafe acts are egregious… and warrant sanctions, severe ones in some cases. A blanket amnesty on all unsafe acts would lack credibility in the eyes of the workforce. More importantly, it would be seen to oppose natural justice. What is needed is a just culture, an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information – but in which they are also clear about where the line must be drawn between acceptable and unacceptable behavior.

I had a chance to chat with James Reason after my lecture, and he was very pleased that I had highlighted this point, since he believes that he is often misinterpreted as sugarcoating the role of bad behavior. Professor Reason asked me to sign his copy of the new edition of my book, Understanding Patient Safety, which I did, proudly.

My own interest in patient safety came from seeing terrible errors (and committing a few of my own) and learning – from James Reason – that the way I’d been taught to think about them was all wrong. Having a chance to give a lecture in Reason’s name, with him in the audience, was one of the great thrills of my career. We stand on the shoulders of giants, observed Isaac Newton, and I’ve never felt that more acutely than I did last month in a remarkable week in England.


7 Comments

  1. Mark Neuenschwander June 12, 2012 at 3:39 pm

    And many of us are honored to see better for standing on your shoulders.

  2. Bev M.D. June 12, 2012 at 6:08 pm

    No one could have deserved to give that lecture more than you. I wanted to say, however, that before I retired, I found the Swiss cheese model absolutely invaluable in rapidly and easily communicating the problem not only to staff, but to doctors who were untrained in the principles of patient safety and error mitigation, and who didn’t want to sit through a long technical explanation. You could almost see the lightbulb go on in their heads. What a great tool!

  3. Lynn Phillips June 13, 2012 at 1:20 am

    Bob, great article! It is always interesting to hear how one of my heroes in healthcare responds to one of his heroes! Done, of course, in true Wachter style and grace.

  4. Menoalittle June 13, 2012 at 1:40 am

    Bob,

    After reading your essay, the NEJM editorials critical of HIT devices, and bushel baskets full of EHR-generated records of cases for attorneys about patients who met untimely deaths associated with EHR and CPOE systems, I have concluded that the EHR and CPOE devices being coaxed into use by the US Government are more air than cheese. The deaths of young, vibrant, basically healthy people whose care had workflow controlled by these devices tell the story.

    Vapourware, everywhere.

    Best regards,

    Menoalittle

  5. Geff McCarthy June 18, 2012 at 4:07 pm

    Bob it was an honor to meet you some years ago; I’m an expert in flight safety and a doc. Your lecture was singularly well deserved, and equally well conceived. Reason, and you, are saving innumerable lives daily. (I heard him speak in London a few years ago.)
    I might add 2 thoughts to your excellent analysis of the 2 aircraft accidents:
    Reason’s model has formed the basis of the Human Factors Analysis and Classification System (HFACS), a taxonomy for error. It is now standard in the US DOD, NATO, etc., but has little penetration into medicine. If medicine adopted it as a standard, our understanding of error in medicine would greatly increase. Most of the errors would lie in the organizational and supervisory categories, not the skill-based…
    Secondly, Sidney Dekker’s recent book Drift into Failure identifies clearly the unnoticed, gradual slide toward failure, as in the first case. I’d highly recommend it.
    On a personal note, I have schizoid feelings for the F-15 drivers…I would love to have had a chance to fly real air-air combat (I flew fighters in Nam but only ground support for the Army) but I would not have wanted their shoot-to-kill decision. Maybe that’s why I became a doc…
    As to the prosecution of Capt. Wang, I personally know the prosecuting official, whose integrity and character are exemplary. I think I might have prosecuted him also. Officers are trained to lead, and to accept the consequences, whether bullet or General Court Martial. Too often, we absolve physicians of their leadership responsibilities…

  6. Tracy Granzyk June 21, 2012 at 5:25 pm

    How wonderful to drink a toast to patient safety with James Reason himself. I have spent the last week in Telluride, CO, where David Mayer MD and Tim McDonald MD/JD are teaching medical students and residents about the value of Reason’s just culture in medicine. At the core of Drs. Mayer and McDonald’s teaching is the open, honest communication and transparency that engenders high-reliability organizations. Please visit our blog at http://www.transparenthealth.wordpress.com, and view the student reflections on the week, along with faculty comments. Paul Levy (http://runningahospital.blogspot.com/2012/06/residents-heres-new-way-to-measure.html) was also there for our resident physician week and blogged about the experience. And finally, Dave Mayer launched a blog called Educate the Young (www.educatetheyoung.wordpress.com). Please join our communities and share your expertise – raising the collective voice around the delivery of patient-centered care will expedite the changes in culture so overdue! Best of luck in your efforts–

  7. Margi Macdonald June 22, 2012 at 12:58 am

    Wow.
    The world just got a whole lot smaller for me.
    Captain David Evans and I are both residents of Brisbane, Australia. One of his sisters is a dear friend of mine.
    My father is a retired air traffic controller, check-controller and search-and-rescue expert. I grew up in a culture of accident, near-miss and human-error analysis.

    After my mother was negligently discharged from an ER with an undiagnosed sub-arachnoid haemorrhage, we subsequently heard a lot about the Swiss Cheese Model from the Director of Emergency Medicine at the hospital where my mother was initially admitted.

    I wonder how many other hapless patients have fallen through holes in the cheese in not one but two ERs, a general practice, an out-patients’ department, AND a rehabilitation unit?

    There remains a culture of ‘no-blame’ in the healthcare system here in Australia. On more than one occasion, in discussions with more than one doctor, I was told “our system is broken”. That statement has become a mantra, rather than an impetus to enact change.

    This is our story: http://margihealing.wordpress.com/2009/05/20/medical-negligence-and-the-meaning-of-life/


About the Author: Bob Wachter

Robert M. Wachter, MD is Professor and Interim Chairman of the Department of Medicine at the University of California, San Francisco, where he holds the Lynne and Marc Benioff Endowed Chair in Hospital Medicine. He is also Chief of the Division of Hospital Medicine. He has published 250 articles and 6 books in the fields of quality, safety, and health policy. He coined the term “hospitalist” in a 1996 New England Journal of Medicine article and is past-president of the Society of Hospital Medicine. He is generally considered the academic leader of the hospitalist movement, the fastest growing specialty in the history of modern medicine. He is also a national leader in the fields of patient safety and healthcare quality. He is editor of AHRQ WebM&M, a case-based patient safety journal on the Web, and AHRQ Patient Safety Network, the leading federal patient safety portal. Together, the sites receive nearly one million unique visits each year. He received one of the 2004 John M. Eisenberg Awards, the nation’s top honor in patient safety and quality. He has been selected as one of the 50 most influential physician-executives in the U.S. by Modern Healthcare magazine for the past eight years, the only academic physician to achieve this distinction; in 2015 he was #1 on the list. He is a former chair of the American Board of Internal Medicine, and has served on the healthcare advisory boards of several companies, including Google. His 2015 book, The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, was a New York Times science bestseller.
