How black box thinking can prevent avoidable medical errors

This article was first published in the December 2015 issue of WIRED magazine.

Four hundred thousand people die in America every year because of preventable medical error. Note the key word in that sentence: "preventable". These mistakes don't need to happen and shouldn't happen. But they do, over and over again. Avoidable medical error is the third-biggest killer in the US, after cancer and heart disease. It kills far more people than traffic accidents.

The numbers are equally shocking in the UK, France and beyond. Why is this suffering happening on such a large scale? The answer is simple to state but complex to address: it's the inability to learn from mistakes. Clinicians spin their errors. They cover up. They use euphemisms to pull the wool over the eyes of grieving families.

This is not just about avoiding litigation. Evidence suggests insurance premiums go down when doctors are open and honest with patients and their families. No, this is a deeper pathology. It is the difficulty that talented, professional people have with admitting their fallibility: the threat to ego, to reputation, to vanity. It obliterates progress - not only in healthcare.

We can draw a contrast with aviation, where the culture is very different. In the airline industry, mistakes are seized on as precious learning opportunities. Every aircraft is equipped with two almost indestructible black boxes, one that records conversations in the cockpit and one that records the electronic instructions sent to the on-board computers.

When there are accidents, the boxes are recovered, the data is excavated and the lessons are learned. Professionals have every reason to co-operate because their evidence cannot be used against them in court. The result is that procedures are reformed in the light of this vital data. It is not just accidents but also near-miss events that drive this powerful, adaptive process.

By using this method, aviation has attained a remarkable and still-improving safety record. In the early part of the last century, flying was one of the most dangerous forms of transport. In 1912, eight out of 14 US Army pilots died in crashes: more than half. In 2014, by contrast, there was one accident for every 8.3 million take-offs among the major carriers.

This is not about aviation, however: it is about a method. When the Virginia Mason Medical Center in Seattle created an aviation-style system of incident reporting, and altered the culture so that professionals were empowered to speak up, errors plummeted. Insurance-liability premiums dropped by 74 per cent. That is the power of learning from mistakes.

What we are really talking about here is scientific method. Science has been successful precisely because it learns from errors. Scientific theories, by definition, make testable predictions. It is when theories fail that they are reformed or even revolutionised. Just as in aviation, where the safety of the system is paradoxically built upon the rubble of real-world accidents, so the scientific theories of today are built upon the failures of their predecessors.

When we engage with a complex world, failure is inevitable: failure in our assumptions, our theories, our methods and our strategies. The hallmark of great institutions, now and throughout history, has been a capacity to leverage these failures in the dynamic process of change. Institutions founded on authority, on defensiveness, on a lack of courage to engage with mistakes, have held the world back in many ways.

Chesley Sullenberger, the pilot who famously landed US Airways Flight 1549, an Airbus A320, on the Hudson River in 2009, has expressed the basic paradox of success. In a TV interview in 2010, he offered this beautiful gem of wisdom: "Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died... We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them."
