Learning from Our Mistakes – Industry to Industry
What Failure Can Teach Us
Recently, a friend of mine passed along a story about the Royal Air Force and their efforts to manage risk during WWII. Early in the war, British commanders began to examine planes that had returned to base after being shot and damaged during bombing missions. The thinking was that the patterns and locations of bullet holes on these planes would provide valuable information about how to better protect them on future runs. Following inspection, specific areas of the fuselage would then be reinforced with more metal and other protective covering. This process went on for several months until a statistician named Abraham Wald made the observation that the top brass might be going about it all wrong. Perhaps they shouldn't be examining the planes that made it back. Wouldn't it be better, he asked, to examine the planes that didn't make it back? In other words, the planes that made it back had obviously done something right – whether it was piloting skill or aircraft design. But what about the ones that didn't? What was it that caused these planes to fail, when other, identical planes had made it through safely? Was their failure due to bad luck, inferior piloting, or vulnerable system design?
Success by Emulation?
There is something in us that wants to focus on success as a pattern for more success. And while there are undoubtedly lessons to be learned from successful ventures, sometimes our successes overshadow the opportunities to learn from failures. Learning from mistakes is one of the key ingredients of success – at both the individual and the organizational level. Over the past several years, we've seen multiple industries attempt to bring aviation solutions into their realm with the expectation of commensurate success. Healthcare, in particular, has applied numerous initiatives patterned after successes in other industries, such as aviation and manufacturing. For example, healthcare has adopted Lean Six Sigma, voluntary reporting systems (Patient Safety Organization legislation), pre-surgical briefings and time-outs, the use of checklists (see Dr. Atul Gawande's The Checklist Manifesto), and TeamSTEPPS (the healthcare version of Crew Resource Management, or CRM), to name just a few.
But what has been the result of applying these aviation- and manufacturing-styled programs? The truthful answer is: not as much as organizations had hoped for. In fact, it's been well over a decade since the Institute of Medicine (IOM) published To Err Is Human. Since that clarion call to healthcare professionals in 1999, the healthcare industry has introduced a series of endeavors designed to improve the delivery of healthcare in America. So where are the results? In the words of the father of patient safety, Lucian Leape, at the April 2010 meeting of the American Organization of Nurse Executives (AONE): "I think it's safe to say the patient-safety movement also has been a great failure." He was right – but it's not because we haven't tried through emulation.
Things Often Go Wrong Before They Go Right…
Our message is not that these emulatory programs can't work – it's just that things often go wrong before they go right. For example, before CRM became the human factors-based national standard it is today throughout the airline and commercial aviation industry, it was widely considered a poor application of pop psychology.
In fact, in many early CRM sessions, senior captains simply walked out of the classroom believing that there was little value in learning to get along in the cockpit. The older models of “command and control” apparently had served aviation well during the wartime years, and these were still entrenched in the minds of many civilian pilots and organizations.
It took several setbacks and recalibrations to arrive at a style of implementation suited to the mindsets, culture, and complexities of the airline community. CRM programs became successful only when the industry came to examine several fatal airline and military aviation crashes, each one shedding new light on the human error contributions to the accidents. More specifically, these programs began to focus not on the human errors themselves, but rather on the behavioral choices that preceded the inadvertent actions that led to the crashes.
The subtle difference here lies in something called the fundamental attribution error, or bias. Generally, this bias is the tendency to overestimate the effect of disposition or personality and to underestimate the effect of the situation in explaining social behavior. The fundamental attribution bias is most visible when people explain the behavior of others. (In other words, when I engage in a risky behavior there's a good reason for it; when someone else engages in it, the behavior looks reckless or unwarranted.) What happened in aviation was that CRM courses began to effectively hold up the mirror so that pilots could see themselves engaged in behaviors that clearly contributed to accidents. The attitude of many pilots changed from "that won't happen to me" to "there but for the grace of God go I." This strategic shift in implementation became the key to CRM's eventual widespread success. Without this shift, CRM would have been just another good idea gone unfulfilled.
Only Collaboration Can Optimize Our Success
Can emulatory programs work? Yes, of course. The use of checklists, TeamSTEPPS, Collaborative Action Programs based on Aviation Safety Action Programs (ASAPs), Lean Six Sigma, Just Culture, pre-surgical briefings, and time-outs have all produced improvements, if not always dramatic results. But when we understand both the failures and successes in the industries where these programs originated, and clearly see the differences in the environments into which these programs are applied, our chances of success improve.
Stewardship of our resources requires more than just our best guess at success. Learning from our mistakes is a necessary ingredient of success at the individual, the organizational, and especially the societal level. But more importantly, acting on this knowledge – and achieving better outcomes – requires collaboration. Learning and action are both necessary to achieve optimal outcomes. Learning systems and predictive science can guide us to an understanding of probabilistic outcomes, but only collaboration can optimize our pathway to success. In our next Viewpoint article, we will examine the best that science has to offer in quantifying our probability of success or failure and explore how healthcare is learning to combine evidence-based practice with predictive risk assessment and intervention. We hope you'll join us in this conversation.