How to learn from the big mistake you almost made

A brush with disaster can lead to important innovations, but only if employees have the psychological safety to reflect on these close calls, says research by Amy C. Edmondson, Olivia Jung, and colleagues

Published: Jun 1, 2021 10:49:08 AM IST
Updated: Jun 1, 2021 11:10:12 AM IST

When leaders frame near misses as free learning opportunities and express the value of resilience to their teams, the likelihood that workers will report such incidents increases.
Image: Shutterstock


What if businesses could learn from their worst mistakes without actually making them? How might the same progress and innovation occur without firms incurring the costs associated with such errors?

The results of a recent study about close calls in health care suggest that when people feel secure about speaking up at work, incidents in which catastrophe is narrowly averted rise to the surface, spurring important growth and systems improvement.

“People don't pay enough attention, especially in the business world, to the potential goldmine of near misses,” says Harvard Business School Professor Amy C. Edmondson, who studies psychological safety and organizational learning.

Incidents that almost result in loss or harm often pass unnoticed, in part because workers worry about being associated with vulnerability or failure. But when leaders frame near misses as free learning opportunities and express the value of resilience to their teams, the likelihood that workers will report such incidents increases.

That was the main finding of “Resilience vs. Vulnerability: Psychological Safety and Reporting of Near Misses with Varying Proximity to Harm in Radiation Oncology,” a study by Edmondson, the Novartis Professor of Leadership and Management at Harvard Business School, and Olivia Jung, a doctoral student at HBS. Co-authors on the paper, which was published in The Joint Commission Journal on Quality and Patient Safety, included UCLA physicians Palak Kundu, John Hegde, Michael Steinberg, and Ann Raldow, and medical physicist Nzhde Agazaryan.

A spectrum of close calls
The research team wanted to understand the role of psychological safety, defined as “the shared belief that interpersonal risk-taking is safe,” in determining the likelihood that employees in a radiation oncology department would report near misses, and whether that likelihood changes based on the nature of the incident.

“What's interesting about a near miss is that it can be thought of as a failure, where people say, ‘Oh, we almost made a huge mistake,’” explains Jung. “That interpretation highlights a vulnerability in the care-delivery processes. But it can also be thought of as a success, where they say, ‘Whew, we caught the error and delivered great care,’ which highlights resilience of care delivery systems.”

To unravel this complexity, the research team surveyed 78 radiation oncology professionals at the University of California, Los Angeles. First, they asked the group about their perceived psychological safety in the department. Overall, they found that individuals felt accountable to each other and comfortable speaking up, but there was significant variance by position: higher-ranked employees, like physicians, generally felt safer speaking their minds than lower-ranked employees, like nurses and therapists. This, says Edmondson, has been a consistent finding in research on teams and psychological safety across a variety of industries.

“Higher-status people are more likely to feel confident that their voice is welcomed,” she says.

Next, the researchers devised a spectrum of hypothetical near misses based on real-life practice. For example, providers must check cancer patients undergoing radiation for pacemakers, which can malfunction during treatment. Employees were asked to rate the likelihood that they would report the following near-miss scenarios, which become progressively more threatening to the patient:

  •     Could have happened. The pacemaker status of a patient was not checked at initial consultation. By chance, the patient did not have a pacemaker and received radiation without any harm afterwards.
  •     Fortuitous catch. The pacemaker status was not checked. The patient had a pacemaker, but by chance, a team member noticed this, and the patient’s treatment was postponed until they received clearance.
  •     Almost happened. The pacemaker status was not checked. The patient had a pacemaker and received radiation, but, by chance, the patient did not experience complications.

When overlaid with the results of the first survey, the data showed that the closer the situation got to causing patient harm, the more important psychological safety became in determining whether the employees would report the near-miss event.

“With near misses that we characterize as ‘could have happened,’ where the chance event is far from patient harm, and therefore highlights resilience, we find that the role of psychological safety on people's willingness to report is almost negligible,” explains Jung. “But for near misses that we characterize as ‘nearly happened,’ which highlight vulnerability, we find there's a huge effect of psychological safety on people’s willingness to report.”
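
To make the shape of that interaction concrete, here is a minimal analytical sketch in Python using synthetic data. It is not the authors' analysis or dataset; the variable names, effect sizes, and the logistic-regression framing are assumptions made purely for illustration. In this toy setup, a positive interaction coefficient mirrors the paper's pattern: the closer a near miss comes to patient harm, the more psychological safety predicts willingness to report.

    # Hypothetical illustration only: synthetic data, not the study's data or model.
    # Probes an interaction between psychological safety and proximity to harm
    # in predicting willingness to report a near miss.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 300

    # Standardized psychological-safety score (assumed scale).
    psych_safety = rng.normal(0.0, 1.0, n)
    # 0 = "could have happened", 1 = "fortuitous catch", 2 = "almost happened".
    proximity = rng.integers(0, 3, n)

    # Assumed effect structure mirroring the finding: the weight on psychological
    # safety grows as the scenario gets closer to patient harm.
    logit = 1.0 + 0.1 * psych_safety - 0.4 * proximity + 0.6 * psych_safety * proximity
    reported = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    # Logistic regression with an interaction term; a positive "interaction"
    # coefficient means safety matters more for near misses closer to harm.
    X = sm.add_constant(np.column_stack([psych_safety, proximity,
                                         psych_safety * proximity]))
    result = sm.Logit(reported, X).fit(disp=False)
    print(result.summary(xname=["const", "psych_safety", "proximity", "interaction"]))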


In settings that lack health care's formal incident-reporting systems, it may be up to managers to communicate a clear and compelling purpose and to make sure that employees feel that their contributions are valued. When employees feel free to express their ideas and concerns, the whole group benefits, particularly when it comes to close calls, Edmondson says.

“In organizations like Toyota, where they recognize the richness of the almost-failure and recognize that those are free learning opportunities, people are more likely to speak up, and everyone learns,” says Edmondson. “But when people don't recognize near misses as this goldmine, then they're not going to take advantage of them, because people quite often won't even mention them.”

About the Author

Kristen Senz is the growth editor of Harvard Business School Working Knowledge.

[This article was provided with permission from Harvard Business School Working Knowledge.]
