To err is human. But what happens afterward?
What people do after they make mistakes can mean the difference between life and death. Just ask frontline care providers at a hospital or anyone connected with the Deepwater Horizon explosion.
Organizations develop effective operating procedures to prevent errors, but they will never be perfect, says organizational behavior professor David Hofmann, a recognized authority on leadership and safety in high-risk environments.
“Errors have been ubiquitous since Adam and Eve bit the apple and will continue to be forever,” Hofmann said. “In some areas, errors won’t kill people but can kill a 401(k) or cause expensive technologies to fail.”
So organizations also need to design systems to manage errors after they occur and before they cause significant harm, says Hofmann.
He wanted to know how to build a system that encourages workers who have just made a mistake, or don’t know how to proceed with a task, to seek the advice of the most knowledgeable person.
Seeking advice, however, can come with a price, he says. “Seeking out advice is not a risk-free action, not only because you might worry about revealing a lack of knowledge or competence, but the expert might react in a condescending or ridiculing manner.”
To examine the factors that influence whom a worker turns to for help, Hofmann undertook a study with Zhike Lei (UNC Kenan-Flagler PhD ’05) of the European School of Management and Technology and Adam Grant at the University of Pennsylvania. They surveyed nurses in a 500-bed hospital, and asked them to evaluate other nurses on the unit in terms of expertise, accessibility and trust. The survey also presented a number of situations and asked whom they would approach for advice. By linking the survey data, the researchers could investigate factors that predicted whom nurses would seek out for informal consultations. They published their study results in the Journal of Applied Psychology article “Seeking Help in the Shadow of Doubt: The Sensemaking Processes Underlying How Nurses Decide Whom to Ask for Advice.”
One of their key findings was that a trusting relationship trumped accessibility when approaching an expert. In other words, strong interpersonal relationships built on trust gave individuals entrée into the expert’s network – even when that expert was very, very busy and seemed inaccessible to most. Seen this way, mentors make time to help valued mentees, even when they are very busy, because of the relationship.
But the flip side also occurs. Nurses who did not have well-developed relationships with others on the unit had difficulty seeking out advice from experts because of the risk involved. This has far-reaching implications as more and more organizations in healthcare and other industries move toward “strategic staffing” that uses contract, temporary and float workers.
Hofmann’s findings suggest that although organizations typically adopt these staffing models to increase flexibility and reduce costs, they also need to consider how the newly formed structure will function when things go wrong. Because, Hofmann says, “We know things will eventually go wrong.”
His findings suggest that one way organizations can reduce the potential risks associated with these types of staffing arrangements is by making the process of seeking out expertise safer.
One way to do this is to formally designate an expert as the “go-to” person within the unit. By design, the designated expert:
- is expected to be accessible and provide help as part of his/her job
- has a somewhat reduced workload to allow time for helping and mentoring less experienced employees
- is not the unit supervisor who conducts performance evaluations so that workers feel safe about asking for help
This research project builds on Hofmann’s other studies that investigate safety and human error issues in different industries. The cumulative findings help organizations develop systems and cultures that aid them in managing safety risks. Hofmann is continuing this work with the Department of the Interior by serving on the National Research Council/National Academies committee investigating the Deepwater Horizon accident to examine what happened and recommend how to prevent such accidents in the future.
“The perspective I hope to bring to the committee will be on the effects of organizational culture and management decision-making and the inner workings of team dynamics and decision-making in complex situations,” he said.
One thing is certain, he said. “There’s an opportunity for many industries to get better.”

David A. Hofmann is the Hugh L. McColl Scholar in Leadership and professor and area chair in organizational behavior at UNC Kenan-Flagler.

Key Take-Aways
- Although seeking out the person with the most expertise in an organization seems to be the logical thing to do when needing advice, this often does not occur.
- To ensure that experts are sought, create both a climate and processes that make it easy and safe to ask for advice and help.
- Efficiency, flexibility and costs aren’t the only criteria for strategic decisions about operations. Managers also need to factor in how the resulting system will function when things go wrong.
[This article has been reproduced with permission from the UNC Kenan-Flagler Business School: http://www.kenan-flagler.unc.edu/]