In your research, you examine the warning signs that people ignore, leading to bad outcomes. Why don’t people take anomalies in their everyday work seriously when, in retrospect, they look like clear ‘red flags’?
The answer to that has two parts. The first is that sometimes people just don’t understand the connection between small problems and the big issues they can become. If you think crises, by their very nature, just arrive at your doorstep without warning, you’re not going to be looking out for any clues. The other part of the problem is lack of time. This is especially true in high-paced, hazardous work, but the pace of work across all fields has sped up. People have so much information coming at them that it’s hard to know which weak signals to pay attention to, and which to ignore.
What kind of insights would we gain if we paid more attention to small disturbances?
Because of my background as a physician, I do a lot of work on medical error. While people often think of errors or mistakes as things that just happened, in fact, there is usually a journey of error. Things ‘become’ mistakes over time. Most people don’t look at the process of how errors can build upon themselves and become more serious.
Sometimes, it’s only by looking back that we can see what the early signs were, and sometimes they just weren’t possible to discern. But when you look at an event like BP’s Deepwater Horizon explosion, there were lots of clues that should have been apparent at the time.
Part of combating people’s natural tendency to ignore warning signs is to build mechanisms into organizations that encourage people to be more attentive to potential problems. Is this where the organizational capabilities you and Kathleen Sutcliffe write about come in?
Yes. Our argument is that unexpected events will happen, but you can still plan for possible emergencies. And it’s good to plan for those kinds of things because, even if the particular events you had in mind don’t occur, you will begin to develop general skills for dealing with problems. For example, getting better at coping with interruptions is important, and, as we’ve discussed, it’s useful to get in the habit of thinking about how small problems can become big problems. You have to assume that you will not be able to anticipate all eventualities, but building a set of skills so that you can manage these problems is pretty critical. We would argue that managing the unexpected is a fundamental capability that organizations need to build.
Attention allocation is one of the capabilities you argue can help us better manage the unexpected. How do you define this skill, and how would you go about developing it within an organization?
As human beings, our attention is a finite resource, and since so many things happen during the course of our working day, we need to figure out where to focus our limited attention. A fair amount of research has been done on what individuals pay attention to and what they ignore, and it turns out, you can have your attention focused on something and not notice things going on around you, even very striking things. A number of studies have looked at what people pay attention to at the very top levels of the organization, because these are the people tasked with making sense of the landscape in which the organization functions, and then taking action.
We have a colleague at Ivey doing a lot of work on how attention flows between the top levels of an organization and the front-line workers, and vice versa. As you can imagine, it’s a really delicate balance. People at the top may be getting information from people at the front line, but they can’t listen to every single piece of input, because that would be too much. Organizations have to find the right balance in terms of winnowing the amount of information that gets to the top, while making sure that the information gathered is diverse enough to encompass what’s going on, not just information that matches a certain strategic objective.
Are there examples of organizations that successfully manage attention flow?
High-hazard industries have developed some key practices that they teach their employees. For example, aviation has a practice called ‘crew resource management’, where one of the important tasks a person has, regardless of their role, is to speak up if they notice something they’re worried about. If they look around and see that the captain of the plane doesn’t appear to be worried, but they’ve noticed something, it’s their obligation to speak up. High-reliability organizations, like nuclear power plants, also have a practice called ‘deference to expertise’. In a crisis or a problem, decision-making power migrates to front-line people, because they are the experts and they’re located at the problem’s source.
Too much expertise can sometimes become problematic, because experts may jump to conclusions based on the assumptions that come from their years of experience. Obviously it depends on the person, but there is not a straightforward linear relationship between more expertise and better sense making; it’s useful to have a mix of people. Sense making is certainly a skill that people can get better at.
You also write about mindful organizing. Can you explain this term?
There’s a well-established literature on this, rooted in observing high-hazard organizations. The idea is that people who are working together can be more or less mindful about what they and their colleagues are doing. You can pay closer attention to what’s happening around you; you can interact with each other in a way that is more or less heedful or respectful. Mindful organizing is a set of organizing practices that help people not only to do things like notice problems, but also to see what others are working on, so that they know how their work fits with other people’s work. If I see that you’re having problems with something, I can help you; I can realize that what I do as part of my job is actually going to affect what you’re doing. So, it’s being really aware of your work and other people’s work, and how it fits together.

One of the other organizational capabilities you’ve written about is updating, which is the ability to process new information that contradicts what you’ve been doing, and to change your course accordingly. You’ve found that we’re much better at doing this as groups than as individuals. Why is that?
This is actually a core part of what I’m studying right now. Based on preliminary studies, there are some good reasons why groups are better at this than individuals. If you imagine working on a puzzle, another set of eyes often helps. It’s good to have another perspective, someone to ask questions; and sometimes it just helps to think aloud. I study medical teams and how they’re able to update their diagnosis of a patient as the situation changes over time. If I see something and I’m not sure what’s going on, it’s helpful for me to be able to turn to someone else on the team and say, “I’m not quite sure what’s going on here. I think the patient’s having this particular problem. Can you take a look?” So, people start collaborating and they say, “Oh, I see that too.” Or, “I don’t notice that. Let’s see what this other person thinks.”
It also turns out that working with a partner helps people start tasks, helps them keep going, and makes them more likely to investigate concerns. If you work alongside someone else you can say, “Hey, what do you think about this?” and get their opinion. You might be reluctant to change an important course of action if you were on your own. So the nice thing about teams and groups is that you have other people, their knowledge, and their expertise to rely on. So much of the work we do today is work that a single person can’t accomplish.

Does all of this have any implications for telecommuting or working in virtual environments?
Definitely. This isn’t my area of specialty, but there are other scholars who have studied telecommuting. There are different kinds of communication channels. Face-to-face is by far the richest: I can see your expression and read your body language. If I say something that I mean as a joke but I can tell you don’t think I’m joking, I can emphasize that it’s a joke, whereas something like e-mail is terrible for that kind of thing.
With really high-risk work, people need to be physically co-located. When people work virtually, a lot of the same things can be accomplished, but they need to do two things to make it work. First, they need to have this general skill set for working well together. Second, they need to figure out how to fine-tune it to a context where they don’t have as much information as they would otherwise.
I think there will be some kinds of work that we will never do virtually, but as technology gets better, who knows? You could be there via video and, while it’s not exactly the same as in-person, it is much richer than e-mail. Technology makes possible things that once seemed completely ludicrous. For example, most radiology work is now being outsourced, because digital imaging lets you instantly send CT scans all around the world. A radiologist in another hospital, or even another country, can analyse a scan and send it back. Radiology has historically been a very well-compensated field because you needed the doctor to come to the hospital in the middle of the night to read the emergency CT scan. It was an expensive item that couldn’t just be transmitted, but now it’s electronic. So who knows what will happen in the future?
If working well with people is a whole other skill set beyond your own job skills, what’s the best way to develop these skills within an organization?
I do a lot of work that involves simulations, where people get together in a setting that looks very much like the kind of work they would do every day. Then they’re faced with some kind of event they have to handle, and they practice dealing with it. The simulation is videotaped, so they can review and discuss their actions in detail afterwards. They go through a debriefing process and learn what went well and what didn’t. Obviously that’s a pretty high-cost solution, but you can see why people do this in certain high-hazard industries. You can also imagine smaller-scale versions in regular organizations. The idea of rehearsal and practice is important. Simulation can be worthwhile when it’s done well, because when people learn things, they tend to fall back on them during times of stress or crisis. You want them learning the kinds of practices that are going to be helpful.

If there is a lag between understanding and action, how can organizations decrease the lag time so their response to an unexpected event is better informed?
Organizations in motion do better than organizations that have ground to a halt. If you’re in a crisis and you become completely paralyzed and no one can decide what to do, that quickly becomes problematic because you’re not learning anything. The problem with unexpected events is that you don’t know what’s going on, what’s involved, how big the problem is. Is it just this little thing or are we seeing the beginning of a very large issue?
Trying to figure everything out before you take a single action is a poor strategy for two reasons. One, it’s really hard to understand the scope of the problem, because you’re not doing anything to see how big it might be; you aren’t gathering new information. The second is that the situation itself is evolving and changing. While you’re still looking at what it was like at Time A, you’re actually at Time B, and you haven’t kept up with the unfolding situation. I’m obviously not counselling people to take reckless action and start charging wildly into poorly understood situations. That would be disastrous. But there’s a balance to strike, where you recognize that by taking action you can uncover new clues about what’s going on.
Karl Weick, whom I trained under at the University of Michigan, has a saying: “Any map will do.” If you’re stuck in an emergency, even if you don’t have quite the right map, if you just start taking little steps in the direction you think is right, and you pay close attention to what happens, you will make some progress. The important part is the feedback you receive along the way, which enables you to build up your map as you go. So, be cautious and take the wisest course of action you can, but get into motion and gather more information as you go. People uncover a lot more during unexpected events if they develop an explanation and then take an action that will help rule that hypothesis in or out. The bottom line is, we understand things by doing them.
Marlys Christianson is an assistant professor of Organizational Behaviour and Human Resource Management at the Rotman School of Management. Rotman faculty research is ranked in the top ten worldwide by the Financial Times.
[This article has been reprinted, with permission, from Rotman Management, the magazine of the University of Toronto's Rotman School of Management]