Most of us think we are better drivers, more honest and more intelligent than other people. You have found that such ‘inflated beliefs’ aren’t just inaccurate, they can lead to problems. How so?
Virtually all of us have something in common: we think too highly of our skills and abilities. On a wide range of dimensions, we rate ourselves higher than our peers, colleagues, or competitors—to an extent that can seem absurd.
A 1997 U.S. News and World Report survey, for example, asked 1,000 Americans a simple question: “Who do you think is most likely to get into heaven?” Overall, the respondents believed that then-president Bill Clinton had a 52 per cent chance, basketball superstar Michael Jordan had a 65 per cent chance, and Mother Teresa had a 79 per cent chance. Yet someone else ranked even higher: the person completing the survey! Respondents rated themselves as having an 87 per cent chance of passing through the pearly gates—and thus as being more divine, all things considered, than the sainted Mother Teresa.
These unjustifiably high views of our personal competence and abilities can lead us to hold excessively positive expectations about our endeavours. In research studies, overconfidence has been linked to risky product introductions by managers, overly risky investments by CEOs, and more frequent trading by investors. If an entrepreneur believes she is savvier than the competition, she will make overly risky business decisions that are likely to end in failure. And if CEOs believe they are smarter than other executives at their level, they will plunge ahead with ill-advised mergers and acquisitions.
You have also found that we often make the mistake of ignoring the advice of others. Why is this?
The inflated beliefs we have in our own competence, knowledge and perspective can lead us to pay too little attention to the perspective and knowledge that others have to offer, and this often has costly consequences. In many cases, we would have made higher quality decisions by listening to the opinions of others and by considering their point of view.
One particular area I studied is advice-taking. Managers and leaders rarely make critical decisions in isolation. Instead, they commonly receive input from advisors from both within and outside of their organizations. Like managers, most people consult others for their opinion before making a final commitment when facing a decision. Organizations, for their part, spend substantial amounts of money hiring consultants to provide advice on their complex business problems. Appropriately using advice has been shown to lead to better judgments and decisions. Nevertheless, my work in this area suggests that people often give more weight to their own opinions than to those of others.
On the other hand, when it comes to selecting investments, people have been found to pay too much attention to their financial advisors. Why is this?
My research in collaboration with Don Moore shows that people are open to the advice of others only when their own information is poor and their own opinions are weak—and they in fact recognize that this is the case. The problem is that people in this state are insufficiently sensitive to the quality of the opinions they seek out: they are too willing to listen to the opinions of others, who often have equally poor information.
Research by Prof. Moore and others has shown that people believe themselves to be better than others on easy tasks and below average on difficult tasks. Based on this evidence, we argued and found that, since people believe themselves to be better than others on simple tasks, they should have little reason to listen to the opinions of others when engaged in simple tasks. On the other hand, on difficult tasks, since people believe that others are better than them, they should be more interested in what others have to say. Specifically, people weigh advice more when the task is difficult than when the task is easy. In fact, they weigh advice too much when the task is difficult and too little when the task is easy.
Our research suggests that when people are desperate for information, they will take whatever they can get, even if what they get is equally uninformative. Perhaps it is for this reason that so many ‘quacks’ and hucksters continue to exist, even in advanced economies with a well-educated citizenry. Sick people who have not found cures in the treatments offered by modern medicine often turn to faith healers and alternative therapies that provide some answers, even if there is little evidence in support of their truth or healing value; and businesses pay vast sums of money to consultants for their advice on complex business problems, even when there is little evidence that the consultants are any better at figuring out what will actually lead to success. More generally, our research sheds light on situations in which people pay too much attention to the opinions of others, such as their financial advisors.
Describe the role of ‘incidental anger’ in hampering our decision making.
Though helpful in many contexts, emotions can also cloud people’s judgment and lead to poor outcomes, even when the emotions are triggered by an event unrelated to the decision at hand. In a series of laboratory experiments, Maurice Schweitzer and I found that such ‘incidental’ anger reduces people’s willingness to listen to others’ opinions, and thus degrades their decisions.
This pattern of results does not generalize to all other negative emotions, however. Negative emotions such as anger, sadness and anxiety share the same valence (i.e., they are all negative), but they are characterized by different underlying dimensions, such as the level of personal control they entail or whether they are triggered by a situation or by other people. So, for instance, anger is an emotion commonly triggered by other people, while anxiety is characterized by high situational uncertainty and a low sense of personal control over the outcome.
Because of these different underlying dimensions, different negative emotions can have differential effects on people’s decisions. For instance, anger and anxiety have different effects on advice taking. My research in collaboration with Maurice Schweitzer and Alison Wood Brooks shows that when people experience anxiety, they listen to others’ opinions more than is warranted, even when the opinions are of poor quality. Though taking others’ opinions is generally beneficial, these benefits do not accrue if the advice is of poor quality, as this research demonstrates.

Why do we feel more satisfied with our lives on sunny days?
Without realizing it, we tend to evaluate our overall life satisfaction and well-being based on the emotions we are experiencing in the moment. Sunny days usually make us happy, while rainy days do not. As a result, if asked, we tend to report greater satisfaction with our lives on sunny rather than rainy days. This is because we misattribute our emotions: we let the emotions that we feel in the moment affect the way we feel about our life in general.

What are ‘focusing failures’, and why are they so prevalent in organizations?
Focusing failures are situations in which we focus too narrowly on the decision at hand and our own views about it. As a result, we fail to see the bigger picture, including other people’s roles. Effective decision makers are able to ‘zoom out’ beyond the specifics of the decision at hand; they are able to widen their focus when considering information to include in their decision-making processes so that they don’t miss important details.

How can leaders address such perspective-taking failures within their teams?
Leaders should regularly encourage team members to consider other team members’ points of view. There is always another side to every story, and failing to recognize that can prevent us from reaching good decisions. By considering the other side’s point of view, each team member can analyze the decision faced by the team from others’ perspective.

You have said that leaders who want to motivate team members to perform better should be wary of ‘social comparison processes’. Please explain.
As human beings, we share a common tendency to evaluate ourselves on all sorts of dimensions by looking at others. We can often answer the questions that most nag us about ourselves—ranging from ‘Am I a good leader?’ to ‘Do I make good decisions?’ to ‘Am I trustworthy?’—by comparing our attitudes and actions with those of our peers or colleagues. When we compare ourselves unfavourably to someone else, we are likely to experience distress, jealousy or envy. These emotions can lower our self-esteem and lead us to dysfunctional behaviours.
For instance, in a recent study, University of Michigan professor Stephen Garcia asked 55 employees at a Midwestern university to imagine that they were working for a company and had either high pay or high decision-making power. The employees were then asked to imagine they had to make recommendations about a new recruit – namely, whether to offer the new recruit high pay or high decision-making power. Prof. Garcia found that the participants advised offering the new recruit the opposite of whatever they had (high pay if they themselves had high decision-making power, and vice versa). These results suggest that people who have high standing on a particular dimension are eager to protect their view of themselves on the social hierarchy by making recommendations that prevent others from competing in the same social comparison context.
You believe that the best way to maintain high ethical standards is to keep them salient in our minds. Please explain.
People with honest intentions often behave dishonestly. My work in this area demonstrates the mental gymnastics that people use to maintain a moral self-image and shows how small changes in both others’ behaviour and in the environment can steer people away from their ‘moral compass’.
In one project, my co-authors and I examined whether increasing the salience of people’s self-identity, which they commonly define as ethical, reduces dishonesty. We heightened this self-identity by simply changing where people were asked to sign documents on which they were being asked to report information truthfully (e.g., an expense report, a tax form). Most compliance certifications, contracts, expense reports and letters ask people to sign at the bottom of the document. In both field and laboratory experiments, we compared the truthfulness of the information people provided when they signed at the top of the document, before providing the requested information, versus at the bottom, after providing it.
In a field experiment with an insurance company, we moved the signature line from the bottom to the top of car insurance reports for half of the customers (randomly chosen) who were filling out their policy forms. On the form, customers had to indicate the number of miles they drove the prior year; higher numbers translate into higher policy premiums. On average, the change we introduced resulted in an increase of about 2,400 miles in mileage claimed on the new policy forms, hinting at what is possible with just a simple ‘nudge’ to be ethical.
In follow-up laboratory experiments, we were able to unpack the psychological drivers of this reduced unethical behavior: we found that signing at the top of the form (before reporting information that can be inflated) increases the salience of ethical standards by highlighting people’s self-identity, thus resulting in greater ethicality.
Thus, more generally, keeping ethical standards salient in our mind can help us follow our moral compass more closely.

Given all your findings about our thinking and decision-making biases, what is your advice for readers who want to improve their decision making?
Most of us know little about the functioning of our internal organs, such as our hearts or kidneys, a fact we readily admit. When our bodies don’t function as we expect them to, we invest time and energy in learning more about how they work and trying to improve our health. By contrast, we approach our minds quite differently: we believe we understand exactly how they work. Even after our decisions lead to disappointing outcomes, we don’t investigate what went wrong and try to find out how we might improve our thinking.
My advice to anybody who is interested in improving their thinking and decision making is two-fold. First, treat your mind the same way you treat your internal organs, and thus recognize the systematic limitations in the way you think and make decisions in both your personal and professional lives. Second, find some more time for reflection. Taking stock of our decisions and better understanding what affected them—independent of whether they led to outcomes we are happy or unhappy with—is a crucial ingredient for improvement.

Francesca Gino is an Associate Professor of Business Administration at Harvard Business School. She is the author of Sidetracked: Why Our Decisions Get Derailed, and How We Can Stick to the Plan (Harvard Business Review Press, 2013).

IT’S DECISION TIME: AVOID GETTING SIDETRACKED
1. Raise your awareness. Become more aware of the subtle influences on your decisions.
2. Take your emotional temperature. Carefully consider your emotional state.
3. Zoom out. Get a sense of the bigger picture.
4. Take the other party’s point of view.
5. Question your bonds. Examine your links and similarities to those around you and consider whether they are influencing your decision for the worse.
6. Check your reference points. Carefully consider the motives behind your decision.
7. Consider the source. Carefully consider the information surrounding important decisions.
8. Investigate and question the frame. Simple changes in framing can have significant effects on our motivation to act.
9. Make your standards shine. Remind yourself of the importance of your moral compass.
[This article has been reprinted, with permission, from Rotman Management, the magazine of the University of Toronto's Rotman School of Management]