Heads in the Sand – Why We Fail to Foresee and Contain Catastrophe
Applied Business Strategy - Issue 2
It’s no secret how organizations should react in times of crisis. First, they need to be nimble, because problems often grow at exponential rates, making early action crucial and procrastination disastrous. Second, they need to act wisely, incorporating the full range of available knowledge about the problem at hand, embracing uncertainty rather than willfully ignoring it, and thinking in terms of a long time horizon. Yet decision-makers are often anything but nimble and wise. Instead, they tend to be slow, inflexible, uninformed, overconfident, and myopic. To understand why, one must turn to human psychology and the cognitive biases, emotional reactions, and suboptimal shortcuts that hold decision-makers back – as well as the tools to overcome them.
The underlying problem is that humans seldom have enough time, information, or processing power to carefully consider all viable options and make perfectly rational decisions that optimize their preferences. Instead, they evolved easier ways of making decisions, such as emotions, which serve as an early warning system, and simple decision-making rules that require little effort and work well across a broad range of situations.
Unfortunately, emotions are not always helpful for decision-making, as uncertainty, trade-offs, conflict, and high stakes all elicit negative emotions, which can impede wise responses. In addition, most people become even more risk-averse than usual under stress and retreat to the familiar comfort of the status quo. Similarly, people often employ suboptimal decision rules to avoid making explicit trade-offs, which involve inherent conflict and unavoidable concessions.
An additional psychological impediment to effective decision-making, which evolved during simpler times, is the human tendency to consider a problem solved after a single action (“single-action bias”). For example, there was an easy one-step solution to avoid being killed by lions at a watering hole: stay away from the watering hole. But today’s crises are often far more complicated and ambiguous, as they are caused by a range of factors, including human behavior itself. Furthermore, the solutions to these problems are often inconvenient, unpopular, and initially expensive. When that is the case, people tend to exploit any ambiguity about the cause of the problem to support alternative explanations, as with the pandemic and climate change.
Another psychological barrier to effective governance in times of crisis relates to how people learn and revise their beliefs. In practice, people give more weight to personal experience than to abstract statistics provided by scientific experts, even if the latter carry far greater evidentiary value. As a result, most people vastly underestimate the likelihood of low-probability events until they personally experience one. In addition, people who are committed to their beliefs, especially beliefs shared with ideological allies, pay attention only to information that confirms their preexisting notions.
In addition to identifying numerous decision-making biases, psychologists have also devised ways to counter their effects, such as choice architecture, whereby decisions are deliberately structured to nudge people toward good choices and away from bad ones. For example, employees are more likely to save when companies simply set the proper default by automatically enrolling them in retirement plans (while allowing them to opt out).
Psychologists have also found that eliciting positive emotions (e.g., pride) is far more effective in motivating a desired change in behavior than playing on negative emotions, such as fear or guilt. In addition, using a more intuitive time frame can be very effective when communicating risks, such as warning of the probability of a flood over the course of a 30-year mortgage rather than within 100 years. All these techniques are a form of psychological jiujitsu that turns vulnerabilities into strengths.
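To see why the reframing works, consider the arithmetic behind it. The short sketch below is an illustration, not from the article: it assumes a “100-year flood” means a 1 percent chance of flooding in any given year and that years are independent, so the cumulative chance over a 30-year mortgage comes to roughly 26 percent – a far more salient figure.

```python
# Illustrative sketch (not from the article): the arithmetic behind reframing
# flood risk over a 30-year mortgage. Assumes a "100-year flood" means a 1%
# chance of flooding in any given year and that years are independent.

annual_probability = 0.01   # 1-in-100-year flood
mortgage_years = 30

# Probability of at least one flood during the mortgage:
# the complement of no flood occurring in any of the 30 years.
p_at_least_one = 1 - (1 - annual_probability) ** mortgage_years

print(f"Chance of at least one flood in {mortgage_years} years: {p_at_least_one:.0%}")
# -> roughly 26%, which is far easier to grasp than "a 100-year flood"
```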
Effective leaders understand and use the richness of human behavior. They are not only evidence-based, analytic problem solvers, but also acknowledge fears, empathize with loss and pain, and reassure people in the face of uncertainty. They are not prisoners of psychology but masters of it.
Source
Elke Weber, “Heads in the Sand,” Foreign Affairs, November/December 2020