dispatch
03-02-2020
Poorly wired for risk.
Explore how the human mind’s natural desire to limit our exposure to risk can actually get us into some pretty dicey situations.
contributors
Jeffrey Boutilier
Knowledge and discipline are the tools necessary to overcome the impulses inherent in our humanity that push us to take bad risks.
In fairness to Mr. Buffett, he was actually explaining to the graduating class at Columbia Business School that his approach to investing was structured to mitigate or eliminate risks that were undesirable in the context of his investment strategy. Reflecting on his 1973 investment in the Washington Post Company, for example, he noted that the organization held other companies that, collectively, were worth more than the parent – making his investment essentially risk-free. So much so that he added he’d have happily invested his entire net worth in the company at the time. There are many other examples of Mr. Buffett’s approach to de-risking his investments: avoiding companies with extensive debt portfolios (financial risk) and choosing companies with predictable earnings (business risk) spring to mind. These choices demonstrate his remarkable ability to manage risk intelligently and with discipline, but they are not examples of risk simply ceasing to exist because he knows what he’s doing. Instead, they show how knowledge and discipline are the tools necessary to overcome the impulses inherent in our humanity that push us to take bad risks.
Mr. Buffett invests in the same economy we all do. His wisdom has certainly contributed to his outsized returns as an investor, but it’s his discipline and commitment to intelligent risk taking that have defined his success. It would seem that he experiences the same emotions that other investors feel (in fact he often exploits their profoundly human risk responses to his own advantage) but he is remarkably aware of them and able to control their role in shaping his response.
1,500 miles west of Omaha, Nebraska, lies the soaring granite of the Yosemite Valley. The great naturalist John Muir described the valley as “the grandest of all the special temples of nature I was ever permitted to enter,” and it is here that Alex Honnold comes to test the limits of human athletic achievement. Honnold, long a legendary name in the climbing community, catapulted to mainstream fame with the release of the 2018 film Free Solo, which documented his unprecedented and unrepeated June 2017 free solo ascent of the valley’s granite monolith, El Capitan.
The term free solo refers to a climbing style regarded as the purest form of the sport: the climber is un-roped and, in the event of a fall, completely unprotected. One mistake at any point of Honnold’s 3,000-foot vertical ascent would have been fatal. Throughout the 3-hour, 56-minute climb (itself an incredible feat of athleticism), Honnold encountered some of the most difficult technical climbing in the sport, executed complex moves with flawless precision, and grappled with the fear of knowing one false move would result in certain death – or did he?
Honnold’s feat triggered a wave of debate in the climbing community that eventually spread to the general public: was his climb a reckless risk? Did performing the climb un-roped demonstrate anything that a roped climb wouldn’t? Was it fair to his loved ones? What kind of message was he sending to other climbers – particularly younger participants in the sport who idolize their granite heroes? It also attracted the attention of an interested neuroscientist who couldn’t quite shake the idea that the human brain shouldn’t be able to perform that well in the presence of such enormous risks. In theory, Honnold should have been experiencing a degree of fear that would have significantly impaired his probability of success.
Honnold, known for his unshakeable calm, has never held himself out as fearless. His desire to climb un-roped simply appears to outweigh the obvious risks. “If you fall, you die,” he has glibly remarked – so he just doesn’t fall. It would seem Honnold’s approach is to evaluate the objective, assess the risks, and mitigate them where possible (control for good weather and rock conditions, practice technical moves until they become second nature) in the same way Warren Buffett makes choices that de-risk his investments.
Strapped into the MRI machine, however, Honnold’s brain scans suggest the answer to how he overcomes the fear that would cripple his judgement and performance is more complex. When exposed to stimuli intended to generate an emotional or fear-linked response, Honnold’s amygdala – an emotional processing engine of the human brain – should light up like a Christmas tree. Instead, it’s dark. The fear centre of his brain does not register the biological response typical of other subjects. He’s not overcoming his fear through practiced rehearsal and rational thought – he’s not experiencing fear the same way the rest of us do in the first place.
The octogenarian investor and thirty-something rock climber have both distinguished themselves as players at the very top of exceptionally risky games. They share something else in common: they are rare examples of people who took on extraordinary risks and beat the odds.
The 2008 global financial crisis toppled many marquee investors – including acolytes of Buffett’s vaunted value investing ethos. In November 2019, another leading free soloist (and heir apparent to the Honnold throne), Brad Gobright, died after falling a thousand feet off the end of a rappel on El Sendero Luminoso – a classic route in El Potrero Chico, Mexico. He was 31.
Undeterred, these two masters of risk practice their craft in profoundly different ways. Honnold and his steely amygdala demonstrate an incredible capacity for risk, while Buffett draws from an equally impressive capability for it. And it is in these two unique case studies of the human response to risk that insights are revealed – insights that can help us better understand how our own human nature influences our ability to identify, evaluate, and act when confronted by risk in our own lives.
“It is a part of probability that many improbable things will happen” – Aristotle
The human brain is a remarkable instrument poorly wired for risk. Buffett and Honnold are unusual examples of individuals able to understand and overcome the limitations and cognitive biases that disrupt our ability to manage risks effectively. By understanding and being awake to these biases – the tricks our minds play on us – we, too, can divorce ourselves from the fear and emotion associated with risk and improve our capability to make decisions with favourable outcomes.
Aristotle famously taught that “it is a part of probability that many improbable things will happen”, yet the lesson at the core of his statement has remained elusive to generations of risk takers. A series of cognitive biases simply interfere with our ability to identify risks reliably and assess their probability and potential impact:
- Overconfidence bias is something we observe in others routinely: a person holding greater confidence in their own judgement than in the judgement of others. Consider the navigator with the ‘great sense of direction’ who refuses a map and gets lost along the way. This same overconfidence also leads the risk-taker to focus on familiar risks and fail to consider negative scenarios.
- Framing bias describes the cognitive effect of judging the merits of an option based on whether it is presented in a positive or negative light. Studies have demonstrated that an individual is likely to accept a risk presented positively and avoid the same risk presented negatively, even when the probability and impact of the outcome are identical. Consider a surgeon reviewing the potential risks of a procedure with a patient: a 5% chance of complications is processed by the mind as a greater risk than a 95% chance of success.
- Outcome bias describes our tendency to overweight past results when judging the probability of certain risk outcomes for a given decision. For example, if a coin is flipped five times and produces ‘tails’ each time, this bias pushes the mind to inflate the probability of ‘tails’ on the next flip – even though each flip remains an independent 50/50 proposition.
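The coin-flip intuition is easy to check empirically: in a long run of fair flips, the flip that follows a streak of five tails still comes up tails only about half the time. A minimal simulation sketch in Python (the variable names are illustrative, not from any particular study):

```python
import random

random.seed(42)

# Simulate one long sequence of fair coin flips.
flips = [random.choice("HT") for _ in range(1_000_000)]

# Whenever the previous five flips were all tails, record what the
# *next* flip actually produced.
next_flips_after_streak = []
for i in range(5, len(flips)):
    if flips[i - 5:i] == list("TTTTT"):
        next_flips_after_streak.append(flips[i])

# Despite the streak, the conditional frequency of tails stays near 0.5.
tails_rate = next_flips_after_streak.count("T") / len(next_flips_after_streak)
print(f"P(tails | five tails in a row) ≈ {tails_rate:.3f}")
```

Run it with any seed and the printed rate hovers around 0.5 – the coin has no memory, whatever our minds insist.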
Our human nature also drives us to downplay the probability and impact of negative outcomes instead of accepting their true likelihood. For a majority of people, the difference between a 1 in 100 probability and a 1 in 1000 probability is abstract. As the denominator increases, the differences become even less intuitive. The challenge of judging risk probabilities, and the biases that impact our ability to do so, is evident in everyday life: consider that more American citizens are afraid of terrorism than guns, even though guns are statistically 3,210 times more likely to kill them. Understanding how our minds process information concerning probability and impact can help us mitigate these logical fallacies:
- Availability bias is a unique challenge in our connected world. The availability of information, access to graphic imagery, and media coverage of catastrophic events feed our mind’s natural tendency to judge an event as more likely if it is top-of-mind. This bias, coupled with a strong base of research suggesting fear strengthens memory, can easily distort our perceptions of the probability of certain risks.
- Anchoring bias also distorts our judgement by overweighting the first piece of information available and using it as the basis for comparison with everything that follows. For example, if the first vehicle you see on the lot is priced at $50,000 and the second at $30,000, you’re likely to conclude that the second vehicle is cheap. This cognitive bias is frequently exploited in negotiations, where the first offer establishes the anchor for the back-and-forth that follows.
As social creatures, we tend to seek consensus or social support for the risks we undertake. This desire for validation can pollute our judgement by creating the impression that our choices are somehow less risky because they’re supported by the opinions of others. In fact, this sort of social consensus can magnify the distorting effects of cognitive bias:
- Authority bias shapes our reaction to information sources based on their social standing. A police officer in uniform, for example, is assumed to have more reliable information than a homeless person on the street.
- Social proof bias reflects the comfort found in following patterns that have produced positive outcomes for others, irrespective of the relevance of their circumstances to our own. This implicit trust in others is rooted in our fundamental belief that the majority knows best – often leading us to ignore or dismiss the value of our own analysis. Your mother taught you about this when she asked whether you’d jump off a cliff if all your friends did, that time she caught you smoking.
Learning to recognize the cognitive biases that shape our response to risk is essential to improving our capability to manage risks. Understanding how our evolutionary instincts can work against us to drive poor risk decisions is an important step in building the confidence necessary to make decisive, fact-based choices. Developing our capacity for risk is another matter.
Intelligent risk-takers develop an independent understanding of their own tolerance for risk and capacity to manage it. This requires not only a clear understanding of an individual risk in the context of the risk taker’s capacity for risk, but a kind of portfolio-wide view of the cumulative risk undertaken. This capacity to endure the burden of risk without experiencing a decrease in performance is not static: personal circumstances, health, and environmental factors continually reshape our bandwidth.
No matter your capacity for taking on risk, it’s important to understand your limits before committing to a course of action. Once committed, the way we process risk changes dramatically and our minds struggle to respond. Consider the climber who embarks on a route only to confront a difficult section that couldn’t be observed from the ground – ‘well, I’ve come this far’ or ‘I’m so close to the top’ can easily override the more sensible decisions that would have been made on the ground:
- The gambler’s fallacy speaks to our inherent desire to believe that the universe is keeping score, and that all bad luck is balanced by good luck. If you’ve ever sat at a blackjack table, you’ve likely succumbed to this persuasive failure of judgement. A group of economists who studied every game of Deal or No Deal ever played found – alarmingly – that contestants’ decision-making following a major loss changed completely and irrationally. It’s critical to have a plan, understand your risk tolerance, and know when to get out – before the game begins.
There’s a saying in climbing: there are old climbers, and there are bold climbers, but there are no old bold climbers. In many ways, the same is true of entrepreneurs and business leaders. Hardening ourselves against the biases that push us gently towards ruin is an important step in the journey towards risk mastery. Buffett and Honnold show us that there is no great reward without an equally great risk – and even the best climbers sometimes get lucky with a low gravity day.
Jeffrey Boutilier
President & Chief Strategist
Ascent was founded on Jeff’s desire to challenge people to dig deeper into their environment and themselves to reveal the insights that light the path to market success. Driven by a passion for adventure, he prefers to seek out inspiration and ideas in the places others are afraid to look.