There are very few sure things in life, so when we make decisions, we play the odds. Whether it’s what to study, which job to apply for or which house to buy, the outcomes of our decisions rely on many other factors. It’s simply not possible to know every single relevant variable when we make up our minds.
As in poker, the outcomes of life-changing decisions depend heavily on luck. But calling it all “luck” is a bit disingenuous – it’s more like a game of probabilities. What’s more, the decisions we make are linked to the ways our brains are neurologically wired. These ideas are explored in Annie Duke’s book Thinking in Bets.
So what can you control? Well, probably more than you think.
Super Bowl XLIX ended in controversy. With 26 seconds left in the game, everyone expected Seattle Seahawks coach Pete Carroll to tell his quarterback, Russell Wilson, to hand the ball off. Instead, he told Wilson to pass. The ball was intercepted, the Seahawks lost the Super Bowl, and, by the next day, public opinion about Carroll had turned nasty. The headline in the Seattle Times read: “Seahawks Lost Because of the Worst Call in Super Bowl History”!
But it wasn’t really Carroll’s decision that was being judged. Given the circumstances, it was actually a fairly reasonable call. What was being judged was the outcome: the play didn’t work.
Poker players call this tendency to confuse the quality of a decision with the quality of its outcome “resulting,” and it’s a dangerous one. A bad decision can lead to a good outcome, after all, and good decisions can lead to bad outcomes. No one who’s driven home drunk has woken up the next day and seen it as a good decision just because they didn’t get into an accident.
In fact, decisions are rarely 100 percent right or wrong. Life isn’t like that. Life is like poker, a game of incomplete information – since you never know what cards the other players are holding – and luck. Our decision-making is like poker players’ bets. We bet on future outcomes based on what we believe is most likely to occur.
So why not look at it this way? If our decisions are bets, we can start to let go of the idea that we’re 100 percent “right” or “wrong,” and start to say, “I’m not sure.” This opens us up to thinking in terms of probability, which is far more useful.
Volunteering at a charity poker tournament, the author once explained to the crowd that player A’s cards would win 76 percent of the time, giving the other player a 24 percent chance to win. When player B won, a spectator yelled out that she’d been wrong.
But, she explained, she’d said that player B’s hand would win 24 percent of the time. She wasn’t wrong. The actual outcome simply fell within that 24 percent.
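A quick simulation shows why a single underdog win doesn’t falsify a 76 percent forecast. This is an illustrative sketch, not tied to any specific poker hand: we repeatedly draw from a fixed 24 percent chance of an “underdog win” and watch the long-run rate converge on 24 percent, even though any single trial can go either way.

```python
import random

def simulate(trials=100_000, underdog_p=0.24, seed=42):
    """Estimate how often a fixed-probability underdog wins
    over many independent trials."""
    rng = random.Random(seed)
    underdog_wins = sum(1 for _ in range(trials) if rng.random() < underdog_p)
    return underdog_wins / trials

rate = simulate()
# Over many hands the rate settles near 0.24 -- yet in any single
# hand, the 24 percent side winning is entirely unremarkable.
print(f"Underdog win rate over 100,000 trials: {rate:.3f}")
```

The point of the sketch is that a probabilistic forecast is judged over many outcomes, not one: the spectator saw a single trial, while the forecast describes the long run.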
We all want to make good decisions. But saying, “I believe X to be the best option” first requires good-quality beliefs. Good-quality beliefs are ideas about X that are informed and well thought-out. But we can’t expect to form good-quality beliefs with lazy thinking. Instead, we have to be willing to do some work in the form of truth-seeking. That means we have to strive for truth and objectivity, even when something doesn’t align with the beliefs we hold.
Unfortunately, truth-seeking runs contrary to the ways we’re naturally wired. For our evolutionary ancestors, questioning new beliefs could be dangerous, so it was low priority. If you hear a lion rustling in the grass, for example, you’re less likely to stop and analyze the situation objectively, and more likely to just run!
With the development of language, we could communicate things our own senses had never experienced, which gave us the ability to form abstract beliefs. But this ability piggybacked on our old belief-forming machinery: we continued to question beliefs only after forming them, and even then only infrequently.
In 1993, Harvard psychology professor Daniel Gilbert and his colleagues conducted experiments showing that this tendency to believe is still with us. In the experiments, participants read statements color-coded as either true or false. Later, they were asked to remember which statements were true and which were false. But this time, they were distracted so as to increase their cognitive load and make them more prone to mistakes. In the end, the subjects’ tendency was to simply believe that statements had been true – even those that had “false” color-coding.
And as easily as beliefs are formed, they’re equally hard to change. When we believe something, we try to reinforce it with motivated reasoning. That is, we seek out evidence that confirms our belief, and ignore or work against anything contradictory. After all, everyone wants to think well of themselves, and being wrong feels bad. So information that contradicts our beliefs can feel like a threat.
The good news is, we can work around our tendencies with a simple phrase: “Wanna bet?” If we were betting on our beliefs, we’d work a lot harder to confirm their validity. If someone bets you $100 that a statement you made was false, it changes your thinking about the statement right away. It triggers you to look more closely at the belief in question, and motivates you to be objectively accurate. This isn’t just about money. Whenever there’s something riding on the accuracy of our beliefs, we’re less likely to make absolute statements and more likely to validate those beliefs.
Focusing on accuracy and acknowledging uncertainty is a lot more like truth-seeking, which gets us beyond our resistance to new information and gives us something better on which to bet.
The best way to learn is often by reviewing our mistakes. Likewise, if we want to improve our future outcomes, we’ll have to do some outcome fielding. Outcome fielding is looking at outcomes to see what we can learn from them.
Some outcomes we can attribute to luck and forget about – they were out of our control anyway. It’s the outcomes that seem to have resulted primarily from our decisions that we should learn from. After analyzing those decisions, we can refine and update any beliefs that led to our initial bet.
Here’s an example: A poker player who has just lost a hand needs to quickly decide whether it was luck or her own poker-playing skill that was responsible. If it was skill, then she needs to figure out where her decision-making went wrong so she doesn’t repeat the mistake.
Most outcomes result from a mix of skill, luck, and unknown information. That’s why we often make errors in our fielding. Knowing how much of each is involved is tricky. Plus we’re all subject to self-serving bias. We like to take credit for good outcomes and blame bad outcomes on something or someone else.
For example, social psychologist and Stanford law professor Robert MacCoun examined accounts of auto accidents. In multiple-vehicle accidents, he found that drivers blamed someone else 91 percent of the time. And 37 percent of the time they still refused responsibility when only a single vehicle was involved.
We can try to circumvent self-serving bias by looking at other people’s outcomes. But in that case, it just operates in reverse: we blame their successes on luck and their failures on bad decisions.
Chicago Cubs fan Steve Bartman found this out the hard way in 2003, when he accidentally deflected a foul ball that Cubs left fielder Moises Alou was trying to catch. The Cubs lost the game, and Bartman became the target of angry fans’ harassment – and even violence – for more than a decade.
But why was Bartman held responsible? He reached for the ball just as lots of other fans did; he simply had the bad luck of being the one to deflect it. The world read the other fans’ good outcome – not touching the ball – as the result of a good decision not to intervene, while Bartman’s bad outcome was judged to be entirely his fault.
Phil Ivey is one of the best poker players in the world. He’s admired by his peers and has been incredibly successful in every type of poker. One big reason for this? Phil Ivey has good habits.
Habits work in neurological loops that have three parts: cue, routine and reward. As Pulitzer Prize-winning reporter Charles Duhigg points out in his book The Power of Habit, the key to changing a habit is to work with this structure, leaving the cue and reward alone but changing the routine.
Let’s say you want to minimize your self-serving bias in poker, but your habit is to win a hand (cue), attribute it to your skill (routine) and feed your positive image of yourself (reward). You might try attributing each win to a combination of luck and skill in order to change the habit.
But how do you then get that boost to your self-image? Instead of feeling good about being a winning poker player, you can feel good about being a player who’s good at identifying your mistakes, accurately fielding your outcomes, learning and making decisions.
That’s where Phil Ivey excels. His poker habits are built around truth-seeking and accurate outcome fielding rather than self-serving bias. The author mentions a 2004 poker tournament in which Ivey mopped the floor with his competitors, then spent a celebratory dinner afterward picking apart his play and seeking opinions about what he might have done better.
Unfortunately, most of us don’t have habits as good as Phil Ivey’s, but that doesn’t mean we can’t work with what we’ve got. One way we can improve the way we field outcomes is to think about them in terms of – you guessed it – bets.
Let’s say we got into a car accident on an icy stretch of road. It might be that we were unlucky, that’s all. But would that explanation satisfy you if you had to bet on it? Chances are, you’d start to consider other explanations, just to be sure. Maybe you were driving too fast, or maybe you should have pumped your brakes differently. Once the stakes are raised, we start to look into the causes a little more seriously, to help us move beyond self-serving bias and become more objective.
As a fringe benefit, this exploration also gives us a little more perspective. We start to see explicitly that outcomes are a mixture of luck and skill. Despite our hard-wired tendencies, this forces us to be a little more compassionate when evaluating other people’s – and our own – outcomes.
Check out my related post: Can you get rid of the bias in your head?