Portfolio/Penguin
An imprint of Penguin Random House LLC
375 Hudson Street, New York, New York 10014

Copyright © 2018 by Annie Duke

Penguin supports copyright. Copyright fuels creativity, encourages diverse voices, promotes free speech, and creates a vibrant culture. Thank you for buying an authorized edition of this book and for complying with copyright laws by not reproducing, scanning, or distributing any part of it in any form without permission. You are supporting writers and allowing Penguin to continue to publish books for every reader.

Library of Congress Cataloging-in-Publication Data
Names: Duke, Annie, 1965- author.
Title: Thinking in bets : making smarter decisions when you don't have all the facts / Annie Duke.
Description: New York : Portfolio, 2018. | Includes bibliographical references and index.
Identifiers: LCCN 2017042666 | ISBN 9780735216358 (hardback) | ISBN 9780735216365 (epub)
Subjects: LCSH: Management games. | Decision making. | BISAC: BUSINESS & ECONOMICS / Decision-Making & Problem Solving. | PSYCHOLOGY / Cognitive Psychology. | BUSINESS & ECONOMICS / Strategic Planning.
Classification: LCC HD30.6 .D85 2018 | DDC 658.4/0353—dc23
LC record available at

To Lila and Henry Gleitman, generous of heart and intellect

CONTENTS

Title Page
Copyright
Dedication
INTRODUCTION: Why This Isn't a Poker Book
CHAPTER 1: Life Is Poker, Not Chess
Pete Carroll and the Monday Morning Quarterbacks · The hazards of resulting · Quick or dead: our brains weren't built for rationality · Two-minute warning · Dr. Strangelove · Poker vs. chess · A lethal battle of wits · "I'm not sure": using uncertainty to our advantage · Redefining wrong
CHAPTER 2: Wanna Bet?
Thirty days in Des Moines · We've all been to Des Moines · All decisions are bets · Most bets are bets against ourselves · Our bets are only as good as our beliefs · Hearing is believing · "They saw a game" · The stubbornness of beliefs · Being smart makes it worse · Wanna bet? · Redefining confidence
CHAPTER 3: Bet to Learn: Fielding the Unfolding Future
Nick the Greek, and other lessons from the Crystal Lounge · Outcomes are feedback · Luck vs. skill: fielding outcomes · Working backward is hard: the SnackWell's Phenomenon · "If it weren't for luck, I'd win every one" · All-or-nothing thinking rears its head again · People watching · Other people's outcomes reflect on us · Reshaping habit · "Wanna bet?" redux · The hard way
CHAPTER 4: The Buddy System
"Maybe you're the problem, do you think?" · The red pill or the blue pill? · Not all groups are created equal · The group rewards focus on accuracy · "One Hundred White Castles . . . and a large chocolate shake": how accountability improves decision-making · The group ideally exposes us to a diversity of viewpoints · Federal judges: drift happens · Social psychologists: confirmatory drift and Heterodox Academy · Wanna bet (on science)?
CHAPTER 5: Dissent to Win
CUDOS to a magician · Mertonian communism: more is more · Universalism: don't shoot the message · Disinterestedness: we all have a conflict of interest, and it's contagious · Organized skepticism: real skeptics make arguments and friends · Communicating with the world beyond our group
CHAPTER 6: Adventures in Mental Time Travel
Let Marty McFly run into Marty McFly · Night Jerry · Moving regret in front of our decisions · A flat tire, the ticker, and a zoom lens · "Yeah, but what have you done for me lately?" · Tilt · Ulysses contracts: time traveling to precommit · Decision swear jar · Reconnaissance: mapping the future · Scenario planning in practice · Backcasting: working backward from a positive future · Premortems: working backward from a negative future · Dendrology and hindsight bias (or, Give the chainsaw a rest)
ACKNOWLEDGMENTS
NOTES
SELECTED BIBLIOGRAPHY AND RECOMMENDATIONS FOR FURTHER READING
INDEX

INTRODUCTION
Why This Isn't a Poker Book

When I was twenty-six, I thought I had my future mapped out. I had grown up on the grounds of a famous New
Hampshire prep school, where my father chaired the English department. I had graduated from Columbia University with degrees in English and psychology. I had attended graduate school at the University of Pennsylvania, where I won a fellowship from the National Science Foundation, earning a master’s and completing my doctoral course work in
cognitive psychology. But I got sick right before finishing my dissertation. I took a leave of absence, left Penn, got married, and moved to a small town in Montana. Not surprisingly, my NSF fellowship didn’t cover my cross-country experiment in adulting, so I needed money. My brother Howard, a professional poker player who had already made the
final table of the World Series of Poker by this time, suggested I check out the legal poker games in Billings. This suggestion wasn’t as random as it might sound. I grew up in a competitive, games-playing family, and Howard had brought me out to Las Vegas a few times for vacations I couldn’t otherwise afford on my stipend. I had watched him play,
and played in a few low-stakes games myself. I fell in love with poker right away. It wasn’t the bright lights of Vegas that lured me in, but the thrill of playing and testing my skills in the basement of a Billings bar named the Crystal Lounge.
I had a lot to learn, but I was excited to learn it. My plan was to earn some money during this break from school, stay on the academic path, and continue playing poker as a hobby. My temporary break turned into a twenty-year career as a professional poker player. When I retired from playing in 2012, I had won a World Series of Poker gold bracelet,
the WSOP Tournament of Champions, and the NBC National Heads-Up Championship, and earned more than $4 million in poker tournaments. Howard, meanwhile, went on to win two World Series bracelets, a pair of titles at the Hall of Fame Poker Classic, two World
Poker Tour championships, and over $6.4 million in tournament prize money. To say that I had strayed from the academic path might seem like an understatement. But I realized pretty quickly that I hadn’t really left academics so much as moved to a new kind of lab for studying how people learn and make decisions.
A hand of poker takes about two minutes. Over the course of that hand, I could be involved in up to twenty decisions. And each hand ends with a concrete result: I win money or I lose money. The result of each hand provides immediate feedback on how your decisions are faring. But it’s a tricky kind of feedback because winning and losing are only
loose signals of decision quality. You can win lucky hands and lose unlucky ones. Consequently, it’s hard to leverage all that feedback for learning. The prospect of some grizzled ranchers in Montana systematically taking my money at a poker table forced me to find practical ways to either solve this learning puzzle or go broke. I was lucky, early in my
career, to meet some exceptional poker players and learn from them how they handled not only luck and uncertainty but also the relationship between learning and decision-making. Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as
bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
In 2002, thanks to my friend and super-successful poker player Erik Seidel turning down a speaking engagement, a hedge-fund manager asked me to speak to a group of traders and share some poker tips that might apply to securities trading. Since then, I have spoken to professional groups across many industries, looking inward at the approach I
learned in poker, continually refining it, and helping others apply it to decisions in financial markets, strategic planning, human resources, law, and entrepreneurship. The good news is that we can find practical work-arounds and strategies to keep us out of the traps that lie between the decisions we’d like to be making and the execution of those
decisions. The promise of this book is that thinking in bets will improve decision-making throughout our lives. We can get better at separating outcome quality from decision quality, discover the power of saying, “I’m not sure,” learn strategies to map out the future, become less reactive decision-makers, build and sustain pods of fellow truthseekers to
improve our decision process, and recruit our past and future selves to make fewer emotional decisions. I didn't become an always-rational, emotion-free decision-maker from thinking in bets. I still made (and make) plenty of mistakes.
Mistakes, emotions, losing—those things are all inevitable because we are human. The approach of thinking in bets moved me toward objectivity, accuracy, and open-mindedness. That movement compounds over time to create significant changes in our lives. So this is not a book about poker strategy or gambling. It is, however, about things poker
taught me about learning and decision-making. The practical solutions I learned in those smoky poker rooms turned out to be pretty good strategies for anyone trying to be a better decision-maker. • • • Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck.
Learning to recognize the difference between the two is what thinking in bets is all about. Ready to learn the most important takeaways from Thinking in Bets in less than two minutes?
Keep reading! Why This Book Matters: Thinking in Bets helps readers understand the importance of their outcomes and how their results may not necessarily be directly linked to their choices. The Big Takeaways: We tend to mistake our choices for the way they end up.
The decisions we make are not necessarily wrong just because the outcome is not what we expected. Equating the two makes it difficult to see where we actually went wrong. Reality goes beyond just what we're told.
If we want to find the honest truth, we have to see the truth for what it is, without the influence of others' biased accounts. The results of our decisions can tell us a lot, but knowing which ones to learn from can be difficult. Some results are random. However, it is the results that come directly from our actions that need to be acknowledged. We
cannot look at the result of our choices with a subjective lens. Our bad habits can make us think that everything that happens is a direct result of our actions, but that is not necessarily true. Thinking about the future allows us to make better choices. We must keep in mind our future selves when making our decisions. If we can visualize the outcome
of our choice, we are more likely to make better ones. Thinking in Bets: Making Smarter Decisions When You Don't Have All The Facts — Book Notes Annie Duke A bet: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments.
Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible. Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to
recognize the difference between the two is what thinking in bets is all about. Chapter 1: Life is Poker, Not Chess Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting”. When I started playing poker, more experienced players warned me about the
dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.
Resulting is a routine thinking pattern that bedevils all of us. Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences. When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive
traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions. Our goal is to get our reflexive minds to execute on our deliberative minds’ best
intentions. “No, no,” von Neumann said. “Chess is not a game. Chess is a well-defined form of computation. You may not be able to work out the answers, but in theory there must be a solution, a right procedure in any position.
Now, real games,” he said, “are not like that at all. Real life is not like that. Real life consists of bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do. And that is what games are about in my theory.” Chess, for all its strategic complexity, isn’t a great model for decision-making in life, where most of
our decisions involve hidden information and a much greater influence of luck. This creates a challenge that doesn’t exist in chess: identifying the relative contributions of the decisions we make versus luck in how things turn out. Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty
over time. Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions
from the influence of luck is difficult.
The quality of our lives is the sum of decision quality plus luck. We make this same mistake when we look for lessons in life's results. Our lives are too short to collect enough data from our own experience to make it easy to dig down into decision quality from the small set of results we experience. We get only one try at any given decision — and that puts
great pressure on us to feel we have to be certain before acting, a certainty that necessarily will overlook the influences of hidden information and luck.
We are discouraged from saying “I don’t know” or “I’m not sure”. We regard those expressions as vague, unhelpful and even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision maker. We have to make peace with not knowing.
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.” An expert in any field will have an advantage over a rookie. But neither the
veteran nor the rookie can be sure what the next flip will look like. The veteran will just have a better guess. If we misrepresent the world at the extremes of right and wrong, with no shades of grey in between, our ability to make good choices — choices about how we are supposed to be allocating our resources, what kind of decisions we are supposed
to be making, and what kind of actions we are supposed to be taking — will suffer. When we think in advance about the chances of alternative outcomes and make a decision based on those chances, it doesn't automatically make us wrong when things don't work out. It just means that one event in a set of possible futures occurred.
Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold. Decisions are bets on the future, and they aren’t right or wrong based on whether they turn out well on any particular iteration. An unwanted result doesn’t make our decision wrong if we thought about the alternatives and probabilities in
advance and allocated our resources accordingly. When we think probabilistically, we are less likely to use adverse results alone as proof that we made a decision error, because we recognize the possibility that the decision might have been good but luck and/or incomplete information (and a sample size of one) intervened. Redefining wrong allows us
to let go of the anguish that comes from getting a bad result. But it also means we must redefine “right”.
If we aren’t wrong just because things didn’t work out, then we aren’t right just because things turned out well.
Chapter 2: Wanna Bet? Whenever we choose an alternative, we are automatically rejecting every other possible choice. All those rejected alternatives are paths to possible futures where things could be better or worse than the path we chose. There is potential opportunity cost in any choice we forgo. Our decisions are always bets. We routinely
decide among alternatives, put resources at risk, assess the likelihood of different outcomes, and consider what it is we value. In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing. At stake in a decision is whether the return to us will be greater than
what we are giving up by betting against the other alternative future versions of us. We bet based on what we believe about the world. Part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs to more accurately represent the world. The more accurate our
beliefs, the better the foundation of the bets we make. We form beliefs in a haphazard way, believing all sorts of things based just on what we hear out in the world but haven’t researched for ourselves. Daniel Gilbert: “Our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are
still likely to process it as true.” Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively
shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs. Once a belief is lodged, it becomes difficult to dislodge. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them. The Internet is a
playground for motivated reasoning. It provides the promise of access to a greater diversity of information sources and opinions than we’ve ever had available, yet we gravitate toward sources that confirm our beliefs, that agree with us. Blind-spot bias (an irrationality where people are better at recognizing biased reasoning in others but are blind to
bias in themselves) is greater the smarter you are. When someone challenges us to bet on a belief, signaling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us. Being asked if we are willing to bet money on it makes it much more likely that we will
examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. Offering a wager brings the risk out in the open, making explicit what is already implicit (and frequently overlooked). What if, in addition to expressing what we believe, we also
rated our level of confidence about the accuracy of our belief on a scale of zero to ten? Incorporating percentages or ranges of alternatives into the expression of our beliefs means that our personal narrative no longer hinges on whether we were wrong or right but on how well we incorporate new information to adjust the estimate of how accurate
our beliefs are. By saying “I’m 80%” and thereby communicating that we aren’t sure, we open the door for others to tell us what they know.
They realize they can contribute without having to confront us by saying or implying “you’re wrong.” Chapter 3: Bet to Learn: Fielding the Unfolding Future What are the obstacles in our way that make learning from experience so difficult? There is a big difference between getting experience and becoming an expert. That difference lies in the
ability to identify when the outcomes of our decisions have something to teach us and what that lesson might be. Any decision is a bet on what will likely create the most favorable future for us. How we figure out what — if anything — we should learn from an outcome becomes another bet. To reach our long-term goals, we have to improve at
sorting out when the unfolding future has something to teach us, and when to close the feedback loop. If making the same decision again would predictably result in the same outcome, or if changing the decision would predictably result in a different outcome, then the outcome following that decision was due to skill. If our decisions didn’t have much
impact on the way things turned out, then luck would be the main influence. For any outcome, we are faced with this initial sorting decision. That decision is a bet on whether the outcome belongs in the luck or skill bucket. The way we field outcomes is predictably patterned: we take credit for the good stuff and blame the bad stuff on luck. The result
is we don’t learn from our experience. Maybe the solution that has evolved is to compensate for the obstacles in learning from our own experience by watching other people do stuff. Instead of feeling bad when we have to admit a mistake, what if the bad feeling came from the thought that we might be missing a learning opportunity just to avoid
blame? Keep the reward of feeling like we are doing well compared to our peers, but change the features by which we compare ourselves: be a better credit-giver than your peers, more willing than others to admit mistakes, etc. In this way, we can feel that we are doing well by comparison because we are doing something unusual and hard that most
people don’t do. A good strategy for figuring out which way to bet would be to imagine if that outcome had happened to us. Chapter 4: The Buddy System Members of a decision pod can be formed by anyone where members can talk about their decision making. Forming or joining a group where the focus is on thinking in bets means modifying the
usual social contract. It means agreeing to be open-minded to those who disagree with us, giving credit where it’s due, and taking responsibility where it’s appropriate, even when it makes us uncomfortable. Complex and open-minded thought is most likely to be activated when decision makers learn prior to forming any opinions that they will be
accountable to an audience (a) whose views are unknown, (b) who is interested in accuracy, (c) who is reasonably well-informed, and (d) who has a legitimate reason for inquiring into the reasons behind participants' judgments/choices. They should also encourage and celebrate a diversity of perspectives to challenge biased thinking by individual members.
For engaging in the difficult work involved in sobriety, local AA groups give tokens or chips celebrating the length of individual members' sobriety. The tokens are a tangible reminder that others acknowledge you are accomplishing something difficult. There are chips for marking one to 65 years of sobriety. There are also chips given for every month
of sobriety in the first year. A diverse group can do some of the heavy lifting of debiasing for us. After 9/11, the CIA created “red teams” that are dedicated to arguing against the intelligence community’s conventional wisdom and spotting flaws in logic and analysis. A growing number of businesses are implementing betting markets to solve for the
difficulties in getting and encouraging contrary opinions. Chapter 5: Dissent to Win Ideal-type model of a self-correcting epistemic community:
1. Communism — data belong to the group
2. Universalism — apply uniform standards to claims and evidence, regardless of where they came from
3. Disinterestedness — vigilance against potential conflicts that can influence the group's evaluation
4. Organized Skepticism — discussion among the group to encourage engagement and dissent
When presenting a decision for discussion, we should
be mindful of details we might be omitting and be extra-safe by adding anything that could possibly be relevant. On the evaluation side, we must query each other to extract those details when necessary. Don't disparage or ignore an idea just because you don't like who or where it came from. The substance of the information has merit (or lack of merit) separate from where it came from. One way to disentangle the message from the messenger is to imagine the message coming from a source we value much more or much less. After the outcome, make it a habit when seeking advice to give the details without revealing the outcome. Another way a group can de-bias members is to reward them
for skill in debating opposing points of view and finding merit in opposing positions. Lead with assent: listen for the things you agree with, state those and be specific, and then follow with “and” instead of “but”. Ask for a temporary agreement to engage in truthseeking. If someone is off-loading emotion to us, we can ask them if they are just looking to
vent or looking for advice.
Chapter 6: Adventures in Mental Time Travel Just as we can recruit other people to be our decision buddies, we can recruit other versions of ourselves to act as our own decision buddies. Several organizations and companies with an interest in encouraging retirement planning have resources that allow clients to “meet” their future selves as they
make retirement decisions. In the simplest versions of these tools, clients plug in their age, income, savings practices, and retirement goals. The apps then show the client the financial situation and lifestyle their future self can expect, compared with the present. Business journalist and author Suzy Welch developed a popular tool known as 10–10–10 that has the effect of bringing future-us into more of our in-the-moment decisions. Every 10–10–10 process starts with a question: What are the consequences of each of my options in ten minutes? In ten months? In ten years? We can build on Welch's tool by asking the questions through the frame of the past: “How would I feel today if I had made this decision ten minutes ago? Ten months ago? Ten years ago?” Our problem is that we're ticker watchers of our own lives.
Happiness is not best measured by looking at the ticker, zooming in and magnifying moment-by-moment or day-by-day movements. We would be better off thinking about our happiness as a long-term stock holding. Ulysses contracts involve raising a barrier against irrationality: a kind of precommitment contract designed to lower or raise barriers that interfere with rational action. A “decision swear jar” is a simple kind of precommitment contract. We identify the language and thinking patterns that signal we are veering from our goal of truthseeking.
When we find ourselves using certain words or succumbing to the thinking patterns we are trying to avoid because we know they are signs of irrationality, a stop-and-think moment can be created. Belief -> Bet -> Set of outcomes Thinking about what futures are contained in that set (which we do by putting memories together in a novel way to
imagine how things might turn out) helps us figure out which decisions to make. Figure out the possibilities, then take a stab at the probabilities. The best poker players think beyond the current hand into subsequent hands: how do the actions of this hand affect how they and their opponents make decisions on future hands? Whether it involves sales
strategies, business strategies or courtroom strategies, the best strategists are considering a fuller range of possible scenarios, anticipating and considering the strategic responses to each, and so on deep into the decision tree. Prospective hindsight — imagining that an event has already occurred — increases the ability to correctly identify reasons for
future outcomes by 30%. Backcasting: we imagine we've already achieved a positive outcome, holding up a newspaper with the headline “We Achieved Our Goal!”; then we think about how we got there. Premortem: an investigation into something awful, but before it happens. Imagining a headline “We Failed To Reach Our Goal” challenges us to think
about ways in which things could go wrong that we otherwise wouldn’t if left to our own devices. Has Thinking in Bets by Annie Duke been sitting on your reading list? Pick up the key ideas in the book with this quick summary. There are very few sure things in life, so when we make decisions, we play the odds. Whether it’s what to study, which job to
apply for or which house to buy, the outcomes of our decisions rely on many other factors. It’s simply not possible to know every single relevant variable when we make up our minds.
As in poker, life-changing decisions are largely shaped by luck. But calling it all “luck” is a bit disingenuous – it's more like a game of probabilities. What's more, the decisions we make are linked to the ways our brains are neurologically wired. So what can you control? Well, probably more than you think. With lessons grounded in poker and cognitive
psychology, this book summary explains how it all starts with thinking in bets. In this summary of Thinking in Bets by Annie Duke, you'll find out why hard-luck stories are a waste of time; why car accidents are always someone else's fault; and how time travel might be our best tool for making decisions. Super Bowl XLIX ended in controversy. With 26
seconds left in the game, everyone expected Seattle Seahawks coach Pete Carroll to tell his quarterback, Russell Wilson, to hand the ball off. Instead, he told Wilson to pass. The ball was intercepted, the Seahawks lost the Super Bowl, and, by the next day, public opinion about Carroll had turned nasty. The headline in the Seattle Times read:
“Seahawks Lost Because of the Worst Call in Super Bowl History”! But it wasn't really Carroll's decision that was being judged. Given the circumstances, it was actually a fairly reasonable call. What was being judged was the fact that it didn't work. Poker players call this tendency to confuse the quality of a decision with the quality of its outcome “resulting,” and it's a
dangerous tendency. A bad decision can lead to a good outcome, after all, and good decisions can lead to bad outcomes. No one who’s driven home drunk has woken up the next day and seen it as a good decision just because they didn’t get into an accident. In fact, decisions are rarely 100 percent right or wrong. Life isn’t like that. Life is like poker, a
game of incomplete information – since you never know what cards the other players are holding – and luck. Our decision-making is like poker players’ bets. We bet on future outcomes based on what we believe is most likely to occur. So why not look at it this way? If our decisions are bets, we can start to let go of the idea that we’re 100 percent
“right” or “wrong,” and start to say, “I'm not sure.” This opens us up to thinking in terms of probability, which is far more useful. Volunteering at a charity poker tournament, the author once explained to the crowd that player A's cards would win 76 percent of the time, giving the other player a 24 percent chance to win. When player B won, a
spectator yelled out that she’d been wrong. But, she explained, she’d said that player B’s hand would win 24 percent of the time.
She wasn’t wrong. It was just that the actual outcome fell within that 24 percent margin. We all want to make good decisions. But saying, “I believe X to be the best option” first requires good-quality beliefs. Good-quality beliefs are ideas about X that are informed and well thought-out. But we can’t expect to form good-quality beliefs with lazy
thinking. Instead, we have to be willing to do some work in the form of truth-seeking. That means we have to strive for truth and objectivity, even when something doesn’t align with the beliefs we hold. Unfortunately, truth-seeking runs contrary to the ways we’re naturally wired. For our evolutionary ancestors, questioning new beliefs could be
dangerous, so it was low priority. If you hear a lion rustling in the grass, for example, you’re less likely to stop and analyze the situation objectively, and more likely to just run! When language developed, we could communicate things that our own senses had never experienced, leading to the ability to form abstract beliefs. This ability worked via our
old belief-forming methods, though, and questioning remained something we did only after forming a belief – and even then, infrequently. In 1993, Harvard psychology professor Daniel Gilbert and his colleagues conducted experiments showing that this tendency to believe is still with us. In the experiments, participants read statements color-coded as either true or
false. Later, they were asked to remember which statements were true and which were false. But this time, they were distracted so as to increase their cognitive load and make them more prone to mistakes. In the end, the subjects tended to simply believe that statements had been true – even those with “false” color-coding. And while beliefs are easily formed, they’re hard to change. When we believe something, we try to reinforce it with motivated reasoning. That is, we seek out evidence that confirms our belief, and ignore or work against anything contradictory.
After all, everyone wants to think well of themselves, and being wrong feels bad. So information that contradicts our beliefs can feel like a threat. The good news is, we can work around our tendencies with a simple phrase: “Wanna bet?” If we were betting on our beliefs, we’d work a lot harder to confirm their validity. If someone bets you $100 that a
statement you made was false, it changes your thinking about the statement right away. It triggers you to look more closely at the belief in question, and motivates you to be objectively accurate. This isn’t just about money. Whenever there’s something riding on the accuracy of our beliefs, we’re less likely to make absolute statements and more likely
to validate those beliefs. Focusing on accuracy and acknowledging uncertainty is a lot more like truth-seeking, which gets us beyond our resistance to new information and gives us something better on which to bet. The best way to learn is often by reviewing our mistakes. Likewise, if we want to improve our future outcomes, we’ll have to do some
outcome fielding. Outcome fielding is looking at outcomes to see what we can learn from them. Some outcomes we can attribute to luck and forget about – they were out of our control anyway. It’s the outcomes that seem to have resulted primarily from our decisions that we should learn from. After analyzing those decisions, we can refine and update
any beliefs that led to our initial bet. Here’s an example: A poker player who has just lost a hand needs to quickly decide whether it was luck or her own poker-playing skill that was responsible. If it was skill, then she needs to figure out where her decision-making went wrong so she doesn't repeat the mistake.
Most outcomes result from a mix of skill, luck, and unknown information. Knowing how much of each is involved is tricky – that’s why we often make errors in our fielding. Plus, we’re all subject to self-serving bias.
We like to take credit for good outcomes and blame bad outcomes on something or someone else. For example, social psychologist and Stanford law professor Robert MacCoun examined accounts of auto accidents. In multiple-vehicle accidents, he found that drivers blamed someone else 91 percent of the time. And 37 percent of the time they still
refused responsibility when only a single vehicle was involved. We can try to circumvent self-serving bias by looking at other people’s outcomes. But in that case, it just operates in reverse: we blame their successes on luck and their failures on bad decisions. Chicago Cubs fan Steve Bartman found this out the hard way in 2003 when he accidentally
deflected a fly ball from Cubs left fielder Moises Alou.
The Cubs lost the game and Bartman became the subject of angry fans’ harassment and even violence for more than a decade. But why was Bartman held responsible? He tried to catch the ball, just as lots of other fans did. But Bartman had the bad luck of deflecting it.
The world saw the other fans’ good outcome – not touching the ball – as the result of their good decision not to intervene, whereas Bartman’s bad outcome was all his fault. Phil Ivey is one of the best poker players in the world. He’s admired by his peers and has been incredibly successful in every type of poker. One big reason for this? Phil Ivey
has good habits. Habits work in neurological loops with three parts: cue, routine, and reward. As Pulitzer Prize-winning reporter Charles Duhigg points out in his book The Power of Habit, the key to changing a habit is to work with this structure, leaving the cue and reward alone but changing the routine. Let’s say you want to minimize your self-serving bias in poker, but your habit is to win a hand (cue), attribute it to your skill (routine), and feed your positive image of yourself (reward). You might try attributing each win to a combination of luck and skill in order to change the habit.
But how do you then get that boost to your self-image? Instead of feeling good about being a winning poker player, you can feel good about being a player who’s good at identifying your mistakes, accurately fielding your outcomes, learning and making decisions. That’s where Phil Ivey excels. His poker habits are built around truth-seeking and
accurate outcome fielding rather than self-serving bias. The author mentions a 2004 poker tournament in which Ivey mopped the floor with his competitors, then spent a celebratory dinner afterward picking apart his play and seeking opinions about what he might have done better. Unfortunately, most of us don’t have habits as good as Phil Ivey’s, but
that doesn’t mean we can’t work with what we’ve got. One way we can improve the way we field outcomes is to think about them in terms of – you guessed it – bets. Let’s say we got into a car accident on an icy stretch of road. It might be that we were unlucky, that’s all.
But would that explanation satisfy you if you had to bet on it?
Chances are, you’d start to consider other explanations, just to be sure. Maybe you were driving too fast, or maybe you should have pumped your brakes differently. Once the stakes are raised, we start to look into the causes a little more seriously, to help us move beyond self-serving bias and become more objective. As a fringe benefit, this
exploration also makes us look at things with a little more perspective. We start to see explicitly that outcomes are a mixture of luck and skill. Despite our hard-wired tendencies, this forces us to be a little more compassionate when evaluating other people’s – and our own – outcomes. We’ve all got blind spots, which makes truth-seeking hard. But
it’s a little easier when we enlist the help of a group. After all, others can often pick out our errors more easily than we can. But a group dedicated to examining decisions can’t be just any group. To be effective, it has to have a clear focus, a commitment to objectivity and open-mindedness, and a clear charter that all members understand. The author was
lucky early in her career to be brought into a group like this, made up of experienced poker players who helped each other analyze their play. Early on, poker legend Erik Seidel made the group’s charter clear when, during a break in a poker tournament, the author tried to complain to him about her bad luck in a hand. Seidel shut her down, making it
crystal clear that he had no interest. He wasn’t trying to be hurtful, he said, and he was always open to strategy questions. But bad-luck stories were just a pointless rehashing of something out of anyone’s control. If she wanted to seek the truth with Seidel and his group, she would have to commit to objectivity, not moaning about bad luck. She did,
and over time, this habituated her to working against her own biases, and not just in conversations with the group. Being accountable to committed truth-seekers who challenged each other’s biases made her think differently, even when they weren’t around. In a decision-examining group committed to objective accuracy, this kind of change is self-
reinforcing. Increasing objectivity leads to approval within the group, which then motivates us to strive for ever-greater accuracy by harnessing the deep-seated need for group approval that we all share. Seeking approval doesn’t mean agreeing on everything, of course.
Dissent and diversity are crucial in objective analysis, keeping a group from becoming a mere echo chamber.
Dissent helps us look more closely at our beliefs. That’s why the CIA has “red teams,” groups responsible for finding flaws in analysis and logic and arguing against the intelligence community’s conventional wisdom. And as NYU professor Jonathan Haidt points out, intellectual and ideological diversity in a group naturally produces high-quality thinking. Shared commitment and clear guidelines help define a good-quality decision-examining group. But once you’ve got that group, how do you work within it? You can start by giving each other CUDOS. CUDOS is the brainchild of influential sociologist Robert K. Merton, a set of guidelines that he thought should shape the scientific community.
And they also happen to be an ideal template for groups dedicated to truth-seeking. The C in CUDOS stands for communism. If a group is going to examine decisions together, it’s important that each member shares all relevant information and strives to be as transparent as possible to get the best analysis. It’s only natural to be tempted to leave out details that make us look bad, but incomplete information only feeds our bias. U stands for universalism – using the same standards for evaluating all information, no matter where it came from.
When she was starting out in poker, the author tended to discount unfamiliar strategies used by players that she’d labeled as “bad.” But she soon suspected that she was missing something and started forcing herself to identify something that every “bad” player did well. This helped her learn valuable new strategies that she might have missed and
understand her opponents much more deeply. D is for disinterestedness, which is about avoiding bias. As American physicist Richard Feynman noted, we view a situation differently if we already know the outcome. Even a hint of what happens in the end tends to bias our analysis. The author’s poker group taught her to be vigilant about this. When teaching poker seminars for beginners, she would ask students to examine decision-making by describing specific hands that she’d played, omitting the outcome as a matter of habit. It left students on the edge of their seats – and reminded them that outcomes were beside the point. “OS” is for organized skepticism, a trait that exemplifies thinking in bets.
In a good group, this means collegial, non-confrontational examination of what we really do and don’t know, which keeps everyone focused on improving their reasoning. Centuries ago, the Catholic Church put this into practice by hiring individuals to argue against sainthood during the canonization process – that’s where we get the phrase “devil’s
advocate.” If you know that your group is committed to CUDOS, you’ll be more accountable to these standards in the future. And the future, as we’ll see, can make us a lot smarter about our decisions. Comedian Jerry Seinfeld describes himself as a “Night Guy.” He likes to stay up late at night and doesn’t worry about getting by on too little sleep.
That’s Morning Jerry’s problem, not Night Jerry’s. No wonder Morning Jerry hates Night Jerry so much – Night Jerry always screws him over. It’s a funny description, but temporal discounting – making decisions that favor our immediate desires at the expense of our future self – is something we all do. Luckily, there are a few things we can do to take
better care of our future selves. Imagining future outcomes is one. Imagined futures aren’t random. They’re based on memories of the past. That means that when our brains imagine what the future will be like if we stay up too late, they’re also accessing memories of oversleeping and being tired all day long, which might help nudge us into bed. We
can also recruit our future feelings using journalist Suzy Welch’s “10-10-10.” A 10-10-10 brings the future into the present by making us ask ourselves, at a moment of decision, how we’ll feel about it in ten minutes, ten months and ten years. We imagine being accountable for our decision in the future and motivate ourselves to avoid any potential
regret we might feel. And bringing the future to mind can also help us start planning for it. The best way to do this is to start with the future we’d like to happen and work backward from there. It’s a matter of perspective: the present moment and immediate future are always more vivid to us, so starting our plans from the present tends to make us
overemphasize momentary concerns. We can get around this with backcasting: imagining a future in which everything has worked out and our goals have been achieved, and then asking, “How did we get there?” This leads us to imagine the decisions that led to success, and to recognize when our desired outcome requires some unlikely things to happen. If that’s the case, we can either adjust our goals or figure out how to make those things more likely. Conversely, we can perform premortems on our decisions. In a premortem, we imagine that we’ve failed and ask, “What went wrong?” This helps us identify possibilities that backcasting might have missed. Over more than 20 years of research, NYU psychology professor Gabriele Oettingen has consistently found that people who imagine the obstacles to their goals, rather than just imagining achieving them, are more likely to succeed. We’ll never be able to control uncertainty, after all. We might as well plan to work with it.
The key message in this book summary: You might not be a gambler, but that’s no reason not to think in bets. Whether or not there’s money involved, bets make us take a harder look at how much certainty there is in the things we believe, consider alternatives and stay open to changing our minds for the sake of accuracy. So let go of “right” and
“wrong” when it’s decision time, accept that things are always somewhat uncertain and make the best bet you can. Actionable advice: Try mental contrasting to make positive changes. If you want to reach a goal, positive visualization will only get you so far. In fact, research shows that mental contrasting – visualizing the obstacles that are keeping
you from your goal – will be far more effective. So if you want to lose a few pounds, don’t picture yourself looking good on the beach. Instead, think about all the desserts to which you’ll struggle to say “no” – that’s much more likely to motivate you to do the hard work. LifeClub © 2019