Hi all! I wanted to start a new discussion thread. Even though these threads are generally for paid subscribers only, I am opening this one up to everyone so I have a chance to get to know everyone in this community and we can all get to know each other.
Feel free to comment on anything that’s on your mind. For me, I have been thinking a lot recently about why it’s so important to make explicit what’s already implicit in the decisions you make. Would love to hear your thoughts on that or anything else that’s on your mind!
For both, I suggest creating good decision rubrics (including forecasts) and making sure you fill them out for big decisions. Also, create kill criteria. Both of these methods are in How to Decide and Quit. Also, very much part of the book Noise by Kahneman, Sibony, and Sunstein.
The reason this is helpful is that it will improve accuracy and also make you less likely to regret a decision, because 1) you can look back at the process that went into deciding, including your foresight of the probability of an undesirable outcome and what the alternatives were at the time, and 2) you are more likely to cut your losses sooner with kill criteria in place.
Why do you think so many people operate with blinkers on? It’s a decision to be short-sighted and think only first order. Do you think it’s the way we are shaped and conditioned?
Or do you think we are primed for survival and therefore most make decisions based on surviving, not thriving?
Oh, gosh...well I always preface any evolutionary explanation with the caveat that it is a just-so story. But...here goes...when we evolved, we stayed in a small territory and interacted with fewer than 300 people. Many of the heuristics we developed were energy savers and also mostly accurate. Availability bias, for example, maps to what is actually most frequent if you only experience a small territory and only a handful of folks. Second-order thinking is effortful. We are energy savers, and a lot of heuristics work well enough in many environments. The problem is that when they don't, they really don't. I don't think we have caught up with ourselves or our environments.
Thanks Annie. I think you hit the nail on the head “I don't think we have caught up with ourselves or our environments.”
I often compare how we evolve in work and out of work, and how mimicking others to do well in a particular work environment ultimately hinders most people when they leave it.
The reason is combinatorial explosion. That is, in any situation (and I do mean any), the number of things that could be considered before taking action is infinite. Therefore, our brains have to limit the scope of the things that we consider to be important enough to deal with. This process is called Relevance Realisation. The problem with this is that we can easily confuse things that are salient with things that are relevant. So for the example above, the hiring person's experience of engineers being disagreeable is salient but not necessarily relevant.
For a proper discussion of this I'd direct interested readers to John Vervaeke's magnum opus, Awakening from the Meaning Crisis.
Thanks Steve. I’ll take a look into your recommendation.
My recent observation is that first-order thinking is the default for 90% of people, and increasing. It’s very rare, even in the C-suite, for people to really think second or third order and over the mid to long term. I see this impacting business and personal relationships, as busyness is everyone’s badge of honour rather than going deeper into topics and themes to really understand the ripple effects.
I couldn't say what the percentage of people who rely on what Kahneman calls "fast thinking" is or whether it's increasing.
My perspective on the C-suite is that this kind of second-order thinking, as you term it, is something that its inhabitants definitely "ought" to be doing. The questions I have long pondered are: why don't they do that kind of thinking, and how did they manage to get selected for the role?
As someone who has been close to, but not in the C-suite in a number of different roles, my intuition is that the skills that are required to attain the C-suite are not the same as the skills that are required to be effective in the C-suite. I think that people also underestimate the role that luck plays in early career success - this applies to both the people who have these "successful" career trajectories and those who hire them.
From experience coaching C-suites: most C-suite work is about protecting your team, protecting your budget, and thinking 4-6 months ahead. It’s often survival versus “how do we thrive and collaborate?” These are very human instincts, but many are being reshaped to think shorter term and protect themselves at all costs. Environmental for sure, but blinkered and detrimental to wider success.
I loved reading your book “Thinking in Bets”. My immediate question: how do you uncover the unconsciously formed beliefs you might have? If these are influencing the way I process information (motivated reasoning), then what are the ways to find the beliefs I formed unknowingly in the past? (Hear it --> Believe it --> Rarely vet it.)
Ask others' opinions without telling them yours. You will find gaps between what you think and how you model a problem and what they think. Dig into those gaps to discipline bias.
I have not read “How to Decide” yet, but I was wondering: how do you decide when the chances of the outcome that you value (as there could be many types of success for a given decision) are 50/50, or you simply can’t know? I figured when things are not 50/50, it is easier to make a decision. But what is always difficult is a decision where you don’t know the probability of succeeding, or it is a 50/50 chance.
This goes under the category of “when a decision is hard that means it’s actually easy.” When we get to two options that are so close, we often spin out and waste time trying to parse out which is better. But for most things, given uncertainty, we can’t actually parse those differences out. Not only that, it isn’t worth the time. If the options are so close, you can’t be that wrong whichever you choose; better to spend that time on other things.
So, when you are in these fifty fifty situations, instead of going into analysis paralysis, just flip a coin to decide!
A more fleshed out answer is in How to Decide in the section on a sheep in wolf’s clothing.
Thanks for opening up this discussion, Annie! Just finished Thinking in Bets and about to start How to Decide.
I’ve got two questions on my mind:
1. Do you have any thoughts or recommendations on decision-making frameworks for people with ADHD? I’m in the (arduous, labyrinthian) process of getting a diagnosis as an adult, which is making me more acutely aware of particular biases that I may either overlook or willfully (and irrationally) accept.
2. Do you have any suggestions on, or can you recommend resources for, assembling a decision-making support group? I’m a hard introvert when it comes to online communities, so I’d love to find a very supportive guide for forming such a group online and/or a guide for starting an in-person group with strangers. (I don’t have enough friends in the area who are convinced of this premise, though I have already recommended your books.)
1) Checklists, Checklists, Checklists, and Rubrics! It is hard for someone who does not have ADHD to operate without these. It is even more beneficial for someone who does.
2) Yeah, so I'm a bit in the same boat. That being said, I think if you can get into some cohort classes, that will naturally put you with like-minded people interested in improving decision making. I found it so much easier to form groups back when I was in school. I think creating natural collisions with people makes it much easier for introverts.
As somebody who leads in schools, I find this question fascinating for a couple of reasons:
1) we find that almost every school operates with an implicit model for the practice of its teachers (good people off in classrooms making an enormous number of daily practice decisions at random) rather than an explicit practice framework. This robs any school of the chance for its teachers to practice consistently.
2) it really did get me thinking about the volume of bets (microdecisions) that teachers/principals make every day and how it must be among the highest for any profession.
This is going to sound a bit morbid. As I get older I find myself thinking more about the decisions I am making throughout the day than ever before. Prior to starting any project or devoting my energy and time toward anything, I think about the "end point" of moving forward and work backward. Is this worth my time? Am I getting any value out of what I'm about to start? My time is valuable and I don't want to waste any of it. Go/no-go decision on anything is hard given how little time we all have on this planet.
I’m curious if you’re focused on energy and time only as your own inputs, or if you also consider the “return” you might get in energy and time.
I have one older friend who has fully embraced a “return on energy” framework for making decisions, including what projects to work on and whom to spend time with. She reports that thinking about how energized she might feel by doing something provides a better metric compared to monetary return. (She’s even repositioned her consulting practice on this premise.)
So, your focus on time and energy seems quite valid for a personal decision-making framework.
Unrelated but maybe related….I have been reflecting lately on the times I’ve gone about my pursuits (education and music performance) in methodical ways versus in “intuitive” ways, for lack of a better term. Although music is an art, there are principles that apply. Same with teaching. I was struck today when I watched the video on Monty Hall and the selection of doors. I totally resisted the explanation of logic - until I didn’t! And it’s caused me to reflect on times when I’ve prepared for music performance (made decisions) using methodology and known, tested principles - even in art, proper decision making wins the day. At least at my level it does.
Love this! And, yes, so easy to resist the answer to the Monty Hall Problem because it doesn't "feel" right...until it does. Then once you get it you can't unsee it. I like the analogy to music/art.
To me the most terrifying aspect of my (human) nature is my ability to make terrible decisions despite obvious evidence to the contrary. Evidence at the time of the decision - not only outcome evidence.
I quit a job at the tail end of covid with an airline after 25 years in the hope of a better lifestyle with another one. The problem with being a pilot is that when you quit, you go from senior captain to bottom of the list first officer and it takes a long, long time to work your way back up. So, here I am, having quit, and desperately regretting it. There's no lifestyle and there's a 66% salary cut. Never mind the other ugly surprises of the new job.
I guess I didn't do a proper pre-mortem, despite reading about them in How to Decide.
I also disregarded the advice of someone about whom I have a negative opinion (for having acted against my and our family's interests before) and overweighted the advice of someone I respected (a professional counsellor).
I'm so sorry you are feeling these regrets. My book QUIT: the Power of Knowing When to Walk Away might help you with these types of decisions in the future.
I do think you have it right in this case. For these big, high-impact, irreversible decisions, also slow down, do a premortem, get outside opinions independently, look for base rates, set kill criteria, etc. It might help for you to make yourself a checklist for the future. I find it helpful!
Loved the Lit Video version of Thinking in Bets. Since then I've noticed an interesting dynamic: 1. Think of the win. 2. Guess the probability. 3. List the aspects required for a true win. 4. Estimate the probability of each aspect. 5. Calculate the overall probability. The result of step 5 is very different from the initial guess. Going through this process helps make a better bet. Does this resonate?
Absolutely! It makes you make explicit why you are making the forecast you are. When you actually take the time to break down what is going into the forecast or judgment, it will improve accuracy, especially if you write it down and know you will check back in on it later, after you see what the results are. Love your process.
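To make that breakdown concrete, here is a minimal sketch in Python of step 5 of the process above. The aspect names and probabilities are made up, and it assumes the aspects are roughly independent, so the overall probability is just their product.

```python
# A toy sketch of "calculate the probability" (step 5), with invented numbers.
# Assumes the aspects are roughly independent, so the overall probability of a
# true win is the product of the per-aspect probabilities.

aspects = {
    "product ships on time": 0.80,
    "key customer renews": 0.70,
    "hiring plan is met": 0.60,
}

overall = 1.0
for name, p in aspects.items():
    overall *= p

print(f"Calculated probability of a true win: {overall:.2f}")
# 0.8 * 0.7 * 0.6 = 0.34 -- usually far lower than the initial gut guess.
```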
I would like to know if there is a way to communicate uncertainty to people who are certain of their beliefs, and where their certainty is likely false or overstated/too high.
For example people who are not experts in a field, have little actual knowledge in that field, and whose beliefs contradict the beliefs of a lot of experts in that field.
The best way I have found is to ask if they want to bet on it. It usually gets them to back off of certainty to a place where they start examining possible gaps in their knowledge. "Wanna bet?" gets people to start interrogating their belief, asking themselves something like:
1) What does the person challenging me to the bet know that I don't know?
Of course! Offering to bet will not work perfectly but it does get people at least thinking about what they do and do not know, more so than if they didn't think of it as a bet.
Thank you! That probably was in Thinking in Bets; I may need to re-read it, I read it a couple of years ago and there are probably other oddments I have forgotten since as well as that one.
I’ve been thinking a lot about the relationship between your rational brain and your embodied brain in decision-making. You might sub “cold cognition” for rational and “hot cognition” for embodied. Or conscious and subconscious. It came to mind when reading your piece today. Our cold cognition allows us to solve problems that no other animal can. Yet it also can serve as a barrier to good decision-making, one that the pigeons in your article today weren’t burdened with. I read recently that people who have suffered brain injuries that destroy their embodied brain but leave their rational brain completely intact are very capable of perceiving options in a given situation, but completely incapable of making a decision of any kind. I find that fascinating.
So interesting. I think the not switching is a system 1 (hot cognition) problem that has to do with bias. That being said, system 2 for most of us can't properly calculate the probabilities, which pigeons seem to easily do.
This reminds me of some work from Randy Gallistel, also on pigeons. He had two people sitting on adjacent park benches throwing bread to pigeons. One of the people threw bread at four times the rate of the other person. The pigeons very quickly formed groups in the same proportion as the rate of bread throwing. Four times as many pigeons queued up in front of the faster bread thrower, and even though individual pigeons might move between the two groups, the ratio stayed the same.
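A quick arithmetic sketch of why a 4:1 split equalizes each pigeon's payoff; the rates and flock size below are invented, only the ratio matters.

```python
# Made-up rates: one bench yields 4 crumbs/min, the other 1 crumb/min.
# The per-pigeon rate is equal only when the flock splits in the same 4:1
# ratio -- the "ideal free distribution" behind the observation above.

def per_pigeon_rates(n_total, n_at_fast, fast_rate=4.0, slow_rate=1.0):
    n_at_slow = n_total - n_at_fast
    return fast_rate / n_at_fast, slow_rate / n_at_slow

for n_fast in (10, 16, 19):  # possible splits of a 20-pigeon flock
    fast, slow = per_pigeon_rates(20, n_fast)
    print(f"{n_fast} at the fast bench: {fast:.2f} vs {slow:.2f} crumbs/min each")
# Only the 16/4 split (4:1) gives both groups the same per-pigeon rate.
```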
In some ways, system 1 is more biased. But System 2 is slow and can still be error-filled. And humans are just pretty bad at probabilities. Worse than pigeons in many ways!
Hmm, I was initially thinking, like you, that the failure to switch was a hot cognition problem - because of the emotional attachment to the initial guess. But then I thought it might actually be a cold cognition problem because it is essentially a sunk cost bias. It certainly seems right that our hot cognition struggles badly with probabilities - I would extend that to weighing things in general (see sunk cost bias above for an example, or how bad we are at perceiving exponential change).
And by the way these pigeons are awesome! How cool is it that they congregate in ratio to the food. I wonder if there is any connection to the sensory or thought process that allows them to fly in unison or formation.
I believe the rules that allow them to fly in formation are local rules (what is the bird in front of me doing?). I wonder if it is a local rule that allows them to do these proportions as well.
I think more in terms of System 1 and System 2. Most bias is a System 1 issue, although bias also influences more deliberative thought. The pigeons are obviously in System 1. They are just being reinforced for choosing to switch over time. Humans don't seem to learn from the reinforcement in the same way.
Annie, do you think Herb Simon's notion of "bounded rationality" might explain some of the issues with switching systems? IOW, what is the: complexity of the problem space + time constraints + cognitive biases + limits of knowledge.
For sure. Satisficing allows us to go fast (which is necessary for survival) but sometimes makes us go too fast, stopping us from switching systems. 100% relevant. When we think of heuristics, they are really simplifiers, a way to deal with complexity and the limits of our knowledge, which is sometimes good (shortcuts are often good enough and speed us up) and sometimes bad, causing errors we can't tolerate.
I LOVED your books, Thinking in Bets and Quit. I'd be tremendously honored to have you come on my podcast, Meditations with Zohar; if so, email is zoharatkins at gmail dot com
Here's a metaphysical question: why is time structured in such a way that our decisions are irreversible, and what would it do to our lives if we could go back and select the other options, as in sci-fi movies like Back to the Future?
Does the meaning of life derive fundamentally from the combination of uncertainty about how our actions will turn out and the irreversibility of our decisions?
This is so philosophical! Some decisions are reversible...not in the sense that you can literally get the time back and go back and completely start over but in the sense that you can quit what you are doing and, sometimes, go back and select an option you previously rejected. Paying attention to how reversible a decision is is actually a key decision skill!
As for the meaning of life, I have no idea! I'm still trying to figure that out for myself :)
If I wanted to become a competent poker player, eg be more likely to win when playing casually with friends, what path would you suggest?
Read books, follow great players on twitch, find a good poker solver site, and fool around with low limit games to learn to apply the concepts!
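As one illustration of the kind of arithmetic those resources drill, here is a minimal pot-odds sketch with hypothetical numbers; the 35% equity figure is just an assumed estimate for a draw, not a value from any source.

```python
# Toy pot-odds check with hypothetical numbers: a call is profitable when your
# equity (chance of winning the hand) exceeds the price the pot is laying you.

def call_price(pot_before_bet, bet):
    # You call `bet` to win the existing pot plus the bet plus your own call.
    return bet / (pot_before_bet + bet + bet)

pot, bet, equity = 100, 50, 0.35   # e.g., a draw you estimate at ~35% equity
price = call_price(pot, bet)
print(f"Need > {price:.0%} equity to call; you estimate {equity:.0%} -> "
      f"{'call' if equity > price else 'fold'}")
# 50 / (100 + 50 + 50) = 25%, so the 35% estimate makes this an easy call.
```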
Hello Anne, I've enjoyed both of your books immensely. I've sent and given your book to friends and colleagues. I have no question but thank you for providing this discussion thread. I learned a lot from the other reader's questions and your answers. Great format. Thank you.
:)
Thank you for sharing your knowledge! Caught you on Brian Koppelman’s podcast recently and joined here. Looking forward to reading both the books in the near future.
Thank you! Brian's podcast was a blast. Love him!
Along a similar vein, I’m wondering what’s underneath the thoughts we have that keep us from moving forward and doing something that might help us. Say hire a weight loss coach or go to bed earlier or leave a job we hate.
That's an interesting question. I would love to hear what Katy Milkman (author of How to Change) has to say on this topic of weight loss coaches and going to bed early. As far as weight loss coaches, I am guessing part of it has to do with optimism bias (we are too optimistic we can do it ourselves) and illusion of control (we think we have more control over our outcomes than we do).
When it comes to going to bed early, that is likely a temporal discounting issue...we overly weight a reward in the present compared to a reward in the future. This is the classic Walter Mischel marshmallow study, where kids will take one marshmallow now instead of waiting fifteen minutes to get two marshmallows. With adults, they will stay up late rather than miss out on the fun now, forgoing the reward of a happier and more productive day in the future.
As for why people don't leave a job they hate, I just wrote a whole book called QUIT on that topic. You will have to check it out! :)
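To put a rough shape on the temporal-discounting point above, here is a toy sketch using a hyperbolic discount function; the payoff sizes and discount rates are invented, and only the pattern matters.

```python
# Toy hyperbolic discounting: perceived value V = A / (1 + k * delay).
# Payoffs and k values are made up; the point is that a steep discounter
# prefers the small reward now over the larger reward tomorrow.

def discounted_value(amount, delay_days, k):
    return amount / (1 + k * delay_days)

fun_tonight = 10          # staying up late, enjoyed now (delay = 0)
productive_tomorrow = 25  # a sharper, happier day, about 1 day away

for k in (0.1, 2.0):      # patient vs. steep discounter
    now = discounted_value(fun_tonight, 0, k)
    later = discounted_value(productive_tomorrow, 1, k)
    choice = "stays up late" if now > later else "goes to bed"
    print(f"k={k}: now={now:.1f} vs later={later:.1f} -> {choice}")
```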
This is great. I read a quote recently, not sure from whom, the essence of it was that if you say yes to something you say no to something else. Yes to staying up late, no to a productive morning. (Completely embarrassed if it was you lol)
I loved Thinking in Bets, great work! I just gave a copy to my college student daughter.
That is great! Thank you. I will read QUIT. I loved Thinking in Bets. I swear everybody, I did not know about QUIT! 😄
Hi Laurie,
I am a health and wellness coach and lifestyle coach, and here are a few of my favorite resources I wanted to share. I love BJ Fogg from Stanford University:
The Maui Habit
Getting up every day to say "Today is going to be a great day!"
Finding something you love to do, for example walking: if you love it and make the decision to do it, you don't always have to find the motivation.
His book Tiny Habits is great. https://www.youtube.com/watch?v=NFsEOr-jJYo
Hope you enjoy and love to share thoughts.
Why do you think people make decisions without considering what information they are missing?
I think for many reasons, two of which I will offer here.
1) I think a lot of the time people don't realize they are missing information. As an example, I don't think people know that you have to ask about the untreated group to interpret that kind of information. So if someone says they have figured out that the best software engineers in their company are disagreeable, they don't know to ask whether their unsuccessful software engineers are also disagreeable (maybe software engineers are just disagreeable!). Without knowing you need to ask for a control group, you might start looking for disagreeable people to hire, which would be a big mistake. (See the small sketch below.)
2) When the information someone already has confirms their priors, they will be biased to not search for new information (which might disconfirm their prior belief). So bias is a big part of why people don't search for information.
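Here is the small sketch referenced in point 1: a made-up contingency table for the engineer example, showing why the "untreated" group changes the conclusion.

```python
# Invented counts for the engineer example: without the unsuccessful group you
# might conclude "our best engineers are disagreeable"; with it you can see
# that disagreeableness is just the base rate and predicts nothing.

#                 disagreeable, agreeable
counts = {
    "successful":   (40, 10),
    "unsuccessful": (80, 20),
}

for group, (disagreeable, agreeable) in counts.items():
    share = disagreeable / (disagreeable + agreeable)
    print(f"{group}: {share:.0%} disagreeable")
# Both groups come out 80% disagreeable, so hiring for disagreeableness would
# be a mistake -- exactly the control-group point above.
```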
I was struggling with an upcoming decision: which week in March 2024 to choose for a Canadian ski vacation? I remembered your section in How to Decide, "When a decision is hard, that means it’s easy," and after re-reading it I made my decision in under 5 minutes!
The important decision was deciding to do the trip in the first place. This is in alignment with my values: live a great life, be healthy and fit, and have great adventures.
The decision of which week to choose seemed hard because both were great options, and I was focusing on the potential downsides (what if I choose a week when the weather/snow conditions are bad?). In reality, the odds of this are nearly identical for both weeks, so fretting won't improve my decision. Also, if one of the weeks was my "only option" I'd jump on it and be thrilled. So, I picked a week, booked it, and moved on.
Thanks Annie for such great and practical advice. I could have fretted over this decision for weeks! Now I can focus on getting excited and in shape for next year's trip!
Mark
LOVE this! Thank you so much for sharing!
Does anyone have a curriculum for teaching poker and probability to middle schoolers? If life is like poker rather than chess, let’s expose kids to this early. As a parent it MIGHT be fun to host a tween poker night for actual face-to-face games rather than screen games. Thoughts?
Here is a non-profit that teaches women and girls poker. I believe this might be a good start. https://pokerpower.com/
I'm currently in the early stages of developing a risk tool to change how lenders look at the climate risks of certain assets. The current practice is to price a set of risks at zero, although lenders freely admit that the risk is not zero. (As a side note, I find it strange how people whose job it is to think about risk seem to have no feeling for the subject.) From these sparse details, do you have any recommendations for me? One thought I've had is to portray the events behind the risks with a high degree of detail and show the cause and effect but I'd be curious if there are practices you think I should follow.
I also want to say thanks for Thinking In Bets. It's had a tremendously positive impact on me.
I think I would need more details. Hopefully you can join the upcoming AMA
Just finished reading your book and thought it was very informative. If you have another revision of the book, there's another story you might want to include about a major corporation that figured out when to abandon its major line of business. Intel was a semiconductor company when it was founded, and it had a major share of the memory market in the '70s and '80s, but Japanese manufacturers were taking market share and driving prices down precipitously. In the mid-'80s the leaders Andy Grove and Gordon Moore had a conversation. As NPR reports:
Grove says he and Moore were in his cubicle, "sitting around ... looking out the window, very sad." Then Grove asked Moore a question. "What would happen if somebody took us over, got rid of us — what would the new guy do?" he said. "Get out of the memory business," Moore answered. Grove agreed. And he suggested that they be the ones to get Intel out of the memory business.
https://www.npr.org/2012/04/06/150057676/intel-legends-moore-and-grove-making-it-last
Annie- Loved "Thinking In Bets". It is a framework I find myself applying on almost a daily basis now. It is a fantastic mental model for me. A question I wonder about - do you think you would be a better poker player today than you were, say, 10 years ago? In other words, beyond acquiring certain objective skills in any endeavor, can decision making continue to improve indefinitely (or at least until cognitive decline occurs)?
Hi Anne, I'm reading Quit now and I just want to say that your storytelling is top-notch. The topic of quitting as a positive trait was always novel and useful, and you've made it memorable too.
I've two questions, both around any correlation between learning habits and quitting decisions.
1) Would you say single-loop learning tends to contribute more toward the status quo than toward quitting? If we opened ourselves to reflection, is it possible that we would decide more often to quit than to stick?
2) It is hard to get past defensive reasoning and question our assumptions when analyzing outcomes. That's the barrier, as I understand it, between single- and double-loop learning. If so, is it plausible that a group is better placed to get past this barrier than an individual? In the sense that it is easier to make group analysis (project retrospectives) fun by having someone moderate, but it is much harder for someone to decide to reflect upon a negative outcome on their own. I'm trying to explore whether groups are at a learning advantage here.
Thank you!
And imagine, I got your name wrong, Annie! *facepalm* The Substack app isn't kind to mistakes :(
Annie (and forum): I may be missing something basic... {EDIT: I WAS - I have left the original comment as an example of wrong thinking} ...'but with regard to the Monty Hall problem, do you accept the statistical outcome is correct? If the event can be argued not to be independent, random, and a one-off, can statistical probability even apply? For me Monty Hall is set up so that the first decision becomes irrelevant in all ways. The real action starts, and where probability first applies, is when there are only two doors? Whilst the first decision will affect the thinking of the player, he is faced (and always was going to be) with at best a 50-50 choice?' ... {EDIT: NOW I THINK I HAVE GOT IT - The car is behind Door A, so he can never open that door. If you choose Door A then Monty will open EITHER door B OR door C (this is the key, as the process only provides for ONE opening of these doors). Here you switch but lose. If you guess Door B he has to open Door C (as the car is behind A). You switch to A and win. If you select C he opens Door B; again you switch to A and win again. Overall 2-1 - Neat! A good lesson learnt about fully understanding the process and using information correctly} ... 'I suppose the point is that whilst it is always better to have cold hard numbers when deciding, in the quest to provide these it is possible to go too far and pin mathematical certainties incorrectly or where they do not apply'
Try it! You will see that you win 2/3rds of the time! The video is very helpful in this regard :)
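For anyone who wants to try it without props, here is a minimal Monty Hall simulation sketch; run it and the switch/stick win rates come out near 2/3 and 1/3.

```python
# Minimal Monty Hall simulation: Monty always opens a goat door you didn't
# pick, and you either stick with your first pick or switch to the remaining
# closed door. Switching wins about 2/3 of the time.
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("switch:", play(True))   # ~0.667
print("stick: ", play(False))  # ~0.333
```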
Yes, thanks. I mapped it out, and having gone through the four 'logical' options on the second round, there was a beam of illumination about how, when you originally predict correctly, there is only one opening of the doors, not two. The process is mechanistic (like a clock) rather than probabilistic, so I was looking at it the wrong way. But 'error is where education starts'!
Hello Annie, I've been following your writing ever since I heard you on a Jill on Money podcast episode a few years ago. I had known of your poker career before that though and it was such a pleasant surprise to re-discover you through the personal finance and behavioural psychology space. I quickly read through Thinking in Bets and How to Decide after that podcast episode, and Quit was one of my favourite books last year (along with Dan Pink's The Power of Regret and Maria Konnikova's The Biggest Bluff).
I've been pondering a behavioural situation lately that seems to be a pattern in my professional and social circle. I'm of the age where those I know seem to be falling into 2 noticeable groups: those who have stayed at the same job for many years/decades (sometimes it's their first and only job) and those who have worked at a dozen or more companies. Of those in the first group, many are unhappy but are unwilling to leave. A common reason they give is that they're hoping for a severance package. I'm guessing that this gives them a plausible and rational sounding reason to not make a change but then I read Quit and perhaps there's a strong omission-commission bias too. On paper I think the math is quite compelling that the expected value of a long-odds severance payout is lower than the well documented advantages of increasing one's income through more frequent job changes. No one is interested in seeing the numbers though :D Is this a scenario that's been academically studied? I'm a bit fascinated by it because if I squint really hard, I can see many behavioural biases within.
Check out my book QUIT. I talk about our ability to rationalize sticking with the status quo. The stickers will generally be unhappier but will easily come up with reasons it is better to stay. Goes into the devil you know category quite a bit.
In an uncertain environment any decision will normally have to go beyond the facts (the known), and there has to be management of that element of uncertainty. Whether we like it or not, to some degree, ‘feelings’ about the uncertainty will have a part in the process. It would be stupid for a person who has considerable experience of an environment not to use those feelings – it could be called judgement – in that ‘it looks right and feels right’ will normally beat ‘looks right but feels wrong’ or 'just looks right'? { Edit - PS: In terms of Monty Hall, the 'look' is the choice which seems to have improved from 1 in 3 to 50-50 and, as it stands now, actually is. But is the problem really statistical? Patrick Veitch (a noted British punter on horse racing) suggests the problem is actually behavioural, because Monty Hall has shown you the goat and also knows what is behind the other doors. If you do not actually read between the lines then you would surely have some feeling about your original choice being wrong (even though you may stick with it for other reasons - it would fall into the 'looks good - feels bad' category). }
The thing about feelings, and I agree they are inputs into any decision, is that feelings are generally left unexamined. They are inputs so make those gut feelings explicit so that you can interrogate them. Otherwise you may never figure out it is a mistake not to switch doors.
Thanks Annie - I agree, and they have to be part of the decision process and handled as carefully as the data, logic and beliefs that go into making choices. Enjoying this open forum, so much to take in.
I just accepted a new temporary job assignment. Annie’s concept of the free roll (if I don’t like this job it only lasts 4 months and I can still come back to my current job) and her encouragement to assess the probability of outcomes (85-98% probability that I’ll continue to be frustrated with my current leadership in the next 6 months, and 55-80% probability that I’ll enjoy the new job and that it will lead to interesting new connections) helped me make this decision.
AWESOME! So happy to hear it!
Your post on omission and commission bias gave me a lot to think about, and I'm curious to hear your feedback on an idea that came up while applying your writing to risk taking. I think that we tend to think of risk dualistically: we think we either hit home runs or strike out, which glosses over all the neutral outcomes. I think that when we increase the number of risks we take, we expect the result to be fewer positive outcomes and more negative ones, because we equate "taking more risks" with irresponsible behaviors/decisions. But I think that what we're actually doing when we increase risk is squeezing some neutral outcomes out towards each end. The question becomes how often the changed neutral outcome results in a positive or negative outcome instead. I'm curious to hear your thoughts and, if my thinking seems useful, how would you go about testing such an idea? Thanks.
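One hypothetical way to test an idea like this in simulation (all spreads and thresholds below are invented): model the outcome as a draw whose spread grows with "risk" while its expected value stays fixed, and count how the positive, neutral, and negative shares shift.

```python
# Sketch: outcomes are normal draws with mean 0; "more risk" = a wider spread.
# Count how often outcomes clear an invented positive or negative threshold.
import random

def outcome_shares(risk_sd, n=100_000, good=1.0, bad=-1.0):
    pos = neg = 0
    for _ in range(n):
        x = random.gauss(0, risk_sd)
        pos += x >= good
        neg += x <= bad
    return pos / n, 1 - (pos + neg) / n, neg / n

for sd in (0.5, 1.0, 2.0):
    pos, neu, neg = outcome_shares(sd)
    print(f"risk sd={sd}: positive={pos:.2f} neutral={neu:.2f} negative={neg:.2f}")
# More risk squeezes neutral outcomes toward both tails; whether that trade is
# worth it then depends on how the tail payoffs compare, which you'd add next.
```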
Hi Annie Duke! I am so happy to be here and have learned so much from your research and how to apply practical tools for better decision making that I use in my work and everyday life.
I learned about you from Dr. Nirav Shah from Stanford University. I am the lead educator for his course on Healthcare Leadership. The students also love your work and have found it valuable as they lead and shift.
I look forward to more conversations with you and others around decision making for healthcare workers and their own wellbeing and life/work integration.
I am also an athlete and a lover of sports decision making - sports like football, mountain biking, ice climbing, etc. are some of my favs. I would love to start that conversation with you!
Hi, Annie!
I'm reading your book "Thinking In Bets" and have watched a few discussions in which you spoke. I've spent some thinking time about the topic of assigning probabilities to outcomes in the everyday world, as well as practicing that myself, some years back. For me, the practice seemed to be a subversion of recognizing indicators of a typical outcome. Assigning probability was like a guesstimate at a fuzzy match of indicators, pro and con, of the outcome. I could emphasize one indicator, and raise the probability, emphasize another, and lower the probability, think of an outcome, and feel confident it will occur, but then think of a contrary outcome, and find that just as credible. When deciding how to weigh uncertain outcomes, weighting schemes seemed untrustworthy, because they would fit any narrative that I wanted. Uncertainty was more like a choice among pretended-to beliefs. Truth was I was ignorant, just didn't have enough information, but was desperate to control my future in the short-term.
At the same time, I noticed that I could distinguish something I believed would occur from other possibilities according to whether I expected it to happen. It didn't matter that I could disagree with its occurrence or think of plausible alternatives. I expected it to happen. It seemed like it was easier to believe and be wrong than it was to "develop uncertainty" about the future, if that makes any sense.
I've been puzzling over my subjective experience, wondering about its contrast with reports from people who claim that they find confidence in the future reassuring but then hate to be wrong. I hate to be wrong too, but I go ahead and have confidence about my claims right up until they're proved wrong. My use of credences, these partial beliefs about outcomes, with probabilities assigned to them, was just a way to dodge that my expectations had consequences for me that I didn't want to acknowledge. If I kept blaming probabilities, I could avoid acting on what I thought was true or avoid feeling trapped because there was nothing to do or nothing I was brave enough to do. However, now it seems better to qualify predictions with as many meaningful, explicit caveats as I can find, and have a robust response ready, since different outcomes could occur, but still keep a belief about what will occur if I'm going to act on that belief anyway (for example, because I have to act on that belief anyway). I try to find early indicators that what I want to do has been prevented from happening in advance by something else so that I don't waste my effort, unless I'm going to do it anyway, in which case, well....
Reading through all that, if you got this far, I welcome your thoughts and comments. I enjoy your grasp of bias and the overall structure you impose on decisions, but I see your practice of assigning odds as more like measuring your own feelings of certainty, regardless of the facts. I find my certainty easy to falsify when I have it, so I avoid doing that; it's an uncomfortable walk down memory lane. We are very different people in different environments, so maybe my way of thinking lacks applicability to you and most others; I haven't thought about that much. Either way, your general grasp of decision-making is solid, and you pack many useful recommendations and distinctions into your work.
Thank you.
With implicit information, we are still being impacted by the implicit data; however, its impact on our decision is unknown. So, to make this hidden source more cognitively available, we can weigh it alongside our explicit data. That does mean more data points to deal with, though, and some potential dissonance, so it seems reasonable that decision makers might avoid dealing with the implicit, if for no other reason than to avoid stress.
The issue, though, is that the implicit is included in the decision whether you make it explicit or not. If I decide when to set my alarm, implicit in that decision is a forecast of how long I will take to get ready and how long it will take me to get to work. If it is already implicit, it is always better to make it explicit...unless, of course, the decision is very low impact. Then don't bother with too much process.
Thinking in Bets is about probabilistic thinking in three categories: the known, the unknown, and the unknowable. Is there a default for the % unknowable for a given decision, e.g., black swan stuff? How do you quantitatively balance or estimate the known and the unknown?
shout out Henry Gleitman
Henry! Loved him so much!
I don't think there is a default percentage for the unknowable, simply because it is unknowable. Focus on the quality of the known and on whether it is worth the cost (time and other resources) to discover the unknown when deciding how fast to go.
Big fan of the field having read books by Kahneman, Ariely and you of course. I'm always amazed at the lower turnout for local elections vs. national ones, yet the former is much more likely to have a direct impact on the voter's life. Would you chalk this up to availability bias since national elections are so widely covered in the news whereas local ones have much less coverage or is there some other bias at play?
I'd guess availability bias is at play. There is also less social proof and social comparison with local elections (fewer of your friends vote). There is great work by Todd Rodgers on the effect of comparison to neighbors on saving energy: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2589269 I believe this is related!
The Book of Why: The New Science of Cause and Effect by Judea Pearl is my current read. I can’t claim to understand it all, but he explains how complicated interactions can be explored by diagramming them and calculating probabilities. At its most extreme, the methods offer the equivalent of a randomized control trial with just the data you happen to have on hand. The math and the concepts are just on the edge of my understanding, but I hope they will add to your insights into rational decision making.
Part of the topic of the next book I am considering! Always partition your data to create your own control group. Highly recommend that book as well.
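As a rough illustration of "partition your data to create your own control group," here is a toy sketch in Python. The records, trait, and outcome are all invented placeholders; the point is only the comparison between the success partition and the rest.

```python
# Toy sketch: before concluding that a trait predicts success, compare its
# rate among successes *and* among non-successes (your homemade control group).
# The data below is invented purely for illustration.
records = [
    # (is_successful, has_trait)
    (True, True), (True, True), (True, False), (True, True),
    (False, True), (False, True), (False, False), (False, True),
]

def trait_rate(group):
    return sum(has_trait for _, has_trait in group) / len(group)

successes = [r for r in records if r[0]]
controls  = [r for r in records if not r[0]]   # the held-out comparison partition

print("trait rate among successes:", trait_rate(successes))
print("trait rate among the rest: ", trait_rate(controls))
# If the two rates are similar, the trait tells you little about success.
```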
Hi Annie, I have both your books. Thank you!
Do you have any advice for someone who ...
a) avoids decisions due to fear of later regret, or
b) who values themselves based on the accuracy of their decisions?
Any further reading as well that could be helpful?
For both, I suggest creating good decision rubrics (including forecasts) and making sure you fill them out for big decisions. Also, create kill criteria. Both of these methods are in How to Decide and Quit. They are also very much part of the book Noise by Kahneman, Sibony, and Sunstein.
The reason this is helpful is that it will improve accuracy but also make you less likely to regret a decision, because you can 1) look back at the process that went into deciding, including your foresight about the probability of an undesirable outcome and what the alternatives were at the time, and 2) cut your losses sooner, since you have kill criteria to implement.
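For readers who like to see this concretely, here is a minimal, hypothetical sketch of what writing down a forecast and kill criteria might look like. The field names and thresholds are invented for illustration and are not taken from the books.

```python
# Hypothetical decision-journal entry: record the forecast and kill criteria
# up front so the decision can later be judged by its process, not its outcome.
from dataclasses import dataclass

@dataclass
class DecisionRecord:
    decision: str
    options_considered: list[str]
    forecast_p_success: float          # probability estimate made at decision time
    kill_criteria: dict[str, float]    # signal name -> threshold that triggers quitting

    def should_quit(self, observed: dict[str, float]) -> bool:
        """Quit if any observed signal has reached its pre-committed threshold."""
        return any(observed.get(name, 0.0) >= limit
                   for name, limit in self.kill_criteria.items())

record = DecisionRecord(
    decision="Launch the side project",
    options_considered=["launch now", "wait a quarter", "drop it"],
    forecast_p_success=0.35,
    kill_criteria={"months_without_a_paying_user": 6, "launch_delays": 3},
)

print(record.should_quit({"months_without_a_paying_user": 7}))  # True -> cut losses
```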
Thanks!
Why do you think so many people operate with blinkers on? It’s a decision to be short-sighted and only think first order. Do you think it’s the way we are shaped and conditioned?
Or do you think we are primed for survival, and therefore most make decisions based on surviving rather than thriving?
Or something else?
Oh, gosh...well, I always preface any evolutionary explanation with the caveat that it is a just-so story. But...here goes: when we evolved, we stayed in a small territory and interacted with fewer than 300 people. Many of the heuristics we developed were energy savers and also mostly accurate. Availability bias, for example, maps to what is actually most frequent if you only experience a small territory and a handful of folks. Second-order thinking is effortful. We are energy savers, and a lot of heuristics work well enough in many environments. The problem is that when they don't, they really don't. I don't think we have caught up with ourselves or our environments.
Thanks Annie. I think you hit the nail on the head “I don't think we have caught up with ourselves or our environments.”
I often compare how we evolve in and out of work, and how mimicking others to do well in a work environment ultimately hinders most people when they leave it.
The reason is combinatorial explosion. That is, in any situation (and I do mean any), the number of things that could be considered before taking action is infinite. Therefore, our brains have to limit the scope of the things that we consider to be important enough to deal with. This process is called Relevance Realisation. The problem with this is that we can easily confuse things that are salient with things that are relevant. So for the example above, the hiring person's experience of engineers being disagreeable is salient but not necessarily relevant.
For a proper discussion of this I'd direct interested readers to John Vervaeke's magnum opus, Awakening from the Meaning Crisis.
Thanks Steve. I’ll take a look into your recommendation.
My recent observation is that first-order thinking is the default for 90% of people, and increasing. It’s very rare, even in the C-suite, that people are really thinking second or third order and mid to long term. I see this impacting business and personal relationships, as busyness is everyone’s badge of honour rather than going deeper into topics and themes to really understand the ripple effects.
I couldn't say what the percentage of people who rely on what Kahneman calls "fast thinking" is or whether it's increasing.
My perspective on the C-suite is that this kind of second-order thinking as you term it is something that the inhabitants definitely "ought" to be doing. The questions I have long pondered are, why don't they do that kind of thinking and how did they manage to get selected for the role?
As someone who has been close to, but not in the C-suite in a number of different roles, my intuition is that the skills that are required to attain the C-suite are not the same as the skills that are required to be effective in the C-suite. I think that people also underestimate the role that luck plays in early career success - this applies to both the people who have these "successful" career trajectories and those who hire them.
From experience coaching C-suites: most of the C-suite is about protecting your team and your budget and thinking about 4-6 months ahead. It’s often survival versus how do we thrive and collaborate. These are very human instincts, but many are being reshaped to think shorter term and protect themselves at all costs. Environmental for sure, but blinkered and detrimental to wider success.
Hello Annie,
I loved reading your book “Thinking in Bets.” My immediate question after finishing it was: how do you uncover the unconsciously formed beliefs you might have? If these are influencing the way I process information (motivated reasoning), then what are the ways to find the beliefs I formed unknowingly in the past? (Hear it --> Believe it --> Vet rarely.)
Thank you for your time.
Mandkhai
Ask others' opinions without telling them yours. You will find gaps between what you think and how you model a problem and what they think. Dig into those gaps to discipline bias.
Thank you for your reply! I guess asking myself what I think about a given subject could also help surface beliefs I did not know I had.
Yes!
Hi Annie,
I have not read “How to Decide” yet, but I was wondering: how do you decide when the chances of the outcome you value (as there could be many types of success for a given decision) are 50/50, or you simply can’t know? I figured that when things are not 50/50, it is easier to make a decision. What is always difficult is deciding when you don’t know the probability of succeeding, or when it is a 50/50 chance.
Thank you!
This goes under the category of “when a decision is hard, that means it’s actually easy.” When we get down to two options that are very close, we often spin out and waste time trying to parse out which is better. But for most things, given uncertainty, we can’t actually parse those differences out. Not only that, it isn’t worth the time. If the options are that close, you can’t be that wrong whichever you choose; better to spend that time on other things.
So, when you are in these fifty fifty situations, instead of going into analysis paralysis, just flip a coin to decide!
A more fleshed out answer is in How to Decide in the section on a sheep in wolf’s clothing.
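A toy numeric way to see the "you can't be that wrong" point (all numbers invented): when two options' expected values sit this close together, the gap is smaller than the error in the forecasts feeding them.

```python
# Invented payoffs and probabilities: two close options under uncertain forecasts.
ev_a = 0.52 * 100 + 0.48 * (-50)   # = 28.0
ev_b = 0.50 * 100 + 0.50 * (-50)   # = 25.0
print(ev_a, ev_b, "gap:", ev_a - ev_b)
# A 3-point gap built on probabilities you can't estimate to within a few
# points anyway -- so flipping a coin and spending the saved time elsewhere
# is a reasonable policy.
```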
Thank you Annie for your reply. Much appreciate it.
Thanks for opening up this discussion, Annie! Just finished Thinking in Bets and about to start How to Decide.
I’ve got two questions on my mind:
1. Do you have any thoughts or recommendations on decision-making frameworks for people with ADHD? I’m in the (arduous, labyrinthian) process of getting a diagnosis as an adult, which is making me more acutely aware of particular biases that I may either overlook or willfully (and irrationally) accept.
2. Do you have any suggestions on, or can you recommend resources for, assembling a decision-making support group? I’m a hard introvert when it comes to online communities, so I’d love to find a very supportive guide for forming such a group online and/or a guide for starting an in-person group with strangers. (I don’t have enough friends in the area who are convinced of this premise, though I have already recommended your books.)
1) Checklists, checklists, checklists, and rubrics! It is hard even for someone who does not have ADHD to operate well without these; they are even more beneficial for someone who does.
2) Yeah, so I'm a bit in the same boat. That being said, I think if you can get into some cohort classes, that will naturally put you with like-minded people interested in improving decision making. I found it so much easier to form groups back when I was in school. I think creating natural collisions with people makes it much easier for introverts.
I find this question fascinating as somebody who leads in schools for a couple of reasons:
1) We find that almost every school operates with an implicit model for the practice of its teachers (good people off in classrooms making an enormous number of daily practice decisions at random) rather than an explicit practice framework. This robs the school of the chance for its teachers to practice consistently.
2) It really did get me thinking about the volume of bets (microdecisions) that teachers/principals make every day and how it must be among the highest for any profession.
This is going to sound a bit morbid. As I get older I find myself thinking more about the decisions I am making throughout the day than ever before. Prior to starting any project or devoting my energy and time toward anything, I think about the "end point" of moving forward and work backward. Is this worth my time? Am I getting any value out of what I'm about to start? My time is valuable and I don't want to waste any of it. Go/no-go decision on anything is hard given how little time we all have on this planet.
I don’t think that is morbid. That is smart decision making. Life’s too short to spend your time on things that aren’t worthwhile.
I’m curious if you’re focused on energy and time only as your own inputs, or if you also consider the “return” you might get in energy and time.
I have one older friend who has fully embraced a “return on energy” framework for making decisions, including what projects to work on and whom to spend time with. She reports that thinking about how energized she might feel by doing something provides a better metric compared to monetary return. (She’s even repositioned her consulting practice on this premise.)
So, your focus on time and energy seems quite valid for a personal decision-making framework.
For me it is always what excites me. As I look at the end result, will I be happy I did the work, or will I wish I had walked my dog instead?
Unrelated but maybe related….I have been reflecting lately on the times I’ve gone about my pursuits (education and music performance) in methodical ways, and the times I’ve gone about them in “intuitive” ways, for lack of a better term. Although music is an art, there are principles that apply. Same with teaching. I was struck today when I watched the video on Monty Hall and the selection of doors. I totally resisted the explanation of the logic - until I didn’t! And it’s caused me to reflect on times when I’ve prepared for music performance (made decisions) using methodology and known, tested principles - even in art, proper decision making wins the day. At least at my level it does.
Love this! And, yes, so easy to resist the answer to the Monty Hall Problem because it doesn't "feel" right...until it does. Then once you get it you can't unsee it. I like the analogy to music/art.
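For anyone who still resists the Monty Hall logic, a quick simulation can help it click. This is a sketch in Python, assuming the standard rules (the host always opens a non-chosen, non-prize door); switching wins about 2/3 of the time.

```python
# Monty Hall simulation: compare the win rate of staying versus switching.
import random

def play(switch: bool, rng: random.Random) -> bool:
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that is neither the contestant's pick nor the prize.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(42)
n = 100_000
stay = sum(play(False, rng) for _ in range(n)) / n
swap = sum(play(True, rng) for _ in range(n)) / n
print(f"stay: {stay:.3f}   switch: {swap:.3f}")   # ~0.333 vs ~0.667
```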
Thanks! Your observation on the need to make explicit what's already implicit in your decisions fits in nicely with the Information Matrix I developed. http://instituteforfinancialtransparency.com/2018/05/14/yale-professor-basically-you-dont-understand-the-crisis/
Love the paper and plastic bag analogy. Excellent way to think about who has the information advantage.
The bag analogy is so useful and immediately made me think of the Long Term Capital debacle!
Thanks Annie,
I'm reading Quit right now.
Best,
Andrew
Agree. Both of these are reinforced if the individual doesn't directly suffer the consequences of their decision.
Great point!
To me the most terrifying aspect of my (human) nature is my ability to make terrible decisions despite obvious evidence to the contrary. Evidence at the time of the decision - not only outcome evidence.
I quit a job at the tail end of covid with an airline after 25 years in the hope of a better lifestyle with another one. The problem with being a pilot is that when you quit, you go from senior captain to bottom of the list first officer and it takes a long, long time to work your way back up. So, here I am, having quit, and desperately regretting it. There's no lifestyle and there's a 66% salary cut. Never mind the other ugly surprises of the new job.
I guess I didn't do a proper pre-mortem, despite reading about them in How to Decide.
I also disregarded the advice of someone about whom I have a negative opinion (for having acted against my and our family's interests before) and overweighted the advice of someone I respected (a professional counsellor).
I'm so sorry you are feeling these regrets. My book QUIT: the Power of Knowing When to Walk Away might help you with these types of decisions in the future.
I do think you have it right in this case. For these big, high-impact, irreversible decisions, also slow down, do a premortem, get outside opinions independently, look for base rates, set kill criteria, etc. It might help to make yourself a checklist for the future. I find it helpful!
Loved the Lit Video version of Thinking in Bets. Since then I've noticed an interesting dynamic: 1. Think of the win. 2. Guess the probability. 3. List aspects related to a true win. 4. List the probabilities of each aspect. 5. Calculate the overall probability. Step 5 is very different from step 1. Going through this process helps make a better bet. Does this resonate?
Absolutely! It makes you make explicit why you are making the forecast you are. When you actually take the time to break down what is going into the forecast or judgment, it will improve accuracy, especially if you write it down and know you will check back in on it later, after you see what the results are. Love your process.
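A toy version of that five-step process in code, with the component events and all numbers invented for illustration; note that simply multiplying is only fair if the components are roughly independent.

```python
# Gut-level probability for "the win" versus the probability implied by the
# pieces that all have to go right. Components and numbers are made up.
gut_estimate = 0.60

components = {
    "product is finished on schedule": 0.80,
    "launch partner signs on":         0.70,
    "early users convert":             0.55,
}

implied = 1.0
for name, p in components.items():
    implied *= p

print("gut estimate:         ", gut_estimate)        # 0.60
print("implied by components:", round(implied, 3))   # 0.308
```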
I would like to know if there is a way to communicate uncertainty to people who are certain of their beliefs, and where their certainty is likely false or overstated/too high.
For example people who are not experts in a field, have little actual knowledge in that field, and whose beliefs contradict the beliefs of a lot of experts in that field.
The best way I have found is to ask if they want to bet on it. It usually gets them to back off of certainty to a place where they start examining possible gaps in their knowledge. "Wanna bet?" gets people to start interrogating their belief, asking themselves something like:
1) What does the person challenging me to the bet know that I don't know?
2) How sure am I of this belief?
3) What am I missing?
I've found this works really well!
Don’t some (many?) people bet impulsively, though?
Of course! Offering to bet will not work perfectly but it does get people at least thinking about what they do and do not know, more so than if they didn't think of it as a bet.
Thank you! That probably was in Thinking in Bets; I may need to re-read it, I read it a couple of years ago and there are probably other oddments I have forgotten since as well as that one.
I’ve been thinking a lot about the relationship between your rational brain and your embodied brain in decision-making. You might sub “cold cognition” for rational and “hot cognition” for embodied. Or conscious and subconscious. It came to mind when reading your piece today. Our cold cognition allows us to solve problems that no other animal can. Yet it also can serve as a barrier to good decision-making, one that the pigeons in your article today weren’t burdened with. I read recently that people who have suffered brain injuries that destroy their embodied brain but leave their rational brain completely intact are very capable of perceiving options in a given situation, but completely incapable of making a decision of any kind. I find that fascinating.
So interesting. I think the not switching is a system 1 (hot cognition) problem that has to do with bias. That being said, system 2 for most of us can't properly calculate the probabilities, which pigeons seem to easily do.
This reminds me of some work from Randy Gallistel, also on pigeons. He had two people sitting on adjacent park benches throwing bread to pigeons. One of the people threw bread at four times the rate of the other. The pigeons very quickly formed groups in the same proportion as the rate of bread throwing: four times as many pigeons queued up in front of the faster bread thrower, and even though individual pigeons might move between the two groups, the ratio stayed the same.
In some ways, system 1 is more biased. But System 2 is slow and can still be error-filled. And humans are just pretty bad at probabilities. Worse than pigeons in many ways!
Hmm, I was initially thinking, like you, that the failure to switch was a hot cognition problem, because of the emotional attachment to the initial guess. But then I thought it might actually be a cold cognition problem because it is essentially a sunk cost bias. It certainly seems right that our hot cognition struggles badly with probabilities; I would extend that to weighing things in general (see sunk cost bias above for an example, or how bad we are at perceiving exponential change).
And by the way, these pigeons are awesome! How cool is it that they congregate in proportion to the food? I wonder if there is any connection to the sensory or thought process that allows them to fly in unison or formation.
I believe the rules that allow them to fly in formation are local rules (what is the bird in front of me doing?). I wonder if it is a local rule that allows them to do these proportions as well.
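A small agent-based sketch (all parameters invented) suggests a local rule could be enough: if each pigeon only compares crumbs-per-bird at its own bench with what it would get by joining the other group, and occasionally switches, the flock drifts toward the 4:1 split. Whether real pigeons use anything like this rule is an open question; the point is only that matching the ratio doesn't require any bird to know the global proportions.

```python
# Toy agent-based sketch of the bread-throwing anecdote. Feeding rates,
# flock size, and switch probability are invented for illustration.
import random

rng = random.Random(7)
RATE = {"fast": 4.0, "slow": 1.0}          # bread per time step at each bench
pigeons = ["fast" if rng.random() < 0.5 else "slow" for _ in range(200)]

for _ in range(2000):
    counts = {b: pigeons.count(b) for b in RATE}
    for i, bench in enumerate(pigeons):
        other = "slow" if bench == "fast" else "fast"
        # Local signal only: crumbs per bird here versus what it would get there.
        here = RATE[bench] / max(counts[bench], 1)
        there = RATE[other] / (counts[other] + 1)   # +1: it would join that group
        if there > here and rng.random() < 0.05:
            counts[bench] -= 1
            counts[other] += 1
            pigeons[i] = other

print({b: pigeons.count(b) for b in RATE})   # roughly 160 fast-bench, 40 slow-bench
```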
I think more in terms of System 1 and System 2. Most bias is a system 1 issue although bias is an issue that influences more deliberative thought. The pigeons are obviously in system 1. They are just being reinforced for choosing to switch over time. Humans don't seem to learn from the reinforcement in the same way.
Annie, do you think Herb Simon's notion of "bounded rationality" might explain some of the issues with switching systems? IOW: the complexity of the problem space + time constraints + cognitive biases + limits of knowledge.
For sure. Satisficing allows us to go fast (which is necessary for survival) but sometimes makes us go too fast, stopping us from switching systems. 100% relevant. When we think of heuristics, they are really simplifiers, a way to deal with complexity and the limits of our knowledge, which is sometimes good (shortcuts are often good enough and speed us up) and sometimes bad, causing errors we can't tolerate.
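A tiny illustration of satisficing as a heuristic (made-up scores and threshold): accept the first option that clears an aspiration level instead of paying to evaluate every option.

```python
# Invented example: "score" stands in for a costly evaluation of an option.
options = [62, 71, 88, 93, 79, 95]

def satisfice(options, threshold):
    evaluations = 0
    for score in options:
        evaluations += 1
        if score >= threshold:
            return score, evaluations   # good enough: stop searching
    return max(options), evaluations    # nothing cleared the bar: take the best seen

print(satisfice(options, threshold=85))   # (88, 3): good enough, and fast
print(max(options), len(options))         # (95, 6): better, but paid the full cost
```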
I LOVED your books, Thinking in Bets and Quit. I'd be tremendously honored to have you come on my podcast, Meditations with Zohar; if you're interested, my email is zoharatkins at gmail dot com.
Here's a metaphysical question: why is time structured in such a way that our decisions are irreversible, and what would it do to our lives if we could go back and select the other options, as in sci-fi movies like Back to the Future?
Does meaning of life derive fundamentally from the combination of uncertainty about how our actions will turn out and the irreversibility of our decisions?
This is so philosophical! Some decisions are reversible...not in the sense that you can literally get the time back and go back and completely start over but in the sense that you can quit what you are doing and, sometimes, go back and select an option you previously rejected. Paying attention to how reversible a decision is is actually a key decision skill!
As for the meaning of life, I have no idea! I'm still trying to figure that out for myself :)