This piece appeared in The Atlantic in September. I love it, so I thought I would share it here.
Siobhan O’Keeffe, one of tens of thousands of runners in the 2019 London Marathon, noticed that her ankle started hurting four miles into the race. Despite the worsening pain, she kept running. Four miles later, her fibula snapped. Medics bandaged her leg and advised her to stop running, but she refused. She finished the marathon, running the last eighteen miles in nearly unbearable pain.
Her story sounds fantastical. Running eighteen miles on a broken leg stretches the limits of believability; even her orthopedic surgeon said as much. But what might be more unbelievable is that stories like hers are not uncommon. In fact, that same day, at the same point in the race, another runner, Steven Quayle, broke his foot. He kept running, through pain so bad that, during the final ten miles, he had to stop four or five times for medical assistance. But he, too, finished the race.
There is no central registry of such incidents, but they happen a lot. A quick Google search turns up many other stories of distance runners around the world suffering horrifying injuries and still finishing their races.
These stories are so hard for us to swallow because, I think, everyone shares the intuition that if you were going to break your leg at the beginning of a marathon, you wouldn’t start in the first place. Dealing with pain and discomfort is expected in distance running, but I think we all agree that if we snapped our fibula at mile eight, we would stop.
Likewise, if we received equally clear signs we were in a losing situation in a job, or a relationship, or an investment, we would quit and do something else.
On the other hand, I’m guessing that part of you admires those distance runners for sticking it out. We look at these types of stories and think, “I wish I had that kind of grit.”
While grit can be an admirable quality, it isn’t an unqualified virtue. Apart from the pain these runners were in, they were risking permanent physical damage, or at least worsening their condition enough to cost them future opportunities to run.
This is the issue with grit. It can get you to stick to hard things that are worthwhile, but it can also get you to stick to hard things that are no longer worthwhile – like after your fibula snaps at mile eight. Throughout our lives, it turns out we are not very good listeners when the world tells us that we should stop.
There are a host of reasons why we ignore obvious signals to quit, not the least of which is that quitting has a nearly universal negative connotation. If someone called you a quitter, would you ever consider it a compliment?
We view grit and quitting as opposing forces and, in the battle between the two, grit has clearly won. “Quitters never win, and winners never quit.” “If at first you don’t succeed, try, try again.” Or even what we all learn as children by reading The Little Engine That Could: “I think I can. I think I can. I think I can.” The popular aphorisms boil down to grit as a virtue and quitting as a vice.
This is even reflected in the way the English language itself favors grit. If you’re gritty, you’re steadfast, determined, resolute, or unwavering. You might even be considered heroic or courageous. Meanwhile, quitting means failing, capitulating, or giving up. Quitters are losers. Quitting shows a lack of character.
It’s high time we rehabilitated quitting. Quittiness is at least as important a decision skill to develop as grittiness, because quitting gives us the ability to act as new information reveals itself. The world changes. Our state of knowledge changes. We change.
Imagine if you had to stick with the first thing you ever did. It would be almost impossible to start anything, because you wouldn’t have the option to stop. It’s the option to quit later that allows you to go on a first date, even though you don’t know how it will turn out. It’s the option to quit that allows you to take a job even though you don’t know if it will be the right fit. It’s quitting that allows you to start a marathon, even though you might injure yourself in the middle.
We are able to make up our minds amid constant uncertainty because, once we learn any of these things aren’t right for us, we have the option to quit. That’s why, if I had to pick one skill to make somebody a better decision-maker, it would be quitting: the ability to walk away is what allows you to react to a changing landscape.
The problem is that, too often, we persist too long. In fact, when we get bad news, we frequently do the opposite of what our intuition tells us we would do. Instead of quitting, we double down, escalating our commitment and further entrapping ourselves in a losing cause. We see it everywhere: people staying too long in bad relationships, bad jobs, and bad careers; businesses continuing to develop and support products that are clearly failing, or persisting long after conditions have changed; a nation sticking for decades with an unwinnable war.
*****
Nearly half a century of science has identified a host of cognitive forces that contribute to our inability to quit after getting bad news. The most well-known is the sunk cost fallacy, first identified as a general phenomenon by Nobel Laureate Richard Thaler in 1980. It’s a systematic cognitive error in which people take into account money, time, effort, or any other resources they have previously sunk into an endeavor when making decisions about whether to continue and spend more. In other words, we throw good money after bad.
The fear that you’ll have wasted what you’ve already put into something if you don’t continue causes you to invest more in a cause that’s no longer worthwhile. That behavior is widespread, whether it’s continued commitment to a public works project that has gone off the rails, or people refusing to quit jobs they no longer like or to leave relationships that have turned toxic.
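To see the fallacy in decision terms, here is a minimal sketch in Python. The dollar figures and the “bias weight” are made up for illustration; nothing here comes from Thaler’s work. The point is simply that a rational rule looks only at future costs and benefits, while a sunk-cost rule lets prior investment tip the scale toward continuing.

```python
# A toy model of the sunk cost fallacy. All figures and the bias
# "weight" are illustrative assumptions, not from the article.

def rational_continue(future_cost, future_value):
    # A forward-looking decision: sunk costs are irrelevant.
    return future_value > future_cost

def biased_continue(sunk, future_cost, future_value, weight=0.5):
    # The fallacy: prior investment is felt as something quitting would
    # "waste," effectively subsidizing the case for continuing.
    return future_value + weight * sunk > future_cost

# A project that has already burned $80k, needs $50k more to finish,
# and is now expected to return only $30k:
print(rational_continue(50_000, 30_000))        # False: quit
print(biased_continue(80_000, 50_000, 30_000))  # True: good money after bad
```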
Another commonly known error that keeps people from quitting is status quo bias, introduced in 1988 by economists Richard Zeckhauser and William Samuelson. In comparing options, individuals and enterprises overwhelmingly stick with the one representing the status quo, even when it is demonstrably inferior to the option representing change.
There are also cognitive and motivational issues surrounding our need to have our beliefs (and actions corresponding to them) validated as correct and consistent, and to be seen in that way by others. We don’t like it when our beliefs clash with new information that suggests that a belief that we hold is inaccurate. The uncomfortable feeling that we get is known as cognitive dissonance, the concept pioneered over sixty years ago by Leon Festinger. We can resolve that dissonance by changing our minds or rationalizing away the new information. Too often, as established in hundreds of studies, we choose the latter.
Our beliefs are really hard to quit.
*****
In 2013, economist Steven Levitt, coauthor of the bestseller Freakonomics, put up a website inviting people who visited to flip a virtual coin to help them make a close decision about whether to quit or stick. You might be skeptical that people would go to a website to flip a coin to help them make a major decision. But 20,000 people over the course of a year actually did this, including nearly 6,000 considering a life-changing decision like whether to quit their job, or go back to school, or retire, or end a relationship.
Obviously, these people must have felt that the choice between quitting and persevering was so close, so 50-50, that flipping a coin to help them decide seemed like a reasonable option. It stands to reason that if these decisions were, in reality, as close as the coin flippers felt they were, the flippers would end up equally happy on average whether the coin landed heads or tails, whether they ended up sticking or quitting.
But this isn’t what Levitt found. When he followed up with the coin flippers two and six months later, he discovered that the people who quit were happier, on average, than those who stuck. While the decisions may have felt close to the people making them, they were not actually close at all. As judged by the participants’ happiness, quitting was the clear winner. That meant that they were getting to the decision too late, long after it was actually a close call. Given all the cognitive biases that work to prevent us from quitting on time, this shouldn’t be surprising.
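A toy simulation makes the logic of that inference concrete. The numbers below are illustrative assumptions, not Levitt’s data: if quit-or-stick really were a coin-flip-close call, the two groups’ average happiness should come out about equal, so a persistent gap favoring quitters implies the decisions weren’t actually close.

```python
# A toy simulation of the logic behind Levitt's result, on made-up
# numbers (not his data).

import random
random.seed(0)

def average_happiness(quit_gain, n=6000):
    quitters, stickers = [], []
    for _ in range(n):
        base = random.gauss(0, 1)     # individual variation
        if random.random() < 0.5:     # the coin flip
            quitters.append(base + quit_gain)
        else:
            stickers.append(base)
    return sum(quitters) / len(quitters), sum(stickers) / len(stickers)

print(average_happiness(quit_gain=0.0))  # a truly close call: means ~equal
print(average_happiness(quit_gain=0.5))  # a real gap: quitters come out ahead
```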
That being said, life is murky, and uncertain environments exacerbate these types of errors. You might suppose that in an environment where the motivation to get the decision right is clear, and where objective data tell us when it’s the right time to quit, we would quit on time. But we don’t. Even in environments rich in motivation and information, decision-makers tend to persist too long.
*****
The NBA and other professional sports leagues offer a unique environment in which to study quitting behavior. Decision-makers in pro sports get a lot of continuous, quick, clear feedback on player productivity. Pro basketball is a data-rich environment, with many objective measures of player performance, constantly being updated. The coach and team management are highly motivated to use the best players in the right situations and, obviously, to win.
In 1995, social psychologist Barry Staw and colleague Ha Hoang looked at escalation of commitment and quitting following one of the common, high-stakes decisions made by expert sports executives: The NBA draft. Did a basketball player’s draft order affect later decisions – independent of their on-court performance – on their playing time, likelihood of being traded, and career length?
They analyzed players and their draft order in the 1980–1986 NBA drafts, nine measures of player performance, minutes of playing time for five seasons, career length, and whether a player was traded.
It turned out that draft order did have an independent effect on future playing time and roster decisions. “Results showed that teams granted more playing time to their most highly drafted players and retained them longer, even after controlling for players’ on-court performance, injuries, trade status, and position played.”
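Their method boils down to a regression: model playing time as a function of performance, then check whether draft order still carries weight. Here is a stripped-down sketch of that logic on synthetic data; every number below is an assumption for illustration, not a figure from the study.

```python
# A stripped-down sketch of the Staw & Hoang logic on synthetic data.
# The question the regression answers: does draft order still predict
# minutes played once on-court performance is controlled for?

import numpy as np

rng = np.random.default_rng(1)
n = 500
performance = rng.normal(0, 1, n)     # composite on-court performance
draft_order = rng.integers(1, 61, n)  # 1 = first overall pick
# Build minutes so they depend on performance AND, irrationally, on draft order:
minutes = 1500 + 300 * performance - 8 * draft_order + rng.normal(0, 100, n)

# Ordinary least squares: minutes ~ intercept + performance + draft_order
X = np.column_stack([np.ones(n), performance, draft_order])
coef, *_ = np.linalg.lstsq(X, minutes, rcond=None)
print(coef)  # the draft_order coefficient stays near -8 even with performance
             # in the model: the "independent effect" the study reported
```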
This is where you can clearly see the effect of cognitive errors like the sunk cost fallacy. Spending a high draft pick to acquire a player burns a valuable, limited resource. Benching or trading or releasing such a player, despite performance data justifying it, feels tantamount to wasting that resource, so those players get a lot more chances than players drafted lower who are playing as well or better.
These findings can’t be dismissed as a relic of the pre-Moneyball era. Economist Quinn Keefer has conducted several field studies since the mid-2010s on the effects of draft order and player compensation on playing time in the NFL and the NBA. Although the effect sizes were diminished, they were still significant, replicating the original findings from the 1980s and 1990s.
This raises the question: If pro sports teams, with their armies of analysts and constant pressure to win, are making this error, what’s happening in our everyday lives? What relationships are we staying in too long? What employees are we holding onto that we should be laying off? What jobs are we staying in that we should be walking away from? What beliefs do we cling to that we should have updated long ago? What marathons are we continuing to run with a broken leg?
*****
What makes quitting so hard is that, if we quit, we fear that we will have failed and wasted our time, effort, or money. When we worry that quitting means failing, what exactly are we failing at? If we quit something that’s no longer worth pursuing, that’s not a failure. That’s a success. We need to start thinking about waste as a forward-looking problem, not a backward-looking one. That means realizing that spending another minute or another dollar or another bit of effort on something that is no longer worthwhile is the real waste.
We need to redefine failure. We need to redefine waste. But ultimately, we need to rehabilitate the very idea of quitting.
Contrary to popular belief, winners quit a lot. That’s how they win.
It’s a complicated problem, isn’t it? Kahneman and Tversky showed that people rarely think in mathematically logical ways. A host of decision and perception biases even have names, but naming them still doesn’t stop people from exhibiting them. And then there’s selective memory: the relationship is bad, but (since we’re listing rationalizations for not quitting) “better the devil you know.”
For me, relationships are probably the clearest window into my counterproductive self. It became clear to me about 20 years ago that I would know a relationship was doomed quite a while before I could let go. I also knew that something had to happen for my heart to catch up to my mind. Once that event happened (it was always different, but always recognizable), things would be officially over and done with. And so it has gone: romantic situations, work situations, inventions I made and fell in love with. Sooner or later, I hit walls.
But there’s a flip side to that coin, too. The idea of “fail early” in business has been clearly debunked; what seemed like a great idea when I first heard it isn’t so great after all. People who fail early typically have little more than a longer string of failures behind them, while those who have started successful businesses are in high demand. This is certainly true in the medical device space, which has a 90-95% mortality rate. I started a company in ’96 that, by any criterion, succeeded. Now, trying to start another one, the mere fact that the first company exited successfully gets me in all kinds of doors. My background is in science and medicine, not business; this has been a pleasant surprise to me.
But why should having succeeded once be such a big deal? Lots of possible reasons present themselves: the basic idea was a winner; I could tell when a bad idea was creeping in and steer away from it; we were able to solve the inevitable problems that arise in every company that eventually succeeds; and more. While all this was happening, we were also dealing with a coworker who was eventually fired for cause. That, not the R&D, was the biggest go/no-go factor in whether we remained viable.
Surgery, like battle, suffers from the maxim that no battle plan survives its first encounter with the enemy. Quitting early when plans aren’t realized is a recipe for being consistently ineffective. It takes a hopelessly naive person to begin with nothing more than Plan A. But even Plans B through Z can fail, so someone who can’t improvise in real time isn’t suited for the job. Wargaming WWII battles shows you how close-run things were much of the time.
Each battle is unique, so falling in love with a formula for winning is a guarantee of failure. Surgery is a little more formulaic, since anatomy and pathology are at least relatively constant from patient to patient, but even so, a surgical plan, like a battle plan, rarely survives its first encounter with the enemy, i.e., the disease process.
I hadn’t thought about this until some observers spent a week or so shadowing me. Before each case began, I’d tell them what I was planning to do so they would be oriented. Toward the end of the week, one asked whether things ever went exactly to plan. It turned out the answer was no, beyond very basic procedures; it hadn’t crossed my mind to think in those terms. Just as a general can’t see around a hill to determine enemy positions and strengths (cannons, mortars, etc.), a surgeon can only see so much through opaque skin. X-rays are essential, but rarely tell the whole story. Adaptability is what marks effective warriors and surgeons. They know when to modify plans and, as the article implies, when to quit.
Hi Annie,
Thank you for this. This is such an overlooked skill, perhaps because, as you say, the word "quit" has a pejorative connotation. Maybe a change in vocabulary would help: the 'art of quitting' is so different from the act itself.
The word volte-face comes to mind. It certainly sounds better, but it is generally used for a change of opinion rather than an act like quitting one's job. Quitting a job is made to seem more consequential than, say, 'quitting' one's stance on a war, though the latter could save the world. So in addition to having the superhuman power of being objective in subjective situations, "artful quitters" are expert at contextualizing and prioritizing, regardless of social trends, which takes a high level of self-knowledge. I suspect much of the 'quiet quitting' trend hitting the labor force is about people in the midst of, or stuck in, this personal transition.
Unlike the 'act of quitting', the 'art of quitting' is difficult because it requires an objective and nuanced understanding of a situation that you are almost certainly invested in, and therefore prone to be highly emotional about. In my studies, "artful quitters" are also fearless and expert at assimilating information through iteration. As in a poker game, each successive hand, whether you bet or fold (quit), becomes a lesson learned for the next hand.
In other words, 'artful quitting' requires an unnatural duality that takes time to master, but those who do attain a kind of magical ability: the ability to leapfrog over the rest of us mortals in the decision-making process. It is also THE mark of leadership.
May these random thoughts serve as inspiration for future posts on the subject.
Thank you,
Celan