The Illusory Truth Effect
The more something is repeated, the more true it feels. And social media is an information repeating machine.
Amid a flurry of research into the dangers posed by misinformation on social media, a new study confirms that the repetition of misinformation not only leads us to believe untrue things are true, but also mutes our moral judgments.
The illusory truth effect
Our decisions can only be as good as the accuracy of our beliefs. We may intend to rationally evaluate evidence to determine what’s true and what’s not, but, in reality, we use a lot of shortcuts. One of those shortcuts is how often we have heard or read something.
The more something is repeated, the more true it feels. This is known as the illusory truth effect.
In 1977, Lynn Hasher and colleagues demonstrated that “the repetition of a plausible statement increases a person’s belief in the referential validity or truth of that statement.” In other words, as they said, “if people are told something often enough, they'll believe it.”
Tell someone often enough that elephants are the only mammals that can’t jump and it will just feel true. (Look it up. It’s not!)
A 2010 meta-analytic review of more than 50 studies concluded that “the finding that repeated statements are believed more than new ones has been obtained under many conditions” and “appears to be very robust.”
As Daniel Gilbert said in 1991, summarizing a multitude of research literatures (and four centuries of philosophical debate on belief formation and updating),
People are credulous creatures who find it very easy to believe and very difficult to doubt. In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.
The illusory truth effect and processing fluency
The illusory truth effect is an artifact of the relationship between processing fluency and “truthiness.” Processing fluency refers to the ease with which you can understand a message. The consensus among researchers is that repetition is one of the ways to increase the ease with which statements are processed, and that ease is in turn used as a shortcut to infer accuracy.
That means simpler messages will generally feel more true than complicated or nuanced ones because simpler messages are easier to understand. And it also means that messages you have heard before will feel more true.
Oft-repeated things, even big lies, are easier to process and feel more truthy simply because you have heard them before.
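To make the mechanism concrete, here is a toy sketch in Python (my own illustration, not a model from any of the research discussed here); the function names, the logarithmic fluency curve, and every number in it are made-up assumptions chosen only to show the shape of the effect:

```python
import math

def fluency(exposures, base=0.3):
    # Toy assumption: fluency rises with each repetition,
    # with diminishing returns (logarithmic growth).
    return base + math.log1p(exposures)

def feels_true(exposures):
    # Toy assumption: the mind reads fluency as a truth signal;
    # squash it into a 0-to-1 "feels true" score.
    return 1 / (1 + math.exp(-(fluency(exposures) - 1.5)))

for n in [0, 1, 3, 10]:
    print(f"{n:2d} exposures -> feels-true score: {feels_true(n):.2f}")
```

The diminishing-returns curve is deliberate: it means the first repetition does the most work, which foreshadows a finding discussed below, that a single prior exposure is enough to move the needle.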
Social media: a message repeating machine
In the last decade or so, there has been a flurry of research on the illusory truth effect as it relates to social media, and the results have been alarming.
Unless you’re completely oblivious to the burgeoning influence of social media, you can probably intuit the potential dangers from the spread and repetition of false information. Social media is an information repeating machine. Which means it is a machine perfectly designed to create illusory truth.
That’s all bad. But it gets worse.
The illusory truth effect can make us abandon previous beliefs…that are true
The pre-Twitter/pre-fake-news studies on the illusory truth effect usually came with a caveat: the repeated statements were ones whose truth would be unclear to most people. Otherwise, the thinking went, people would judge the truth of the information based on their own knowledge rather than the fluency created through repetition.
Elephants are the only mammals that can’t jump. Jimmy Carter is the second president from the state of Georgia. Delhi is the most populous city in the world.
Most people would not be sure whether these statements were true. (None of them are.)
Studies of the illusory truth effect purposely used these types of statements, where the truth would be less certain, because the researchers assumed that if a statement was known to be false, repetition wouldn’t have an effect.
Doesn’t that limitation seem like common sense? After all, our information processing is flawed, but we’re not going to start believing something we know to be false just because it pops up a lot. Right?
Wrong.
Unfortunately, we’re a lot more susceptible to manipulation by repetition than researchers originally thought.
In 2015, Lisa Fazio and colleagues tested the prior supposition that the illusory truth effect doesn’t occur when people know better, based on their previous knowledge. The title of the paper, “Knowledge Does Not Protect Against Illusory Truth,” tells you pretty clearly what they found.
You might know that Jimmy Carter is the only president from Georgia. But if you hear that he is the second president to hail from that state enough times, you will start to believe it.
Lies, even big ones, when repeated enough take the place of what we know to be true.
The illusory truth effect applies to (and explains the dangers of) fake news
After the attention given to the phenomenon of fake news during the 2016 Presidential election, Gordon Pennycook and colleagues examined whether the illusory truth effect extends to fake news.
In experiments with 1,800 participants total, using actual fake news headlines seen on Facebook, they found that, “despite being implausible, partisan, and provocative, fake news headlines that are repeated are in fact perceived as more accurate.”
Just about every detail of their findings confirms the worst about social media’s potential for manipulation and promotion of inaccurate beliefs.
A single prior exposure was enough to increase perceived accuracy.
Repetition increased the belief in accuracy even for items that participants were not consciously aware they’d been exposed to.
More exposures also led participants to consider an item more true.
Putting a warning with the headlines that they had been disputed by third-party fact-checkers made little difference.
The effect even occurred with headlines that conflicted with the participants’ political ideology.
Repetition interferes with judgments of truth AND our morality
The Fazio and Pennycook studies are just the tip of the iceberg of recent work on the illusory truth effect. Raunak Pillai, Lisa Fazio (see above), and Daniel Effron recently published a paper demonstrating that repetition doesn’t just create truthiness. It also blunts the feeling that a bad act is morally or ethically wrong, a phenomenon they call the moral-repetition effect.
The researchers mimicked the experience of repeated exposure to a viral news story by texting participants real and made-up news headlines about corporate wrongdoing over a period of two weeks. In addition to replicating the illusory truth effect, they found that participants judged the wrongdoing as less unethical with repeated exposures.
That means the more we hear about wrongdoing, the more we will believe it…but the less we will care about it.
Both aspects of repetition-induced bias interfere with the accuracy of our perception of the world. Good decision-making depends on developing and updating the accuracy of our beliefs. What’s true or untrue? What’s the probability of various outcomes based on that information? What level of culpability or responsibility attaches to various actions?
Both the illusory truth effect and the moral-repetition effect make the foundation of our decisions, the beliefs we hold, shaky. Social media amplifies these effects.
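One way to see why this is so corrosive to decision-making: a careful belief updater should treat ten repetitions of the same claim from the same source as barely more evidence than one, but the illusory truth effect behaves as if each repetition were an independent confirmation. Here is a hedged sketch of that contrast (again my own illustration, with a made-up prior and a made-up evidential weight per report):

```python
def bayes_update(prior, likelihood_ratio):
    # Standard Bayesian update in odds form:
    # posterior odds = prior odds * likelihood ratio.
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

prior = 0.2          # assumed initial credence that the claim is true
lr_per_report = 2.0  # assumed evidential weight of one independent report

# Careful updater: ten repetitions of the SAME report carry no new
# information, so only the first exposure should move the needle.
careful = bayes_update(prior, lr_per_report)

# Repetition-driven updater: each repetition is (mistakenly) treated
# as an independent confirmation, so the updates compound.
repetition_driven = prior
for _ in range(10):
    repetition_driven = bayes_update(repetition_driven, lr_per_report)

print(f"careful credence after 10 repetitions:           {careful:.2f}")
print(f"repetition-driven credence after 10 repetitions: {repetition_driven:.2f}")
```

On these made-up numbers, the careful updater lands at a credence of 0.33 while the repetition-driven one ends up near certainty. That is the illusory truth effect in miniature.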
Maybe that’s why the retraction is never as loud as the accusation. The accusation echoes repeatedly. The retraction lands with a thud only once. There’s no competition in that. Not cognitively anyway.
In the abstract of How Mental Systems Believe, Dan Gilbert suggests that "(a) the acceptance of an idea is part of the automatic comprehension of that idea and (b) the rejection of an idea occurs subsequent to, and more effortfully than, its acceptance." Believing is essential to completing the process of comprehension; disconfirmation is effortful and comes afterward. That shows how much harder disconfirmation is, even when warranted, and how important it is.
Adam Grant made up something fictional called the Sarick Effect in his book Originals to illustrate the impact of the exposure effect. Borrowing from Grant, it's a case of familiarity breeds comfort and unfamiliarity breeds contempt.
Perfect timing with this: I just saw it happen in real time on Nextdoor. Someone posted an article from a satire site - but the person didn't realize it was satire. Neither did most of the commenters. The more people cheered the comments, the more they dug in and refused to consider that it was false. (It was about a singer supposedly refusing $1 million to sing at the Super Bowl.)
Not only did they refuse to admit it wasn't true - but they started turning on previous singers for accepting the money. Yet according to every reference I've found, performers only get expenses and union wages. Perhaps that's wrong, but I couldn't find any evidence to the contrary.
What horrified me most was the comment "I don’t waste my time finding out if it’s a lie or truth. Who cares. I like [the singer] anyway"
I struggle with how to encourage people to examine information and articles like that, or to even WANT to examine whether they're true. Most were like the person I quoted above: they didn't even care if it was true; it matched their expectations, so that's all that mattered. While I'm curious how they will react when the actual performers are announced (assuming it's not the person mentioned), I also don't want to stir up that hornet's nest again.