A Little (Scientific) Knowledge is a Dangerous Thing
People with an intermediate understanding of science are prone to a menacing combination of overconfidence and anti-science attitudes.
It turns out that the expression, “a little knowledge is a dangerous thing,” is apt when that knowledge is about science. People with intermediate levels of scientific knowledge are a big obstacle to better policy and informed individual decisions, whether the topic is climate change, the safety of vaccines, nuclear power, or a host of others.
The title of a study published in Nature Human Behaviour in September 2023 says it all: “Intermediate levels of scientific knowledge are associated with overconfidence and negative attitudes towards science.”
The day the study came out, Jay Van Bavel tweeted a succinct summary of the breadth of the research and its big takeaway:
This explains the "I do my own research" crowd
Intermediate levels of scientific knowledge are associated with overconfidence and negative attitudes towards science. This is based on four large surveys, spanning 30 years in Europe and the United States and does not rely on self-reporting or peer comparison.
Be wary of overconfident people who know just enough to think they are experts, but not enough to understand the limitations of their own knowledge.
The well-known Dunning-Kruger effect, “in which people wrongly overestimate their knowledge or ability in a specific area,” applies to overconfidence about scientific knowledge. That was the conclusion of a 2018 study on vaccine hesitancy and of other research, but this new paper, with its more encompassing dataset, shows that the relationship is not simple or linear.
The researchers found that people with the least scientific knowledge on a topic are not overconfident. Instead, consistent with the Dunning-Kruger effect, it is those with intermediate knowledge who are most prone to overconfidence … and to the most anti-science attitudes.
The combination of overconfidence and anti-science attitudes among people who know just enough about a scientific topic to do themselves some damage creates a perfect storm for derailing smart policymaking and good individual choices.
Van Bavel’s initial tweet immediately drew a lot of critical replies, which ironically proved the point of the study. Jay noted later that day:
Several replies to this study perfectly show this exact issue! People confidently rejecting this study because they distrust science, and they sometimes add a meme about how science is supposed to work.
These are just a few of the replies he may have been referring to:
“This is exactly the result one would look for it one wanted to make a persuasive case against ANYONE trying to figure things out in lieu of just, y’know, trusting ‘the experts’ implicitly. Sorry; I know ‘experts’ and half are below average.”
“Of course one mistrusts science then they’ve been caught lying about stuff like Corona etc. How can someone trust anything without doing their own research?”
“When I made my first post in defense of natural immunity in 2021, within an hour two friends DM’d about their injuries. That has only grown four fold and includes a death. Keep telling us we’re not smart enough.”
The issue, of course, is that any decision we make can only be as good as the quality of the beliefs that inform it. No matter how good a decision process is, junk-in/junk-out still holds. And when the junk-in is a belief based on, for example, long-debunked, fraudulent data purporting to demonstrate a link between vaccines and autism, that affects some pretty high-stakes personal decisions. And it affects policy, too.
It is true that you can know just enough to hurt yourself.