It's Not Irrational to Have Dumb Beliefs
"Just look at the evidence" and other simple critical thinking quips miss the really hard problems
Here's an annoying fact: There are people out there who disagree with you.
No matter who you are, there's bound to be some important issue where a sizable number of people think you’re wrong. It isn't just that we have different values; people often disagree on matters of fact, especially around issues like climate change, vaccines, and the constant hot-button issue of the correct philosophical theory of consciousness.
Well, that sucks. How do we account for people who disagree with us? If they're smart and believe something different from us, that could mean we're wrong!
Luckily, it's easy to come up with explanations for how they could come to such wrong beliefs. For example, they might be irrational and come to their beliefs through some form of motivated reasoning.
Perhaps, sometimes, that's true. But here's another annoying fact: It's probably best to interpret those who disagree with us as charitably as possible. You're more justified in thinking you're right if you can explain the positions of those who disagree with you without invoking “They’re stupid/evil/irrational”.
And a final annoying fact: Even without motivated reasoning, two rational, Bayesian reasoners can disagree on the implications of a piece of evidence.
So maybe we shouldn't just write everyone else off as irrational, which is really annoying because some people believe such dumb stuff.
Same Evidence, Different Bayesian Updates
In a previous post, I talked about an article that describes gradual exposure therapy as a treatment for anxiety. I criticized this part:
Gradual exposure therapy is a research backed approach shown to help reduce anxiety and treat anxiety disorders. It does this because it literally changes the brain, re-wiring the neural pathways and changing the release of chemicals in the brain.
Disclaimer so I don't mislead anyone here: We have a ton of good evidence that gradual exposure therapy works for various things. My gripe here is just with the article in question, not gradual exposure therapy.
To rehash what I said before: of course gradual exposure therapy changes the brain. Reading this post changes your brain. Anything you remember for more than a minute has “re-wired” some of the synaptic connections in your brain; that’s how memory works. Saying gradual exposure therapy works because it changes your brain is like saying Dostoevsky was a great writer because he put words on pages. Pointing out obvious prerequisites isn't a useful explanation.
When I see this kind of pointless neuro-babble, it raises a big red flag. If someone leans on irrelevant but fancy-sounding neuroscience, there's a good chance they're a quack posting pseudoscience. People rate explanations that contain neuroscience as more likely to be true even when the neuroscience is irrelevant, so hucksters love to stick neuroscience anywhere they can to justify all kinds of nonsense.
But to someone without my hang-ups about gratuitous neuroscience terms being sprinkled into writing, seeing an article that seems to have some scientific backing is likely to increase their confidence in what the article says.
Imagine three (rational) versions of myself reading the gradual exposure therapy article:
Naive Version: No strong beliefs about neuro-babble or gradual exposure therapy. The article nudges my beliefs towards thinking gradual exposure therapy works.
Skeptical Version: Believes neuro-babble is a strong indicator of quackery. The article lowers my trust in the website it's published on and nudges my beliefs towards thinking gradual exposure therapy is pseudoscience.
Balanced Version: Believes this kind of neuro-babble is indicative of quackery, but also strongly believes that gradual exposure therapy works. The article mostly just makes me roll my eyes and has no effect on my beliefs.
The point is, when we assess evidence, a whole host of connected beliefs come into play. It's perfectly rational for people with differing webs of beliefs to come to different conclusions from the same piece of evidence. (If you want examples that explicitly run through the Bayesian math rather than this hand-wavy example, see this paper).
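If you want to see the shape of that in numbers rather than hand-waving, here's a toy sketch of the three readers above. Every number is invented purely for illustration (in particular, I've chosen numbers so that the neuro-babble is only weakly diagnostic for the balanced reader); the only point is that the same evidence, run through different priors and likelihoods, can rationally land in different places.

```python
# Toy Bayes'-rule sketch of the three readers above. All numbers are made up
# for illustration only; the qualitative pattern is what matters.

def posterior(prior_works, p_article_if_works, p_article_if_not):
    """P(therapy works | saw this article), via Bayes' rule."""
    evidence = (prior_works * p_article_if_works
                + (1 - prior_works) * p_article_if_not)
    return prior_works * p_article_if_works / evidence

readers = {
    # Naive: no views on neuro-babble, so a supportive-sounding article counts
    # as mild evidence that the therapy works.
    "naive":     posterior(0.50, 0.60, 0.40),
    # Skeptical: neuro-babble is much more likely to show up around quackery,
    # so the article is evidence *against* the therapy.
    "skeptical": posterior(0.50, 0.20, 0.60),
    # Balanced: finds the neuro-babble only weakly diagnostic and starts out
    # confident the therapy works, so the article barely moves the needle.
    "balanced":  posterior(0.90, 0.40, 0.45),
}

for name, p in readers.items():
    print(f"{name:>9}: P(works | article) = {p:.2f}")
# naive:     0.60  (nudged up)
# skeptical: 0.25  (nudged down)
# balanced:  0.89  (eye roll, essentially unchanged)
```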
The reliability of sources
Notice in the example above how background beliefs didn’t just shape the interpretation of the evidence. They also affected how much each version of me trusted the source.
I suspect this kind of stuff happens all the time! When people accuse others of not looking at the evidence, the question is what evidence. I might get frustrated with anti-vaccine folks because it seems so obvious, if you just look at the scientific literature, talk to the experts, or listen to the guidance of health organizations, that vaccines are safe and effective. But notice—I'm not exactly doing my own primary research here. I'm not conducting a bunch of high-powered, blinded studies. I have certain sources I trust.
If you listen to anti-vaccine folks, it's common to hear them dismiss certain sources of information because they suspect that source is in the pocket of big pharma (note to big pharma if you’re reading this: I accept direct deposit or mysterious envelopes stuffed with cash). Anti-vaxxers aren't able to do their own primary research either, so they're relying on their own sources that they've come to trust.

Someone who has read dozens of articles from various sources explaining the evils of vaccines isn’t going to be persuaded by a single article indicating vaccines are safe and effective. Instead it might just lead them to trust the source that published the article less. You might think that would be an irrational reaction, but it's perfectly rational to trust a source less when you discover it disagrees with a strong belief you hold. If a book I was reading claimed the Earth is flat, I might take the rest of what it says with a grain of salt.
"Who do you trust" is a hard problem. There isn't a one-size-fits-all solution. "Trust the scientists" doesn't work when there are tons of crappy "science news" stories exaggerating the dubious findings of a single published study. "Trust the scientific consensus" is more easily said than done—how do you find out what the consensus is? You would have to trust a source.
We are plopped into this big chaotic world without all of the answers. We are taught stuff in school—and some of it ends up wrong! Our parents tell us stuff—and some of that ends up wrong! We read books and newspapers—and some of them end up wrong! Unless you're lucky and happen upon Cognitive Wonderland, you're never going to find a single source that's guaranteed to always be accurate.
We're forced to muddle through, constantly updating our picture of the world, but knowing how to update requires making judgments on how reliable the information we're getting is. And part of how we judge the reliability of that information is by comparing it to our current beliefs, formed from previous judgments of the reliability of other information. We have to pull ourselves up by our bootstraps to construct our view of the world.
Someone with different beliefs from you might trust different sources. We all follow different paths as we build up our web of beliefs about the world. For someone who grew up in a family and community skeptical of vaccines, anti-vax beliefs are going to have a gravitational pull. They'll place more trust in those websites and authorities that agree with the beliefs they initially gained from trusted family or community members. Those authorities that disagree with their beliefs will be more suspect. None of that is irrational. It's how we're all forced to navigate a world full of uncertainty.
All is not lost
The above feels bleak. It might seem that I'm arguing there’s an equivalence in the justification of all beliefs—that we're all just going to trust the sources that agree with us, so there's no point in talking to each other and no way to get at the truth of the matter.
That's not the lesson here, though.
Here's a reassuring fact: People can and do update their beliefs towards what the evidence tells them.
Despite a lot of noise about a supposed "Backfire Effect", where presenting someone with evidence against their belief leads them to more strongly embrace that belief, the Backfire Effect itself lacks evidence (so I can debunk it without worrying it's going to reinforce itself, yay). Studies showing the Backfire Effect have failed to replicate, and if it does exist, it's weaker and less consistent than once thought.
There's good evidence that people update their beliefs in the correct direction when seeing evidence that disagrees with them, even on charged political issues. Alexander Coppock, in his book Persuasion in Parallel, details studies he conducted on the impact of exposing people to evidence on one side or another of a charged political topic. The studies show small but measurable changes in belief in the direction of the evidence presented.
The changes in beliefs being small makes sense. You shouldn't immediately change your beliefs just because you read one op-ed making the case for a position you think is wrong.
Coppock also replicates a finding from previous studies: people trust a source less if it disagrees with their view—for example, judging a scientific study to be of low quality if its conclusions contradict their beliefs. Some researchers have made a big deal about this fact, characterizing it as an irrational rejection of evidence we don't like. But it isn't irrational; that's exactly what you would expect a rational person to do. A rational person will trust a source less if it says something they believe to be false.
These combined effects are interesting—a source saying something we disagree with makes us trust the source less, but also makes us a bit less certain of our belief. It’s as if our confidence in the belief and our trust in the source trade off against each other.
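If you want that trade-off in toy numbers: treat “the claim is true” and “this source is reliable” as two things we're uncertain about at once, and update both when the source contradicts a strong belief. Everything below is invented for illustration; the point is only the direction of the two updates.

```python
# Toy model: jointly updating on "is the claim true?" and "is this source
# reliable?" after the source contradicts something we strongly believe.
# All numbers are invented just to show the direction of the effect.

from itertools import product

prior_claim_true = 0.95   # we strongly believe the claim
prior_reliable   = 0.70   # we moderately trust the source

def p_contradict(claim_true, reliable):
    """P(source asserts the claim is false | claim status, source reliability)."""
    if reliable:
        return 0.10 if claim_true else 0.70   # reliable sources rarely assert falsehoods
    return 0.40                               # unreliable sources say all sorts of things

# Joint posterior over (claim_true, reliable) after observing the contradiction.
joint = {}
for claim_true, reliable in product([True, False], [True, False]):
    prior = ((prior_claim_true if claim_true else 1 - prior_claim_true)
             * (prior_reliable if reliable else 1 - prior_reliable))
    joint[(claim_true, reliable)] = prior * p_contradict(claim_true, reliable)

z = sum(joint.values())
post_claim_true = (joint[(True, True)] + joint[(True, False)]) / z
post_reliable   = (joint[(True, True)] + joint[(False, True)]) / z

print(f"P(claim true):      {prior_claim_true:.2f} -> {post_claim_true:.2f}")   # 0.95 -> ~0.86
print(f"P(source reliable): {prior_reliable:.2f} -> {post_reliable:.2f}")       # 0.70 -> ~0.43
```

With these made-up numbers, the contradiction mostly gets absorbed by trust in the source, while the belief itself budges only a little—the same qualitative pattern described above.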
Here's another reassuring (but obvious) fact: There are reasonable ways to assess different sources.
Not all sources are created equal, and we judge them on more than just whether they agree with us. If they have reputations to maintain and have competitors that could call them out—like journalists and scientists do—their statements should hold more weight. If what they're saying doesn't make coherent sense or they demonize people who disagree with them, you probably shouldn't put much stock in what they say. We should also be more certain of evidence we've heard from multiple independent sources.
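That last point is easy to put numbers on, too. If each source really is independent, their likelihood ratios multiply, so agreement compounds quickly. Here's a toy sketch with a made-up likelihood ratio of 3 per source.

```python
# Toy sketch of independent sources compounding. The likelihood ratio of 3 is
# invented; only the compounding pattern matters.

def update(prior, likelihood_ratio):
    """Posterior after one report, given P(report | true) / P(report | false)."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

belief = 0.5
for n in range(1, 5):
    belief = update(belief, 3.0)   # each *independent* source is worth an LR of 3
    print(f"after {n} independent source(s): belief = {belief:.2f}")
# 1: 0.75, 2: 0.90, 3: 0.96, 4: 0.99
```

The “independent” caveat is doing the work here: ten outlets republishing the same wire story are closer to one source than ten, and shouldn't multiply your confidence this way.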
This is all obvious stuff, but it's worth taking stock of why we trust the sources we do. It can help us understand where our confidence comes from and develop more sympathy for those who disagree with us on substantive issues.
My point in all of this is that we can't just dismiss those who disagree with us as irrational or as refusing to look at the evidence. People are rightly conservative about changing their beliefs. There's reason to think people are at least a bit receptive to evidence that contradicts what they believe. We should acknowledge that we're all trying to solve the hard problem of understanding a world without an answer book and cut each other some slack. But we should also not be afraid to correct each other.
Texas is in the midst of an entirely preventable measles outbreak, thanks to people having wrong beliefs about vaccines. Some of this is because of bad actors, like Andrew Wakefield, who deserve derision. But the vast majority of people who forgo vaccinations just have incorrect beliefs. We shouldn't shy away from disagreeing with them or confronting these false beliefs head-on, but we also shouldn't demonize them. They're making the choices they think are best for their children. We're on the same team of trying to make sense of this confusing world. Let's try to help them out.