When I look at the world around me, it's striking how many people believe things that are demonstrably false, even though we live in an era where information is more accessible than ever before.
This phenomenon affects everything from personal medical decisions to global policies.
So I can’t help but wonder: why, in an age brimming with data and facts at our fingertips, do so many of us cling to untruths? And more importantly, how can we counteract this and foster a more accurate understanding of reality?
To unravel these questions, we need to delve into the psychological and social mechanisms that shape our beliefs.
Here, I explore seven key considerations about how we think and perceive information, along with practical antidotes to our tendency to succumb to false narratives and beliefs:
Confirmation Bias
The most obvious reason for believing untrue things is confirmation bias: believing things that already fit with our existing worldview. We tend to both search for and remember information that confirms our preconceptions. Very few of us consistently follow people we don’t quite agree with, especially on issues that matter to us. And instead of evaluating claims with an open mind, many of us are rather good at finding the kind of evidence that supports our existing beliefs, and thus reinforces them.
In other words, it’s easier to get someone to believe something false that aligns with their existing beliefs, than something false that doesn’t.
Not only that, but it apparently takes us longer to read something we disagree with than something we agree with, because of the negative feelings it stirs up.
Repetition and the “illusory truth effect”
Even when debunked, lies that get repeated a lot linger and influence us. They leave behind a nagging "what if it's true?"
In a 1977 study, researchers at Villanova University and Temple University first identified a phenomenon called the illusory truth effect: the tendency to believe false information simply because of repeated exposure.
When you hear something for the first time, your brain spends time processing it. Repeated claims, however, get less scrutiny. As a result, we are wired to be more likely to believe something when we encounter it frequently enough.
When it’s relatively effortless for us to process something, we also tend to get the false impression of accuracy. We tend to FEEL that it is correct. And the more we see certain information repeated, the easier it gets for us to process it.
Social media exacerbates this significantly.
A study published in Science (2018) showed that false stories reach people significantly faster than true ones, largely because people, not bots, do the sharing. In fact, the study found that false stories are 70 percent more likely to be shared than true ones and travel roughly six times as fast. Particularly viral false stories can reach 1,500 people 10 to 20 times faster than facts.
“We found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude,” Sinan Aral, professor at the MIT Sloan School of Management and co-author of the paper, told MIT News.
Propagandists (and advertisers) rely on this; it's part of their handbook.
Repeat a lie enough and it becomes the truth.
Truth nugget bombing
As is the case with a lot of false claims and fake news, nuggets of truth are sprinkled throughout. It's rarely the case that EVERYTHING is fabricated. Our brains recognize that certain things are true based on previous experience, and as a result of recognizing these "truth nuggets," we treat the information that's new to us with less skepticism. After all, if so much of the information here is accurate, what are the odds of the claim being false overall? That's how the Trojan horse gets in.
Social consensus
We tend to believe things that a lot of other people believe—particularly if those people are in our tribe. But we must remember that a lot of people believing something doesn't make it true. History offers many examples of masses of people being wrong at once: centuries ago, people thought the earth was flat, and not too long ago, doctors saw nothing wrong with smoking.
Additionally, people, as a whole, feel safer believing the same things as their peers.
As a society, I'm sure we will continue to be wrong en masse. But we should not take mass consensus at face value as confirmation that something is true.
Verify what you can. And for more complex topics, it's best to find experts you trust based on their knowledge, critical thinking, and open-mindedness, and then compare what those who fit those criteria think.
The availability heuristic
The more evidence we see supporting something, the more likely we are to believe it. But false claims often come with plenty of easy-to-access supporting "evidence," and this is where we get into trouble. When certain information is easy to access and understand (posts on social media, say, or breezy articles), we tend to pursue more of it. Higher-quality evidence, by contrast, tends to live in peer-reviewed journals, lengthy reports, and studies and statistics that are difficult to analyze. These are harder to access and more difficult to digest, so naturally we see less of them. We're left with the impression that there's more evidence supporting one thing than another—especially since we're directing our own flow of information. That is, we google whatever we want to find to confirm our assertions.
Emotional reach
A well-told false story tends to resonate more than a poorly told truthful one. Further, emotionally intense delivery makes us remember things better.
It's a tool the likes of Hitler used. He'd show up about 30 minutes late to build anticipation in the crowd, start his speeches quietly and slowly to capture the audience's attention, and then build to an intense emotional fervor.
When people are more emotional, they tend to be far less rational. A useful tactic for a genocidal dictator.
Things that play on emotions like fear or anger tend to be highly effective.
Paying more attention to how things hit you emotionally, and to the tactics being used on you, can be a good defense.
Cognitive Dissonance & Tribalism
If your entire social circle holds a particular belief, expressing a dissenting opinion can jeopardize your relationships within that group. This social pressure often makes it simpler to accept the prevailing "truth" and avoid seeking evidence that might contradict it. By not questioning or digging deeper, you sidestep both the cognitive dissonance that comes from holding conflicting beliefs and the social repercussions of challenging the group's consensus, or of remaining in a group despite knowing it is wrong or that your values don't align.
For example, if everyone in your friend group believes a specific diet is the healthiest option and you come across information suggesting otherwise, questioning their belief could lead to arguments or ostracism. People prefer to avoid conflict, and willful obliviousness makes that easier.
Likewise, we tend to avoid internal conflicts. When confronted with information that contradicts our existing beliefs, cognitive dissonance creates an uncomfortable psychological tension, so instead of changing our minds, we look for ways to rationalize the new information so that it fits our existing worldview. We might dismiss it as false or exaggerated—or avoid it altogether.
If the information threatens our identity or self-image, particularly in the case of long-held beliefs, we're more inclined to protect our sense of self by rejecting it. And if we've invested time, effort, and money in support of our beliefs, admitting we were deceived would mean acknowledging that those investments were wasted—a sense of loss we'd prefer to avoid.
The Antidote
In an age where false information can spread rapidly, it's crucial to take deliberate steps to avoid falling for lies.
Start by being vigilant about repeated claims, and assess them critically based on factual accuracy, not just narrative appeal.
When confronted with extraordinary assertions, demand robust evidence rather than vague statements or references from unreliable sources.
Cultivate a sense of curiosity that drives you to explore information beyond your current beliefs, seeking out high-quality research rather than what is conveniently accessible. Be sure to acknowledge the limits of your knowledge and be wary of the temptation to share information that aligns with your biases without verification. Don’t become part of the problem by spreading lies.
Scrutinize arguments that play on your emotions rather than your rational judgment, and be alert to manipulative tactics used to sway your opinion. Resist the urge to entwine your beliefs with your identity or conform to the beliefs of your social circle.
By becoming more intentional about how (and why) you consume information, you become better equipped to spot lies and avoid mind traps.
Don’t forget to pay attention to repetition.
Lastly, just because you want to believe something doesn't make it true.
Have I missed anything important? Please share in the comments.
Feel free to pass this post along to anyone who might be interested in understanding why even the smartest people can believe false things from time to time.
Order my book, No Apologies: How to Find and Free Your Voice in the Age of Outrage―Lessons for the Silenced Majority —speaking up today is more important than ever.
NOTE TO READERS:
Thank you for keeping me company. Although I try to make many posts public and available for free access, to ensure sustainability and future growth—if you can—please consider becoming a paid subscriber. In addition to supporting my work, it will also give you access to an archive of member-only posts. And if you’re already a paid subscriber, THANK YOU!
☕️ By popular request, you can also support my work by making a one-off donation via Buy Me a Coffee.