The Illusory Truth Effect: Why Your Brain Believes Lies It's Heard Before
Repetition makes false claims feel true — even when you know better. The illusory truth effect explains how propaganda, ads, and social media exploit your brain's shortcuts.

Hearing something twice makes your brain rate it as more true than something you're hearing for the first time. It doesn't matter whether the claim is actually true. It doesn't even matter if you already know the correct answer. This is the illusory truth effect — one of the most reliable findings in cognitive psychology — and it's the engine behind every propaganda campaign, viral myth, and effective advertisement you've ever encountered.
Your Brain Takes Shortcuts. Propagandists Know This.
In 1977, psychologists Lynn Hasher, David Goldstein, and Thomas Toppino ran a simple experiment at Villanova and Temple University. They showed participants a list of trivia statements. Some were true. Some were false. Two weeks later, they showed the same people a mix of old and new statements and asked them to rate how true each one felt.
The repeated statements — true and false alike — got higher truth ratings. Just seeing them before was enough.
The mechanism is called processing fluency. When your brain encounters something it's processed before, that information moves through your mental machinery more smoothly. The experience feels easy, familiar, comfortable. And your brain has learned a rough rule from years of navigating the world: things that feel easy to process are usually things you've encountered because they're true.
Usually, that rule works fine. You've heard "the Earth orbits the Sun" thousands of times. It processes smoothly. It's also true. But the shortcut doesn't check whether the source of that familiarity is actual truth or just repetition.
Knowing Better Doesn't Save You
Here's the part that should worry you.
In 2015, Lisa Fazio and colleagues at Vanderbilt University tested whether knowledge could protect people from illusory truth. They showed participants statements like "A sari is the name of the short plaid skirt worn by Scots." Most participants knew a kilt is the correct answer. They could produce the right answer on a knowledge test.
But after seeing the false sari statement repeatedly, they still rated it as more true than new false statements they hadn't seen before. They knew the right answer and still felt the pull of the repeated wrong one.
The researchers used multinomial modeling to show that participants sometimes relied on fluency even when correct knowledge was available to them. In other words, your gut feeling about what's true can override what you actually know — and repetition is what sets that gut feeling.
This isn't about intelligence. A 2019 analysis found that higher intelligence and analytical thinking style offered no protection against the effect. Smart people fall for it too. They just fall for it about things they know less about.
The Biggest Jump Happens on the Second Exposure
Recent experiments have mapped how the illusory truth effect scales with repetition. In a 2021 study published in Cognitive Research: Principles and Implications, researchers showed participants trivia statements up to 27 times and measured truth ratings at each stage.
The results followed a logarithmic curve. The biggest single jump in perceived truth happened between the first and second exposure. After that, each additional repetition still increased perceived truth, but by smaller and smaller amounts.
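The diminishing-returns shape is easy to see in a toy model, assuming perceived truth grows with the logarithm of exposure count (the `base` and `gain` parameters here are invented for illustration, not values fitted in the study):

```python
import math

def perceived_truth(exposures: int, base: float = 3.0, gain: float = 0.5) -> float:
    """Toy model: truth rating grows with the log of exposure count.

    `base` and `gain` are made-up illustrative parameters, not
    numbers reported in the 2021 study.
    """
    return base + gain * math.log(exposures)

# The marginal gain from each additional exposure shrinks:
gains = [perceived_truth(n + 1) - perceived_truth(n) for n in range(1, 6)]

# The jump from 1st to 2nd exposure is the largest single step.
assert gains[0] == max(gains)
```

Because the curve is logarithmic, the step from one exposure to two is bigger than every later step combined slowdown suggests: repetition never stops working, it just pays out less each time.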
Think about what that means for your social media feed. You don't need to see a false claim 27 times for it to start feeling true. You need to see it twice. Once in a tweet. Once in a screenshot someone else shared. That's enough to shift the needle.
Social Media Is a Repetition Machine
The MIT Media Lab published the largest study of news diffusion on Twitter in 2018. They tracked 126,000 verified true and false news stories shared by roughly 3 million people over 11 years. False news reached 1,500 people about six times faster than true stories. False political news was even worse — it reached 20,000 people roughly three times faster than other types of false news.
The researchers found that humans, not bots, were primarily responsible for spreading false news. People shared it because it felt novel and emotionally charged. But the structural result was mass repetition: millions of people seeing the same false claims multiple times across different accounts, retweets, and screenshots.
Social media doesn't just spread false claims. It creates the exact conditions — repeated, multi-source exposure — that make your brain classify those claims as true.
And the platforms' algorithms accelerate this. Content that gets engagement gets shown more. False claims that trigger outrage get engagement. So they get shown again. And again. Each impression quietly nudging your brain's truth meter upward.
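That feedback loop can be sketched as a toy simulation. All the numbers here are invented: 100 impressions per round are split in proportion to each post's engagement score, and clicks feed back into the score, so the post with the higher per-impression engagement rate steadily captures more of the feed:

```python
def simulate_feed(rounds: int, outrage_boost: float = 2.0) -> dict[str, float]:
    """Toy engagement-ranking loop (all rates invented): each round,
    100 impressions are split in proportion to each post's score,
    and the resulting engagement feeds back into the score.
    """
    scores = {"measured": 1.0, "outrage": 1.0}
    impressions = {"measured": 0.0, "outrage": 0.0}
    for _ in range(rounds):
        total = sum(scores.values())
        for post in scores:
            shown = 100 * scores[post] / total   # share of this round's feed
            impressions[post] += shown
            rate = 0.05 * (outrage_boost if post == "outrage" else 1.0)
            scores[post] += shown * rate          # clicks feed back into ranking
    return impressions

result = simulate_feed(10)
assert result["outrage"] > result["measured"]  # the loop concentrates impressions
```

Even with both posts starting from identical scores, a modest engagement edge compounds round after round, which is the "shown again. And again." dynamic in miniature.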
How This Gets Weaponized
Every effective information operation relies on the illusory truth effect, whether the operators know the term or not.
Political campaigns repeat slogans because repetition works. "Build the wall." "Yes we can." "Take back control." The policy content matters less than the familiarity. After hearing a phrase enough times, it starts to feel like common sense rather than a campaign message.
State propaganda outlets publish the same narrative framing across dozens of seemingly independent channels. The goal isn't to convince you with a single persuasive article. It's to make sure you encounter the same frame — "NATO aggression," "colour revolution," "Western interference" — from enough different sources that it starts to feel like an obvious description of reality rather than a specific editorial choice.
Advertisers discovered this decades before psychologists named it. Effective frequency — the number of times a person needs to see an ad before acting — has been a core concept in marketing since the 1970s. The industry settled on a rough minimum of three exposures. Not coincidentally, that's right around the point where the logarithmic curve flattens out: by the third exposure, most of the truth-rating gain from repetition has already been banked.
What You Can Actually Do
Knowing about the illusory truth effect won't make you immune. The research is clear on that. But it changes how you pay attention.
Track the source, not the feeling. When a claim feels obviously true, ask yourself: do I believe this because I've seen evidence, or because I've seen it repeated? These are different things.

Be suspicious of familiarity. The feeling of "I've heard this before, so it's probably true" is exactly the bias in action. Familiarity is not evidence.

Watch for multi-source repetition. If you see the same claim across five accounts, check whether they're citing the same original source or whether each one independently verified the claim. Five people sharing one unverified tweet isn't five sources. It's one source and four repetitions.

Slow down on the second exposure. The biggest truth-rating jump happens when you encounter something the second time. That's your highest-risk moment. If you notice yourself thinking "oh yeah, I saw that" — that's the moment to fact-check, not the moment to scroll past.

The illusory truth effect isn't a flaw you can patch. It's a feature of how human cognition processes the world. The only real defense is understanding how it works and building habits that account for it. Your brain will always trust familiarity. Your job is to make sure familiarity hasn't been manufactured for you.
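The multi-source check can be sketched as a simple deduplication: trace each share back to its origin and count unique origins rather than raw repetitions. The data structure and field names here are hypothetical:

```python
def unique_sources(shares: list[dict]) -> int:
    """Count distinct original sources behind a set of shares.

    Each share dict is assumed to carry an 'origin' field naming
    the original source it traces back to (hypothetical schema).
    """
    return len({share["origin"] for share in shares})

# Five accounts sharing the same unverified tweet = one source.
shares = [
    {"account": "alice", "origin": "tweet/123"},
    {"account": "bob",   "origin": "tweet/123"},
    {"account": "carol", "origin": "tweet/123"},
    {"account": "dave",  "origin": "tweet/123"},
    {"account": "erin",  "origin": "tweet/123"},
]
assert unique_sources(shares) == 1  # one source, four repetitions
```

The point of the sketch is the mental habit, not the code: independent corroboration means distinct origins, not a higher share count.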
Sources & Verification
- Cognitive Research: Principles and Implications (PMC)
- Journal of Experimental Psychology (APA)
- MIT News / Science (Vosoughi et al.)
- The Decision Lab
- Psychology Today