Why Your Brain Tricks You Into Thinking You’re Always Right
Confirmation bias is the reason a flat-earther can watch a rocket launch and walk away more convinced the earth is flat — and it’s the same reason you’ve probably argued with someone online, presented them with solid evidence, and watched them double down anyway. Including, if you’re honest, yourself.
This isn’t a flaw in a few stubborn people. It’s a universal feature of the human brain. And once you understand how it works, you’ll start seeing it everywhere — in the news you read, the friends you trust, and the quiet way your own mind cherry-picks reality.
What Confirmation Bias Actually Does to Your Brain
The term was coined by English psychologist Peter Wason in 1960 after a deceptively simple experiment. He showed participants a sequence of three numbers — 2, 4, 6 — and asked them to figure out the hidden rule that generated it. Participants could propose their own three-number sequences and be told whether each one followed the rule.
Almost everyone immediately guessed the rule was “ascending even numbers” and then spent the rest of the experiment only testing sequences that confirmed that guess — like 4, 8, 12 or 10, 12, 14. Very few tested anything that might disprove their theory, like 1, 2, 3. The actual rule was simply “any three ascending numbers.” Most never found it. They were so busy confirming what they already believed that they never tested whether they were wrong.
“The mind is not a blank slate — it is a prosecutor looking for evidence to win a case it has already decided.”
That’s confirmation bias in a nutshell. According to Psychology Today, it’s one of the most documented and studied cognitive biases in all of psychology. Your brain doesn’t passively receive information — it actively filters it. Information that supports what you already believe gets fast-tracked. Information that challenges it gets skeptically interrogated, or quietly ignored.
The Brain Science Behind Why This Happens
There’s a neurological reason your brain does this, and it’s not stupidity — it’s energy conservation. The brain accounts for about 20% of your body’s energy use while being only 2% of your body weight. Updating existing beliefs requires mental effort. Confirming them doesn’t. So the brain, being the efficiency-obsessed machine it is, defaults to confirmation whenever it can.
Research published in NCBI shows that when people encounter information that aligns with their beliefs, the brain’s reward circuits — the same ones that light up for food and social approval — activate. Agreeing feels good. This is why correcting misinformation is so difficult. You’re not just fighting ignorance; you’re competing with dopamine.
This connects directly to a broader family of mental shortcuts known as cognitive biases that shape your decisions. Confirmation bias is often considered the most powerful of them because it operates upstream of everything else — it determines which information your other mental shortcuts even get to work with.
The “Backfire Effect” — When Correction Makes Things Worse
Here’s where it gets even stranger. In a series of studies by Brendan Nyhan and Jason Reifler, researchers gave people factually corrected versions of news stories — stories where the original false claim was explicitly debunked. For some participants, the correction didn’t just fail to change their minds. It made them believe the original false claim more strongly than before.
This phenomenon, called the backfire effect, suggests that when a belief is tied to your identity — political, religious, social — challenging it feels like a personal attack. The brain responds defensively, rallying more support for the original position rather than considering the new evidence. You’ve probably seen this play out in a Thanksgiving dinner argument at some point.
Confirmation Bias in the Real World
The 2016 US election offers one of the most studied real-world examples. Research reported by BBC Future found that voters on both sides consumed almost entirely separate media ecosystems, each reinforcing their existing views. When the same economic data was presented to supporters of both candidates, both groups interpreted the figures as evidence that their side was right.
The same pattern appears in medicine. Doctors who form an early diagnosis have been shown to unconsciously weight subsequent test results toward confirming it — a phenomenon called anchoring bias, a close cousin of confirmation bias. This is why second opinions save lives.
In financial markets, investors hold onto losing stocks longer than they should because they keep finding reasons the price “will come back.” Warren Buffett’s longtime partner Charlie Munger once said the most dangerous words in investing are “I know I’m right” — because that certainty is almost always built on selectively gathered evidence.
If you want to understand how this connects to social media and the psychology of outrage, check out our piece on why social media is designed to exploit your emotions.
The Echo Chamber Isn’t Just a Metaphor
Social media platforms make confirmation bias dramatically worse, and the mechanism is algorithmic. When you engage with content — liking it, commenting, spending time on it — the algorithm learns what confirms your worldview and shows you more of it. You aren’t searching for your bubble. The platform is building it around you in real time.
A 2021 study by NYU’s Center for Social Media and Politics found that users who were randomly assigned to see more cross-cutting content did not change their political opinions — but they did report feeling more exhausted. Encountering challenging information is genuinely cognitively tiring. No wonder we avoid it.
How to Actually Outsmart Your Own Confirmation Bias
The uncomfortable truth is that awareness alone doesn’t fix this. Studies show that people who learn about confirmation bias often become more confident in their own objectivity even as they get better at spotting the bias in others — a pattern psychologists call the bias blind spot. Knowing the name of the trap doesn’t automatically keep you out of it.
What actually helps is changing your process. One technique used by intelligence analysts and medical diagnosticians is called steelmanning — deliberately constructing the strongest possible version of the opposing argument before evaluating it. Not the weakest, straw-man version your brain naturally reaches for. The strongest one.
Another approach is to ask “what would change my mind?” before forming an opinion. If the honest answer is “nothing,” that’s not confidence — that’s confirmation bias operating at full power. This connects to the broader science of how to think more clearly under pressure, where metacognition — thinking about your thinking — is shown to produce significantly better decisions.
Scientists use pre-registration for exactly this reason: they publicly commit to their hypothesis and methodology before collecting data, so they can’t unconsciously shape the research around what they hope to find. You can apply a stripped-down version of this to your own beliefs by writing down your current position and the specific evidence that would genuinely update it — before you go looking for information.
The Socratic Approach
Socrates reportedly said he was the wisest man in Athens because he alone knew that he knew nothing. That’s not false modesty — it’s a description of a thinking style that actively resists confirmation bias. He built his entire method around asking questions designed to expose the limits of his own and others’ knowledge, not to confirm it. Nearly two and a half thousand years later, it remains one of the most effective intellectual tools we have.
You don’t have to be a philosopher to use it. The next time you’re very sure about something, try asking: “What’s the best evidence against this view?” Not to abandon your position, but to genuinely test it. The beliefs that survive that process are the ones actually worth having.
Curiosity, ultimately, is the antidote. Not the performed curiosity of someone who’s already decided what the answer is — but the genuine openness to being surprised. It’s harder than it sounds. But it’s what separates informed opinions from comfortable ones.
Think you’ve got a handle on how your brain really works? Try our Basic Science Quiz and find out just how much your confident brain actually knows — the results might surprise you.
What’s one belief you hold that you’ve never seriously tried to disprove? Drop it in the comments — let’s talk about it.
KK is one of the curious minds behind OrbitalBuzz.com. With a passion for exploring the world’s less-traveled paths, he uncovers hidden research and surprising facts that explain everything from the secrets of your brain to the patterns in your everyday life. He believes true knowledge begins with a question no one else is asking.