Category: Effects
Type: Cognitive Bias
Origin: Cognitive psychology research, 1960s–1990s; Peter Wason, Raymond Nickerson
Also known as: Confirmatory Bias, Myside Bias, Perseverance Effect
Quick Answer — Confirmation Bias is the cognitive tendency to seek, interpret, and remember information that confirms existing beliefs while ignoring or discounting contradictory evidence. First documented by Peter Wason in the 1960s, this bias affects every stage of information processing—from what we notice to how we remember it. It helps explain why people rarely change their minds despite new evidence.

What is Confirmation Bias?

Confirmation Bias is not a single bias but a family of related cognitive tendencies that lead people to favor information confirming their preexisting beliefs. This bias operates at multiple levels: what we choose to look for, how we interpret what we find, and what we later remember. The core mechanism is selective processing: when you hold a belief, you unconsciously tend to seek evidence that supports it. If you believe a particular investment will succeed, you read positive analyses and ignore warning signs. If you distrust a particular political figure, you focus on their failures and dismiss accomplishments.
We see what we believe, not just believe what we see.
This bias is remarkably resilient. Even when presented with clear contradictory evidence, people often dismiss it, reinterpret it, or simply fail to notice it. The famous phrase “seeing is believing” underestimates how much our perceptions are shaped by what we already believe.

Confirmation Bias in 3 Depths

  • Beginner: Noticing how you automatically seek information that agrees with your views—social media feeds that reinforce opinions, news sources that align with political beliefs, conversations with like-minded people.
  • Practitioner: Actively seeking disconfirming evidence by asking “What would change my mind?” and deliberately exposing yourself to opposing viewpoints and uncomfortable data.
  • Advanced: Building systematic processes that force objectivity—blind analysis, pre-registration of hypotheses, devil’s advocate roles, and decision journals that track predictions against outcomes.
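The decision-journal idea above can be sketched in a few lines of code. This is a minimal illustration, not a reference to any real tool; all class and field names here are invented for the example. The key discipline it encodes is recording your prediction, your confidence, and your disconfirming condition before the outcome is known.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Prediction:
    claim: str                      # what you expect to happen
    confidence: float               # stated probability, 0.0 to 1.0
    disconfirmer: str               # evidence that would change your mind
    outcome: Optional[bool] = None  # filled in once the result is known

@dataclass
class DecisionJournal:
    entries: list = field(default_factory=list)

    def log(self, claim, confidence, disconfirmer):
        """Record a prediction BEFORE the outcome is known."""
        entry = Prediction(claim, confidence, disconfirmer)
        self.entries.append(entry)
        return entry

    def resolve(self, entry, outcome):
        """Record what actually happened."""
        entry.outcome = outcome

    def hit_rate(self):
        """Fraction of resolved predictions that came true."""
        resolved = [e for e in self.entries if e.outcome is not None]
        return sum(e.outcome for e in resolved) / len(resolved) if resolved else None

journal = DecisionJournal()
p1 = journal.log("Stock X will beat earnings", 0.8, "guidance cut on the Q2 call")
p2 = journal.log("Competitor Y will exit the market", 0.6, "new funding round announced")
journal.resolve(p1, True)
journal.resolve(p2, False)
print(journal.hit_rate())  # 0.5
```

Comparing the hit rate against stated confidence over many entries makes memory bias harder to sustain: the journal, not your recollection, says how often you were right.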

Origin

Confirmation bias was first systematically documented by Peter Wason in his famous 1960 “2-4-6 task” experiment. Wason asked participants to discover a rule he had in mind by generating number triples. Most people generated only examples that would confirm their hypothesized rule, rather than trying to disprove it—even when explicitly invited to do so. The concept gained wider attention through Thomas Gilovich’s 1991 book “How We Know What Isn’t So,” which documented confirmation bias across multiple domains. Raymond “Ray” Nickerson’s influential 1998 review then showed how confirmation bias pervades reasoning in science, medicine, intelligence analysis, and everyday judgment, demonstrating that even trained professionals fall victim to it when evaluating evidence; Ziva Kunda’s (1990) work on motivated reasoning elaborated the mechanisms, showing that the bias operates even when people are motivated to be accurate. Today, confirmation bias is recognized as one of the most robust and consequential cognitive biases, with significant implications for decision-making, policy formation, and social polarization.
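Wason’s task can be illustrated with a small simulation. The hidden rule and the participant’s typical guess below follow his classic setup; the testing strategies and function names are illustrative.

```python
def hidden_rule(triple):
    """Wason's actual rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def hypothesized_rule(triple):
    """A typical participant's narrower guess: 'numbers increasing by 2'."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

def reveals_error(tests):
    """A test exposes the hypothesis only when the two rules disagree on it."""
    return any(hypothesized_rule(t) != hidden_rule(t) for t in tests)

# Confirming strategy: test only triples the hypothesis predicts are valid.
confirming = [(2, 4, 6), (10, 12, 14), (100, 102, 104)]
# Falsifying strategy: test triples the hypothesis predicts are INVALID.
falsifying = [(1, 2, 3), (2, 4, 100), (5, 10, 20)]

print(reveals_error(confirming))  # False: every answer 'confirms' the guess
print(reveals_error(falsifying))  # True: (1, 2, 3) fits the rule, refuting the guess
```

Confirming tests always come back “yes, that fits,” so the participant’s too-narrow hypothesis is never challenged; only tests the hypothesis predicts should fail can reveal that the real rule is broader.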

Key Points

1. Search Bias

People tend to seek information that confirms existing beliefs. This begins with what we choose to read, watch, and whom we associate with—creating echo chambers that reinforce pre-existing views. In experiments, participants asked to judge whether a statement was true disproportionately sought confirming evidence.
2. Interpretation Bias

Ambiguous information is interpreted as supporting existing beliefs. The same data—a stock price movement, a political event, a medical symptom—can be read in multiple ways, and we reliably read it in the way that supports what we already believe.
3. Memory Bias

We remember information that confirms our beliefs more accurately than contradicting information. Over time, this creates increasingly skewed memories that reinforce original beliefs, even when the actual evidence was mixed.

Applications

Scientific Research

Scientists can unconsciously design experiments that favor their hypotheses, interpret ambiguous results favorably, and remember supporting studies while forgetting contradictory ones. Pre-registration of hypotheses and blind analysis help combat this.

Investment Decisions

Investors often seek confirming evidence for their thesis while ignoring warning signs, leading to delayed recognition of bubble bursts or company failures. Diversifying information sources and setting pre-defined exit criteria can help.

Medical Diagnosis

Doctors may anchor on an initial diagnosis and seek evidence supporting it while dismissing contradictory symptoms. This contributes to diagnostic errors. Second opinions and systematic differential diagnosis counter this tendency.

Legal Judgment

Jurors and judges may interpret evidence through the lens of existing beliefs about defendants, victims, or the case. Blind review procedures and clear evidentiary standards help reduce this bias and support fair trials.

Case Study

The 2008 Financial Crisis and Confirmation Bias

The 2008 financial crisis provides a stark example of confirmation bias across the financial industry. For years before the crash, many analysts, ratings agencies, and investors looked at the same housing market data but interpreted it very differently depending on their prior beliefs. Those who believed housing prices would continue rising saw strong demand, limited supply, and population growth as confirmation. Those who were skeptical saw rising mortgage defaults, increasing debt-to-income ratios, and declining lending standards as warnings.

The critical failure: Bear Stearns hedge funds collapsed in June 2007, providing clear disconfirming evidence for the “housing never falls” belief. Yet many investors and institutions reinterpreted this as an anomaly, focused on reassuring data, and maintained or increased their mortgage-backed securities positions. Analysts at Moody’s and Standard & Poor’s, who had conflicts of interest (being paid by the issuers they rated), showed particularly strong confirmation bias in issuing inflated AAA ratings.

Post-crisis analysis by the Financial Crisis Inquiry Commission found that “we saw what we wanted to see” described the collective failure across the system. The confirmation bias that had been documented in laboratory experiments for decades played out catastrophically in real-world financial decisions.

Boundaries and Failure Modes

Confirmation bias is nearly universal but has important nuances:
  • Myside Bias is Stronger Than Truth-Seeking: People generally prefer confirming evidence even when explicitly trying to be objective. The bias is not simply about laziness or stupidity—even motivated scientists show it.
  • Identity-Strengthening Beliefs Resist Disconfirmation: Beliefs tied to personal or group identity are especially resistant to contradictory evidence. The more central a belief is to identity, the stronger the confirmation bias around it.
  • Disconfirming Evidence Can Strengthen False Beliefs: In some cases, presenting contradictory evidence to strongly held beliefs backfires, causing people to cling even more tightly to their original views—the “backfire effect.”
  • It’s Not Always Bad: Some confirmation bias may serve psychological functions, reducing anxiety and providing stability. The question is whether the benefit outweighs the cost of potential error.

Common Misconceptions

“Smart people aren’t affected by confirmation bias.”

Wrong. Extensive research shows that intelligence does not protect against confirmation bias. In some studies, more intelligent participants showed stronger bias when their identity was tied to the belief.

“If I try to be objective, I can avoid it.”

Wrong. Wanting to be objective does not eliminate the bias. Research shows that even when people are explicitly motivated to be accurate, confirmation bias persists. Systematic processes, not just intentions, are needed.

“Confirmation bias only affects which information you seek out.”

Wrong. Confirmation bias operates at multiple stages—search, interpretation, and memory. Even if you see the same data as someone else, you’ll likely remember and interpret it differently based on prior beliefs.
Related Concepts

Confirmation Bias connects to several other cognitive biases and mental phenomena.

Anchoring Effect

The tendency to rely too heavily on the first piece of information encountered. Often works with confirmation bias—once anchored on a belief, you seek confirming evidence.

Dunning-Kruger Effect

The tendency for people with low ability to overestimate their competence. Often interacts with confirmation bias—less knowledgeable people may be more confident in their beliefs and thus more resistant to disconfirming evidence.

Sunk Cost Fallacy

Continuing a behavior because of previously invested resources. Often reinforced by confirmation bias—you confirm that past investments justify continued investment.

Availability Heuristic

Judging probability by ease of recalling examples. Often works with confirmation bias—you remember evidence that is more memorable, which often aligns with existing beliefs.

Belief Perseverance

The tendency to maintain beliefs even when evidence contradicts them.

Selective Perception

The tendency for expectations to affect perception.

One-Line Takeaway

Combat confirmation bias by actively seeking disconfirming evidence, asking “what would change my mind?” before concluding, and building systematic decision processes that force objectivity.