Good News For Science, Bad News For Humanity – The “Bias Blind Spot” Just Replicated (“Everyone Else Is More Biased Than Me”)

By Matthew Warren

Psychology’s replication crisis receives a lot of airtime, with plenty of examples of failed replications and methodological issues that cast doubt on past research findings. But there is also good news: several key results in cognitive psychology and personality research, for example, have been successfully replicated.

Now researchers have reproduced the results of another highly cited study. Back in 2002, Emily Pronin and colleagues first described the “bias blind spot”: the finding that people believe they are less biased in their judgments and behaviour than the general population – that is, they are “blind” to their own cognitive biases. And while that study kick-started a whole line of related research, no one had attempted to directly replicate the original experiments.

But in a preregistered preprint published recently to ResearchGate, Prasad Chandrashekar, Siu Kit Yeung and colleagues report reproducing the original study, first in a small group of Hong Kong undergraduates, and then in two larger samples of 303 and 621 Americans who completed online surveys. 

Each group was given detailed descriptions of eight cognitive biases such as “hostile media bias” (where people view neutral media reports as hostile towards their own point of view) and “self-serving bias” (where people take responsibility for their successes but not failures). They were also given descriptions of three visible “personal shortcomings”, like procrastination and fear of public speaking.

The participants rated the extent to which they thought they exhibited these biases and shortcomings, before rating how much other people generally show these tendencies. Consistent with Pronin’s original result, across all three studies participants thought that their own susceptibility to biases was lower than that of others. 

When it came to rating personal shortcomings, the results diverged from the original study. While Pronin had found no difference in how people rated their own and others’ shortcomings, the new research found that participants rated their own shortcomings as less severe. However, this difference was small, suggesting that people are more “blind” to their cognitive biases – which remain invisible – than to personal weaknesses which may have much more obvious effects (e.g. being late to hand in an assignment due to procrastination).

Chandrashekar’s team also added a new spin to the original study by investigating whether participants’ blind spot bias was influenced by their belief in free will (rated by their agreement with statements like “I am in charge of the decisions I make”), which is known to affect how people judge themselves and others.

Participants with a stronger belief in free will had a greater blind spot when it came to personal shortcomings. That is, the gap between how they rated their own and others’ shortcomings was larger than the gap for people with a weaker belief in free will. The authors suggest that this is because people who believe more strongly that someone’s behaviours and decisions are completely under their own control are less likely to see such shortcomings in themselves: for them, such an admission would imply they have some innate weakness that can’t be overcome through their own choices. Belief in free will did not have any effect on the blind spot for cognitive biases, however. 

The authors write that their results “provide reasonable support” for the findings of the original study, suggesting that the bias blind spot is a robust phenomenon. But it’s perhaps surprising that it is reportedly the first direct replication of a piece of research that is more than 15 years old and has received nearly 1,000 citations. Hopefully, the new effort is a sign that the field is increasingly recognising the value in well-designed replication attempts – something the authors themselves acknowledge. “The study contributes to the recent call for systematic, large-scale, and preregistered replication and validation studies,” they write.

Agency and self-other asymmetries in perceived bias and shortcomings: Replications of the Bias Blind Spot and extensions linking to free will beliefs [this paper is a preprint meaning that it has not yet been peer reviewed and the final published version may differ from the version that this report was based on]

Matthew Warren (@MattbWarren) is Staff Writer at BPS Research Digest

9 thoughts on “Good News For Science, Bad News For Humanity – The “Bias Blind Spot” Just Replicated (“Everyone Else Is More Biased Than Me”)”

  1. This is interesting, but I’d also like to point out a bias in the phrasing itself. There seems to be a confusion between belief in free will and belief in personal responsibility. Even those who argue against the concept of free will, as I do myself, would agree that “I am in charge of the decisions I make”. It’s the causal mechanism of how I come up with that decision that is the issue, the “free” part. Even chess-playing algorithms are in charge of the decisions they make in terms of what move to make; it is merely that the decision is made through a computational process of propagating a set of inputs through a computational model in a finite time, given a variety of other parameters dependent on other physical factors, that results in the output decision.

    Responsibility has to do with consequences toward others. Regardless of the cause of decisions, if somebody is accused of doing harm to others then they need to be segregated, investigated, and judged, and if found not to have been a causal problem to others (not guilty, innocent, or justified), then allowed to continue freely. If found to have been the causal problem, then for the sake of others it is important to correct that decision-making process, whether by adjusting cost-benefit analysis (deterrence), changing the computational model (rehabilitation), permanent segregation if unfixable (life imprisonment), or disassembly (capital punishment). Responsibility is about the risks to others in society, and how to change them, not about free will.

    I suggest then that the question of “I am in charge of the decisions I make” is a measure of their beliefs about personal responsibility, not about free will.

  2. Hey my friend, I think this is a typo – charge/change
    belief in free will (rated by their agreement with statements like “I am in change of the decisions I make”),

    By the way, I always read your material. It most often contains compelling evidence to further my own self-understanding and more global perspective.

    Thank you.

  3. I wonder if bias is stronger in people with strong beliefs in something like, say, ‘evidence-based therapy’: the people fond of saying things like ‘the research says’ without ever having read any of the critics of the research, and with only a basic grasp of research methods and how to make sense of them beyond the superficial. This echo chamber is most easily spotted at any mental health service where stressed people are running a sort of production line of suffering, focussed on empty paper-based notions of recovery that determine and drive continued service funding and self-interest.
