If you look at the research literature on self-serving biases, it’s little surprise that critical thinking – much needed in today’s world – is such a challenge. Consider three human biases that you may already have heard of: most of us think we’re better than average at most things (also known as illusory superiority or the Lake Wobegon Effect); we’re also prone to “confirmation bias”, which is favouring evidence that supports our existing views; and we’re also susceptible to the “endowment effect” which describes the extra value we place on things, just as soon as they are ours.
A new paper in the Quarterly Journal of Experimental Psychology by Aiden Gregg and his colleagues at the University of Southampton extends the list of known biases by documenting a new one that combines elements of the better-than-average effect, confirmation bias and the endowment effect. Gregg’s team have shown that simply asking participants to imagine that a theory is their own biases them to believe in the truth of that theory – a phenomenon that the researchers have called the Spontaneous Preference For Own Theories (SPOT) Effect.
Across three studies, the researchers asked hundreds of participants to imagine a fictional planet in a distant solar system, inhabited by various creatures, some of which are predators and others prey. Focusing on two of the creatures on the planet – Niffites and Luppites – the participants were either told to imagine that they (that is, the participant himself or herself) had a theory that the Niffites were the predators and the Luppites were their prey, or that somebody called Alex had this theory. This background scenario was chosen to be neutral and unconnected to existing beliefs, and the hypothetical “ownership” of the theory by some of the participants was intended to be as superficial and inconsequential as possible.
Next, the researchers presented the participants with a series of seven facts relevant to the theory. The first few were mildly supportive of the theory (e.g. Niffites are bigger), but the last few provided strong evidence against it (e.g. Luppites have been observed eating the dead bodies of Niffites). After each piece of evidence, the participants were asked to rate how likely it was that the theory was true.
The way the participants interpreted the theory in the light of the evidence was influenced by the simple fact of whether they’d been asked to imagine the theory was theirs or someone else’s. When it was their own theory, they persisted more stubbornly in believing it was true, even in the face of increasing counter-evidence.
This spontaneous bias toward one’s own theory was found across the studies: when different names were used for the creatures (Dassites and Fommites); whether testing happened online or in groups in a lecture room; regardless of age and gender; and also when an additional control condition stated that the theory was no one’s in particular, as opposed to being Alex’s. The last detail helps clarify that the bias is in favour of one’s own theory rather than against someone else’s. Surprisingly perhaps, the bias wasn’t found to be stronger in participants who scored higher for other forms of self-serving bias such as narcissism or claiming to be familiar with non-existent words.
In statistical terms, the size of the bias that participants showed toward their “own” theory was modest, meaning that the practical implications of these new findings are probably limited (though the basic principle – that we’re biased towards our own theories – certainly seems relevant to understanding the wrangles over facts and truth going on in the media these days). Gregg and his colleagues said that the more important point was the theoretical implications of their results. The fact it was so easy to induce this bias in the participants, even based on such minimal personal connection to the theory, “only underlines how exquisitely sensitive to self-enhancing biases the human mind actually is,” they said.