We love a puzzle here at Research Digest — so here are a couple from a recent paper in Cognition. See whether you can unscramble the anagrams in the following sentences (read on for the answers!):
The Cocos Islands are part of idnionsea
eeebyoshn kill more people worldwide each year than all poisonous snakes combined
If you successfully solved the anagrams, you may have experienced an “Aha!” or “Eureka” moment: a flash of insight where the solution suddenly becomes clear, perhaps after you have spent a while completely stumped. Usually when we experience these moments we have indeed arrived at the correct answer — they don’t tend to occur as much when we’ve stumbled upon an incorrect solution. And in fact, researchers have suggested that we even use Aha! moments as a quick way to judge the veracity of a solution or idea — they provide a kind of gut feeling which tells us that what has just popped into our mind is probably correct.
But relying on these experiences to gauge the truth of an idea can sometimes backfire, according to the authors of the new paper. The team found that experiencing sudden moments of insight when deciphering a statement can make people more likely to believe that it is true — even when it isn’t.
“The sense of self is a hallmark of human experience. Each of us maintains a constellation of personal memories and personality traits that collectively define ‘who we really are’.”
So begins a new paper, published in the Journal of Experimental Psychology, which reveals that who you “are” can easily be manipulated. Just imagining somebody else can alter all kinds of aspects of how you see yourself, even including your personality and memories.
Is your mental library a haven of accurate and well-informed facts, or are there mistruths hiding on the shelves? It’s natural to assume that we update our beliefs in line with the most recent and well-established evidence. But what really happens to our views when a celebrity endorses a product that becomes discredited by science, or when a newspaper publishes a story which is later retracted?
A recent paper from the Journal of Consumer Psychology presents a novel take on this topic, by investigating the continued influence effect. Anne Hamby and colleagues suggest that our likelihood of continuing to believe retracted information depends on whether or not it helps us to understand the cause-and-effect structure of an event. Crucially, the team proposes, we would rather have a complete understanding of why things happen than a perspective which is more accurate, but less complete.
Imagine that tomorrow a catastrophe wipes out 99% of the world’s population. That’s clearly not a desirable scenario — we would all agree that a peaceful, continued existence is preferable. Now imagine that the disaster kills everyone, wiping out the human race. Most of us would rate that as an even worse occurrence.
But how do we see the relative severity of these different possibilities? Is there a bigger difference between nothing happening and 99% of people dying, or between 99% and 100% of people being wiped out?
This thought-experiment was first posed by the philosopher Derek Parfit, who thought most people would believe the first difference is greater — after all, going from business-as-usual to almost total annihilation is a big step. He, on the other hand, felt the second difference was greater by far: even if just a tiny fraction of humans survive, civilisation could continue for millions of years, but if humanity is wiped from the face of the Earth, then it’s all over.
Now a new study in Scientific Reports has found that, as Parfit predicted, most people don’t seem to share his view of human extinction as a “uniquely bad” catastrophe — until they are forced to go beyond their gut feeling and reflect on what extinction really means in the long term.
Log on to Twitter, open a newspaper or turn on the news and you’ll soon see just how prevalent anti-Muslim sentiment is, as well as how likely collective blame is to be placed on the group as a whole for actions perpetrated by a few Islamic extremists. Though American mass shootings are far more likely to be perpetrated by white men than Muslims, collective blame is rarely assigned to that group — instead, they are characterised as “lone wolves”, even when they explicitly belong to or espouse the views of neo-Nazi, white supremacist or misogynistic hate groups.
When a gym recently opened up near my house, I was determined to go regularly and make the most of the facilities. And I did — for about a month. But gradually, my visits became fewer and further between, until I realised I was paying for a bunch of machines and slabs of metal that I hadn’t touched in weeks. Guiltily, I cancelled my membership.
But perhaps I have my personality to blame. A new study tracking gym users has homed in on one key factor that is related to how often they visit: their “planfulness”. This aspect of our personality, say the researchers, could be “uniquely useful” for predicting a range of goal-directed behaviours.
If you hear an unfounded statement often enough, you might just start believing that it’s true. This phenomenon, known as the “illusory truth effect”, is exploited by politicians and advertisers — and if you think you are immune to it, you’re probably wrong. In fact, earlier this year we reported on a study that found people are prone to the effect regardless of their particular cognitive profile.
But that doesn’t mean there’s nothing we can do to protect ourselves against the illusion. A study in Cognition has found that using our own knowledge to fact-check a false claim can prevent us from believing it is true when it is later repeated. But we might need a bit of a nudge to get there.
There’s a huge amount of research into how people differ in their ability to learn things deliberately and “explicitly”, such as memorising a list of words or instructions. Far less studied is “implicit learning”. Ask a five-year-old to explain the grammatical rules of their language and they’ll likely have no clue where to start. And yet, they do know them – or at least, well enough to form coherent sentences. This kind of unconscious acquisition of abstract knowledge is an example of “implicit” learning.
Implicit learning may be especially important for young children, but adults depend on it, too. It “is recognised as a core system that underlies learning in multiple domains, including language, music and even learning about the statistical structure of our environments,” note the authors of a new paper, published in Cognition.
It’s a trick that politicians have long exploited: repeat a false statement often enough, and people will start believing that it’s true. Psychologists have named this phenomenon the “illusory truth effect”, and it seems to come from the fact that we find it easier to process information that we’ve encountered many times before. This creates a sense of fluency which we then (mis)interpret as a signal that the content is true.
Of course, you might like to believe that your particular way of thinking makes you immune to this trick. But according to a pre-print uploaded recently to PsyArXiv, you’d be wrong. In a series of experiments, Jonas De keersmaecker at Ghent University and his collaborators found that individual differences in cognition had no bearing on the strength of the illusory truth effect.
You should take just under two-and-a-half minutes to finish reading this blog post. That’s going by the findings of a new review, which has looked at almost 200 studies of reading rates published over the past century to come up with an overall estimate for how quickly we read. And it turns out that that rate is considerably slower than commonly thought.
Of the various estimates of average reading speed bandied around over the years, one of the most commonly cited is 300 words per minute (wpm). However, a number of findings of slower reading rates challenge that statistic, notes Marc Brysbaert from Ghent University in Belgium in his new paper released as a preprint on PsyArXiv.