By guest blogger Dan Jones
When, in Shakespeare’s Julius Caesar, Mark Antony delivers his funeral oration for his fallen friend, he famously says: “The evil that men do lives after them; the good is oft interred with their bones.”
Antony was talking about how history would remember Caesar, lamenting that doing evil confers greater historical immortality than doing good. But what about literal immortality?
While there’s no room for such a notion in the scientific worldview, belief in an immortal afterlife was common throughout history and continues to this day across many cultures. Formal, codified belief systems like Christianity have a lot to say about the afterlife, including how earthly behaviour determines our eternal fate: the virtuous among us will apparently spend the rest of our spiritual days in paradise, while the wicked are condemned to suffer until the end of time. Yet, according to Christianity and many other formal religions, there’s no suggestion that anyone – good, bad or indifferent – gets more or less immortality, which is taken to be an all-or-nothing affair.
This is not how ordinary people think intuitively about immortality, though. In a series of seven studies published in Personality and Social Psychology Bulletin, Kurt Gray at the University of North Carolina at Chapel Hill and his colleagues found that, whether religious or not, people tend to think that those who do good or evil in their earthly lives achieve greater immortality than those who lead more morally neutral lives. What’s more, the virtuous and the wicked are seen to achieve different kinds of immortality.
Continue reading “New research reveals our folk beliefs about immortality – we think the good and bad will live on, but in very different ways”
By guest blogger Tomasz Witkowski
Celebrities are people famous for being famous. Have you ever given any thought to how pop-culture figures become so well-known, even when they have risen to the top on a wave of interest for which there is not the slightest rational explanation? What is the real root cause of our lemming-like rush to keep tabs on insignificant but famous people? What leads us to share this information on social media? Why do we visit gossip portals and read tabloids, even though they’re totally worthless to us? Partial answers to these questions come from a trio of researchers via a series of creative experiments reported in Psychology of Popular Media Culture.
Continue reading “Even those participants who claimed pop culture is unimportant suffered psychological ill effects from feeling out of the loop”
By guest blogger Jon Brock
Johannes Eichstaedt was sitting in a coffee shop by Lake Atitlan in Guatemala when he received a Slack message about a tweet about a preprint. In 2015, the University of Pennsylvania psychologist and his colleagues published a headline-grabbing article linking heart disease to the language used on Twitter. They’d found that tweets emanating from US counties with high rates of heart disease mortality tended to exhibit high levels of negative emotions such as anger, anxiety, disengagement, aggression, and hate. The study, published in Psychological Science, has proven influential, already accruing over 170 citations. But three years later, the preprint’s authors, Nick Brown and James Coyne from the University of Groningen, claimed to have identified “numerous conceptual and methodological limitations”. Within the month, Eichstaedt and his colleagues issued a riposte, publishing their own preprint claiming further evidence in support of their original conclusions.
As recent revelations surrounding Facebook and Cambridge Analytica have highlighted, corporations and political organisations attach a high value to social media data. But, Eichstaedt argues, that same data also offers rich insights into psychological health and well-being. With appropriate ethical oversight, social media analytics could promote population health and perhaps even save lives. That at least is its promise. But with big data come new challenges – as Eichstaedt’s “debate” with Brown and Coyne illustrates.
Continue reading “Are tweets a goldmine for psychologists or just a lot of noise? Researchers clash over the meaning of social media data”
By guest blogger Bradley Busch
Can a brief video telling students that it’s possible to improve their intelligence and abilities make much difference to their educational outcomes? And if fostering a “growth mindset” in this way does make a difference, does it benefit all students and schools equally?
Research on growth mindset over the past twenty years has progressed from experiments in a laboratory into real world settings, such as classrooms. This has shown that having a growth mindset leads to a small but positive improvement in grades and better mental health. But to date, little work has examined whether a brief mindset intervention is likely to help some adolescents more than others, especially those at greater risk of poor outcomes later in life.
Keen to rectify this, 23 of the leading researchers in this field, including the likes of Carol Dweck, Angela Duckworth and David Yeager, recently collaborated on a large study which they briefly released as a preprint (they are now revising the manuscript pending submission to peer review). As a Chartered Psychologist who delivers mindset workshops, I believe the preliminary findings are extremely promising.
Continue reading “This cheap, brief “growth mindset” intervention shifted struggling students onto a more successful trajectory”
By guest blogger Helge Hasselmann
Across the globe, ADHD prevalence is estimated at around 5 per cent. It’s a figure that’s been rising for decades. For example, Sweden saw ADHD diagnoses among 10-year-olds increase more than sevenfold from 1990 to 2007. Similar spikes have been reported in other countries, too, including Taiwan and the US, suggesting this may be a universal phenomenon. In fact, looking at dispensed ADHD medication as a proxy measure of ADHD prevalence, studies from the UK show an even steeper increase.
Does this mean that more people today really have ADHD than in the past? Not necessarily. For example, greater awareness among clinicians, teachers or parents could simply have captured more patients who had previously been “under the radar”. Such a shift in awareness or diagnostic behaviour would inflate the rate of ADHD diagnoses without any real increase in the number of people with clinical ADHD. However, if this is not the true or full explanation, then perhaps ADHD symptoms really have become more frequent or severe over the years. A new study in The Journal of Child Psychology and Psychiatry from Sweden with almost 20,000 participants has now provided a preliminary answer.
Continue reading “The dramatic increase in the diagnosis of ADHD has not been accompanied by a rise in clinically significant symptoms”
By guest blogger Simon Oxenham
Historically, the kinds of false memories induced in volunteers by psychologists have been relatively mundane. For example, a seminal study used leading questions and encouragement to confabulate to apparently implant in participants the memory of getting lost in a shopping mall as a child. This reliance on mundane false memories has been problematic for experts who believe that false memories have critical real-world consequences, from criminal trials involving false murder confessions, to memories of child abuse “recovered” during therapy using controversial techniques.
The discrepancy between psychologists’ lab results and their real-world claims vanished abruptly in 2015 when Julia Shaw (based then at the University of Bedfordshire) and Stephen Porter (University of British Columbia) shocked the memory research community with their staggering finding that, over several interview sessions, and by using false accounts purportedly from the participants’ own caregivers, they had successfully implanted false memories of having committed a crime as a teenager – ranging from theft to assault with a weapon – in 70 per cent of their participants. But now other experts have raised doubts about these claims.
Continue reading “Psychologists clash over how easy it is to implant false memories of committing a crime”
By guest blogger Tomasz Witkowski
It’s hard to imagine a crueller fate than when a child receives a diagnosis of an illness as difficult as cancer. A young human being, still not fully formed, is suddenly and irrevocably thrown into a situation that many adults are unable to cope with. Each year, around 160,000 children and youngsters worldwide are diagnosed with cancer, and this number is rising in industrialised societies. Faced with such facts, it is particularly important to understand how children cope. What traces of the experience remain in their psyche if they manage to survive?
Partial answers to these questions come from a trio of Australian researchers in their systematic review and meta-analysis of existing research into the psychological effects of cancer on children, published recently in Psycho-Oncology. Their findings give us reason for some optimism. It turns out children and adolescents affected by cancer are no more likely to develop post-traumatic stress symptoms than their healthy peers. In fact, several studies have found that children affected by cancer go on to experience greater than usual adjustment and quality of life and lower anxiety and post-traumatic stress symptoms. In psychology, we refer to this as the post-traumatic growth (PTG) effect, which can arise from the struggle with highly challenging life circumstances or trauma.
Continue reading “When tears turn into pearls: Post-traumatic growth following childhood and adolescent cancer”
By guest blogger Helge Hasselmann
Video games do not enjoy the best of reputations. Violent games in particular have been linked with aggression, antisocial behaviour and alienation among teens. For example, one study found that playing a mere 10 minutes of a violent video game was enough to reduce helping behaviour in participants.
However, some experts are sceptical about whether games really cause aggression and, even if the games are to blame, it remains unclear what drives their harmful effects. Earlier studies identified empathy as a key trait that may be affected by violent gameplay. Now a study by Laura Stockdale at Loyola University Chicago and her colleagues in Social Cognitive and Affective Neuroscience has taken a closer look at how gamers and non-gamers differ at a neural level, uncovering evidence that suggests chronic violent gameplay may affect emotional brain processing, although more research is needed to confirm this.
Continue reading “Brain differences in avid players of violent video games suggest they are “callous, cool and in control””
By guest blogger Lucy Foulkes
When you see someone laughing hysterically, do you often find yourself laughing too? Laughter is usually extremely contagious. In fact, we are up to 30 times more likely to laugh with someone else than when alone. It’s a powerful bonding tool: we enjoy seeing other people happy, we enjoy laughing with them, and this brings us closer together.
But is this equally true for everyone, or is laughter more contagious for some people than others? For a paper in Current Biology, a team of researchers at UCL, led by Elizabeth O’Nions and César F. Lima, has investigated whether adolescent boys at risk of psychopathy are less likely to find laughter catching.
Continue reading “For teen boys at risk of psychopathy, laughter isn’t catching”
By guest blogger Bradley Busch
Dr. Seuss wrote: “The more that you read, the more things you will know. The more that you learn, the more places you’ll go.” The trouble is, we forget so much of what we read. Is there a way of reading that makes it more likely we’ll remember things?
Keen to answer this question, researchers Noah Forrin and Colin MacLeod, from the University of Waterloo in Ontario, Canada, ran a study published in Memory. Their results shed new light on how to study more effectively.
Continue reading “Why you’re more likely to remember something if you read it to yourself out loud”