Being rich(er) may not guarantee happiness, as ample evidence from the social sciences has shown, but some ways of spending money will make you happier than others. Recent research has uncovered the “experiential advantage”: greater happiness from spending money on experiences (holidays, meals, theatre tickets) rather than material things (gadgets, clothes, jewellery). This could be for a number of reasons, such as experiences being more closely aligned with our values and less likely to produce rumination and regret. There are exceptions to this rule, of course. Studies have found that personality traits influence whether experiences or things make a person happiest; for example, introverts are made much happier by spending vouchers in a bookshop than in a bar.
Another likely exception, which hasn’t previously been studied, is how social class – specifically, access to resources – affects this experiential advantage. Indeed, most research in this area has been conducted with college students, who are typically more affluent than the general population, and there are reasons to believe that those who are less well-off might prefer material goods. For them, buying things rather than experiences could be more practical: things last longer, can be used multiple times and can potentially be resold in the future. To put this reasoning to the test, a recent paper in Psychological Science investigated whether the experiential advantage is diminished or absent for people who can afford very little compared with those who can afford a lot.
The last time you and your classmates or co-workers pulled an all-nighter before a deadline, you may have noticed that there are always those lucky individuals who seem to do just fine after a lack of sleep, while others feel drowsy and confused – almost as if they’d had too much to drink.
New research conducted at the German Aerospace Center suggests this could be because alcohol intoxication and sleep deprivation are more similar than we once thought.
In their study, published recently in PNAS, Eva-Maria Elmenhorst, David Elmenhorst and their colleagues show how both affect us via a shared mechanism. What’s more, if you’re sensitive to one, you’re likely to cope poorly with the other as well.
Update: On Twitter, some researchers argued, reasonably in my view, that I wasn’t quite sceptical enough in relating these findings. See the update at the end of this post for more details.
If you wanted a poster child for the replication crisis and the controversy it has unleashed within the field of psychology, it would be hard to do much better than Fritz Strack’s findings. In 1988, the German psychologist and his colleagues published research that appeared to show that if your mouth is forced into a smile, you become a bit happier, and if it’s forced into a frown, you become a bit sadder. He pulled this off by asking volunteers to view a set of cartoons (paper ones, not animated) while holding a pen in their mouth, either with their teeth (forcing their mouth into a smile), or with their lips (forcing a frown), and to then use the pen in this position to rate how amused they were by the cartoons. The smilers were more amused, and the frowners less so – and best of all, they mostly didn’t discern the true purpose of the experiment, eliminating potential placebo-effect explanations.
This basic idea – that our facial expressions can feed back into our psychological state and behaviour – goes back at least as far as Darwin and William James, but “facial feedback”, as it is known, had never been demonstrated in such an elegant and rigorous-seeming manner. Over time, this style of experiment was replicated and expanded upon, and it came to be considered a true blockbuster, so famous that it found its way into psychology textbooks, as well as popular books and articles citing it as an example of the unexpectedly subtle ways our bodies and environments can affect us psychologically. Facial feedback has often been popularised along the lines of “Maybe you can smile your way to happiness!”, which added an irresistible self-help element that likely helped spread the idea. Either way, it seemed like a genuinely safe and solid psychological finding. That changed rather abruptly in 2016.
Take a moment to consider how old you feel. Not your actual, biological age – but your own subjective feelings.
Abundant research over the past few decades has shown that this “subjective age” can be a powerful predictor of your health – including your risk of depression, diabetes, hypertension, dementia, hospitalisation for illness or injury, and even mortality – better than your actual age. In each case, the younger you feel, the healthier you are.
The link probably goes in both directions. So while it’s true that ill-health may make you feel older, a higher subjective age could also limit your physical activity and increase feelings of vulnerability that make it hard to cope with stress – both of which could, independently, lead to illness. The result could even be a vicious cycle, where feelings of accelerated ageing lead you to become more inactive, and the resulting ill-health then further confirms your pessimistic views. And as I recently wrote for BBC Future, understanding this process could be essential for designing more effective health programmes.
Yannick Stephan at the University of Montpellier has led much of the work examining this phenomenon, and his latest paper, published with colleagues in the journal Intelligence, extends this understanding by revealing a surprising link with IQ. According to this research, the more intelligent we are in our late teens and early 20s, the younger we will feel in our 70s – and this may also be reflected in various markers of biological ageing.
Teaching, it has often been said, is the one profession that creates all other professions, which is why it is so important that we learn how to do it well. The ways that teachers learn from each other are likely to be an important part of this, especially how they discern each other’s expertise and whether they are inclined to seek advice and help from the most able.
A team led by James Spillane at Northwestern University has published a study in Educational Evaluation and Policy Analysis that looks into these teacher behaviours. The researchers employed a mixed-method approach that spanned five years and involved staff from fourteen primary schools in the US. This included surveys and interviews to explore how maths teachers conceptualised expert teaching, and then an analysis of student test scores along with teachers’ self-reported interactions with their colleagues, to assess whether expert teachers behave differently from their peers.
Amid all the talk of a “replication crisis” in psychology, here’s a rare good news story – a new project has found that a sub-field of the discipline, known as “experimental philosophy” or X-phi, is producing results that are impressively robust.
The current crisis in psychology was largely precipitated by a mass replication attempt published by the Open Science Collaboration (OSC) project in 2015. Of 100 previously published significant findings, only 39 per cent replicated unambiguously, rising to 47 per cent on more relaxed criteria.
It is no exaggeration to say that a reliable means of achieving full self-control would be something of a “holy grail” for psychology. There is nothing to indicate that we are getting any closer to finding one, but recent decades have brought a growing number of discoveries that at least partially enhance self-control mechanisms – among them, the light shed on the importance of rituals in boosting self-control. Now, in a new paper in the Journal of Personality and Social Psychology, Allen Ding Tian and his collaborators have examined whether enacting rituals (defined as “a fixed episodic sequence of actions characterised by rigidity and repetition”) can enhance subjective feelings of self-discipline, such that rituals can be harnessed to improve behavioural self-control.
We all know someone who is convinced their opinion on a topic is better than everyone else’s – perhaps, even, that it is the only correct opinion to have. Maybe, on some topics, you are that person. No psychologist would be surprised that people who are convinced their beliefs are superior think they are better informed than others, but this fact leads to a follow-on question: are people actually better informed on the topics for which they are convinced their opinion is superior? This is what Michael Hall and Kaitlin Raimi set out to check in a series of experiments in the Journal of Experimental Social Psychology.
When, in Shakespeare’s Julius Caesar, Mark Antony delivers his funeral oration for his fallen friend, he famously says: “The evil that men do lives after them; the good is oft interred with their bones.”
Antony was talking about how history would remember Caesar, lamenting that doing evil confers greater historical immortality than doing good. But what about literal immortality?
While there’s no room for such a notion in the scientific worldview, belief in an immortal afterlife has been common throughout history and continues to this day across many cultures. Formal, codified belief systems like Christianity have a lot to say about the afterlife, including how earthly behaviour determines our eternal fate: the virtuous among us will apparently spend the rest of our spiritual days in paradise, while the wicked are condemned to suffer until the end of time. Yet in Christianity and many other formal religions there’s no suggestion that anyone – good, bad or indifferent – gets more or less immortality, which is taken to be an all-or-nothing affair.
This is not how ordinary people think intuitively about immortality, though. In a series of seven studies published in Personality and Social Psychology Bulletin, Kurt Gray at The University of North Carolina at Chapel Hill, and colleagues, have found that, whether religious or not, people tend to think that those who do good or evil in their earthly lives achieve greater immortality than those who lead more morally neutral lives. What’s more, the virtuous and the wicked are seen to achieve different kinds of immortality.
Celebrities are people famous for being famous. Have you ever considered how pop-culture figures become so well-known, even when they have risen to the top on a wave of interest with no rational explanation? What is the real root cause of our lemming-like rush to keep tabs on insignificant but famous people? What leads us to share this information on social media? Why do we visit gossip portals and read tabloids, even though they’re of no real value to us? Partial answers to these questions come from a trio of researchers, via a series of creative experiments reported in Psychology of Popular Media Culture.