The idea that taking a gap year allows you to “find yourself” is often derided. But if you spend that time living in one foreign country, you just might. And if you can stay for several years, even better.
Hajo Adam at Rice University, US, led what his team say is the first empirical investigation of the effects of living abroad on “self-concept clarity” – how clearly and confidently someone defines who they “are”. Since people are increasingly spending time living abroad for work or study – and since other “transitional” life experiences, such as getting a new job or getting divorced, have been associated with decreases in self-concept clarity – it’s important to study this, the researchers write in their paper in Organizational Behavior and Human Decision Processes.
What is nationality? Is it something fixed that we inherit biologically from our parents or is it a characteristic that we can change and acquire? A new study in Nature Human Behaviour is the first to study people’s “folk theories” about nationality – based on surveys of US and Indian participants – and the results show that, at least in these countries, people are broadly sympathetic toward both these contrasting theories of nationality at the same time, although with a bias toward the fluid theory.
The relative strength of people’s endorsement of the theories at any given time depended on the way questions about nationality were framed, the researchers found. Moreover, and perhaps most interesting for future investigation, the results showed people’s ideas about nationality were tied to their attitudes toward immigration, even after factoring out any differences in political leanings.
The learning-by-teaching effect has been demonstrated in many studies. Students who spend time teaching what they’ve learned go on to show better understanding and knowledge retention than students who simply spend the same time re-studying. What remains unresolved, however, is exactly why teaching helps the teacher better understand and retain what they’ve learned.
For a new study in Applied Cognitive Psychology researchers led by Aloysius Wei Lun Koh set out to test their theory that teaching improves the teacher’s learning because it compels the teacher to retrieve what they’ve previously studied. In other words, they believe the learning benefit of teaching is simply another manifestation of the well-known “testing effect” – the way that bringing to mind what we’ve previously studied leads to deeper and longer-lasting acquisition of that information than more time spent passively re-studying.
In research published in the 1990s, psychologists asked people to list their biggest regrets in life and found that they tended to mention things they hadn’t done, rather than things they had. Now, one of the psychologists behind that seminal research – Thomas Gilovich at Cornell University – together with his colleague Shai Davidai at The New School for Social Research, has looked into the content of people’s regrets, as opposed to how they were brought about (by action or inaction). Across six studies, the pair present new evidence, published in Emotion, that our most enduring regrets concern not living up to our ideal selves (i.e. not becoming the person we wanted to be), as opposed to not living according to our “ought selves” (the person we should have been based on our duties and responsibilities).
You’re at a ten-pin bowling alley with some friends, you bowl your first ball – and it’s a strike. Do you instantly grin with delight? Not according to a study of bowlers, who smiled not at the moment of triumph but when they pivoted in their lanes to look at their fellow bowlers.
That study provided the earliest evidence for a controversial hypothesis, the Behavioural Ecology View (BECV) of facial displays, outlined in detail in a new opinion piece in Trends in Cognitive Sciences. Carlos Crivelli at De Montfort University, Leicester, UK, and Alan Fridlund at the University of California, Santa Barbara, put forward the case that facial displays are not universal, “pre-wired” expressions of emotion – a concept supported by 80 per cent of emotion researchers in a recent poll – but are flexible tools for influencing the behaviour of other people.
Celebrities are people famous for being famous. Have you ever wondered how pop-culture figures become so well-known, even when their rise to the top defies any rational explanation? What is the real root cause of our lemming-like rush to keep tabs on insignificant but famous people? What leads us to share this information on social media? Why do we visit gossip portals and read tabloids, even though they’re worthless to us? A trio of researchers offer partial answers to these questions via a series of creative experiments reported in Psychology of Popular Media Culture.
In 1914, the psychologist Leta Hollingworth’s experiments punctured holes in the prevailing idea that menstruation affects women’s intellect. But a century on, the ovulation cycle continues to interest psychologists, who today focus on how it affects sexual behaviour. A popular evolutionary psychology theory states that during fertile periods, women become more interested in men who display dominant masculine behaviour, as this signals they are likely to provide good genes for any offspring. A University of Goettingen team have now conducted the largest ever test of this idea, published as a pre-print at PsyArXiv.
In the adverts for anti-ageing skin products, everyone is smiling, positively blooming with youthfulness. A canny move by the marketeers, you might think – after all, past research has found most of us believe smiling makes people look younger. Except that, actually, it doesn’t: it makes you look older. That’s according to a new paper in Psychonomic Bulletin & Review that explores an intriguing mismatch between our beliefs and perceptions.
Does the prospect of taking a “Facebook holiday” fill you with dread as you picture a life of social isolation, or does it sound like an appealing and refreshing chance to change priorities?
A new paper in the Journal of Social Psychology has investigated the psychological effects of taking time off from using Facebook. Given that Facebook helps keep us connected but can also expose us to many social stressors, like envy and gossip, the researchers, led by Eric Vanman at the University of Queensland, expected to find a Facebook break would be associated with a drop in life satisfaction, but also a reduction in stress levels. Their findings are largely in line with their predictions “[and] consistent with the general ambivalent feelings that may typify most active users about Facebook”. However, the study also features ambiguities and limitations that may leave sceptical readers unconvinced.
Intelligence is a concept that some people have a hard time buying. It’s too multifaceted, too context-dependent, too Western. The US psychologist Edwin Boring encapsulated this scepticism when he said “measurable intelligence is simply what the tests of intelligence test.” Yet the scientific credentials of the concept are undimmed, partly because intelligence is strongly associated with so many important outcomes in life. Now Utah Valley University researchers Russell Warne and Cassidy Burningham have released evidence that further strengthens the case for intelligence being a valid and useful concept. Their PsyArXiv pre-print presents a cross-study analysis suggesting a single intelligence-like factor underpins mental performance across a wide range of non-Western cultures.