Why should the “better-than-average” effect be so pronounced for moral traits? In new work, published in Social Psychological and Personality Science, Ben Tappin and Ryan McKay at Royal Holloway, University of London have found that it’s because we’re especially irrational when it comes to evaluating moral traits. Moral superiority appears to be “a uniquely strong and prevalent form of positive illusion,” they write.
If a friend sees you suffering and tells you “I feel your pain”, it may be more than an expression of empathy. For about a quarter of people, it could be literally true. A recent study, led by Thomas Grice-Jackson at the University of Sussex, found that 27 per cent of participants experienced so-called “mirror pain” – watching someone falling off a bicycle or receiving an injection, for instance, caused them to experience physical pain of their own.
Now in a paper in Frontiers in Human Neuroscience, the same team of researchers has explored the neurological underpinnings of mirror pain. People who have this experience don’t just show more activity in the so-called “pain matrix” (the network of brain regions linked to the experience of pain); they also show unusual patterns of neural activity that suggest they struggle to distinguish other people’s experience from their own.
As social creatures, we depend on accurately recognising and understanding the mental states of others (their intentions, knowledge, beliefs, etc.) for our social bonds and interactions. In fact, in today’s multi-cultural world and strongly divided political climate, this skill – known as Theory of Mind – is perhaps more important than ever. A recent study published in the Journal of Cognitive Enhancement proposes that an effective way to develop our Theory of Mind lies in learning to better understand ourselves.
“I don’t think you are truly mean, you have sad eyes,” Tormund Giantsbane ponders the true self of Sandor ‘The Hound’ Clegane in the Game of Thrones episode “Beyond The Wall”.
Who are you really? Is there a “true you” beneath the masquerade? According to a trio of psychologists and philosophers writing in Perspectives on Psychological Science, the idea that we each have a hidden true or authentic self is an incredibly common folk belief, and moreover, the way most of us think about these true selves is remarkably consistent, even across different cultures, from Westeros to Tibet.
This makes the concept of a true self useful because it helps explain many of the judgments we make about ourselves and others. Yet, from a scientific perspective, there is actually no such thing as the true self. “The notion that there are especially authentic parts of the self, and that these parts can remain cloaked from view indefinitely, borders on the superstitious,” write Nina Strohminger and her colleagues at Yale University.
Most brain imaging studies involving transgender people or people with gender dysphoria have focused on whether their brains look more like what’s typical for the gender they identify with, rather than the gender they were assigned at birth based on their biological sex. For example, whether trans men have “masculine” brains, and trans women have more “feminine” brains.
The results have been mixed and, if anything, point towards trans people having brains with distinct features that are neither stereotypically male nor female.
A new study in Brain Imaging and Behavior adds to this trend, showing that trans men have unusual patterns of connectivity in brain networks involved in processing the self, as compared with male and female controls. “The present data do not support the hypothesis that sexual differentiation of the brain of individuals with gender dysphoria is in the opposite direction as their sex assigned at birth,” the researchers said, adding that the unusual connectivity pattern they found in trans men “was detected in comparison with both male and female controls, and there were no differences between the control groups”.
“After decades of debate, a consensus is emerging about the way self-esteem develops across the lifespan.” So wrote a pair of psychologists – one from King’s College London, the other from the University of California, Davis – in a paper published back in 2005. That “consensus” is that self-esteem is relatively high in childhood, drops during adolescence, and rises gradually through adulthood before dropping sharply in old age. But a new paper suggests that there’s a major blip in this pattern for one huge part of the population. Becoming a mother triggers a decline in self-esteem and relationship satisfaction over at least the next three years, according to research on nearly 85,000 mothers in Norway, forthcoming in the Journal of Personality and Social Psychology.
Our autobiographical memory is fundamental to the development of our sense of self. However, according to past research, it may be compromised in autism, together with other skills that are also vital for self understanding, such as introspection and the ability to attribute mental states to others (known as mentalising).
For example, experiments involving autistic children have highlighted retrieval difficulties, “impoverished narratives”, and a greater need for prompting, while also suggesting that semantic recall (facts from the past) may be impaired in younger individuals.
Now a UK research team, led by Sally Robinson from London’s St. Thomas’ Hospital, has published the first attempt to assess the nature of – and relationships between – autobiographical memory, mentalising and introspection in autism. Reporting their findings in the journal Autism, the group hope their results will shed more light on the way that autistic children and teens develop a sense of self.
Feeling authentic in a relationship – that is, feeling like you are able to be yourself, rather than acting out of character – is healthy, not just for the relationship, but for your wellbeing in general. This makes sense: after all, putting on a fake show can be exhausting. But dig a little deeper and things get more complicated because there are different ways to define who “you” really are.
Is the real you how you actually think and behave, for instance? Or, taking a more dynamic perspective, is it fairer to say that the true you is the person you aspire to be: what psychologists call your “ideal self”?
For a paper in Personality and Social Psychology Bulletin, Muping Gan and Serena Chen asked members of the public about this, and 70 per cent of them thought that being able to be your actual self was more important for feeling authentic in a relationship than being able to be your ideal self.
But contrary to this folk wisdom, across several studies, the researchers actually found evidence for the opposite – that is, feelings of authenticity in a relationship seem to arise not from being our actual selves in the relationship, but from feeling that we can be our best or ideal self.
If you look at the research literature on self-serving biases, it’s little surprise that critical thinking – much needed in today’s world – is such a challenge. Consider three human biases that you may already have heard of: most of us think we’re better than average at most things (also known as illusory superiority or the Lake Wobegon Effect); we’re also prone to “confirmation bias”, which is favouring evidence that supports our existing views; and we’re also susceptible to the “endowment effect”, which describes the extra value we place on things as soon as they are ours.
A new paper in the Quarterly Journal of Experimental Psychology by Aiden Gregg and his colleagues at the University of Southampton extends the list of known biases by documenting a new one that combines elements of the better-than-average effect, confirmation bias and the endowment effect. Gregg’s team have shown that simply asking participants to imagine that a theory is their own biases them to believe in the truth of that theory – a phenomenon that the researchers have called the Spontaneous Preference For Own Theories (SPOT) Effect.
Taking selfies makes us feel self-conscious and sends tremors through our self-esteem, according to new research published in Personality and Individual Differences. One group of undergraduates at Yonsei University in Seoul used their phone’s camera to take a selfie, while a control group photographed a cup on a desk. Afterwards, selfie takers showed signs of increased social sensitivity, at least according to a test that involved detecting the direction of arrows on a computer screen. The arrows appeared in locations previously occupied by the features of a face, and the idea was that participants would be more focused on these facial features, and thus quicker to detect the arrows, if they were in a socially vigilant state.
The fact that selfie takers showed enhanced social sensitivity (they were quicker to detect the arrows) is consistent with the way that our social sensitivity goes up when we are in front of a mirror or when someone else points a video camera at us, making us acutely aware of the imperfections we have on show.
The researchers, graduate students at the university, used this indirect measure to assess social sensitivity because they thought people might not respond honestly if they were simply asked how they were feeling.
In a similar vein, the researchers used an indirect measure to test whether taking a selfie affected participants’ self-esteem: specifically, whether it shrank their written signature compared to its size at the start of the study (past research has linked bigger signatures with greater self-esteem). It did, but only for selfies that were simply saved to the phone rather than posted to social media. The authors speculated that the act of taking a selfie hurts self-esteem by bringing feelings about personal imperfections to the fore, but that this wound can be salved through the self-promotional aspect of sharing your image with the wider world. On this reading, selfie-taking is a self-esteem rollercoaster, one that might put you back more or less where you started.