Sharing with others, helping people in need, consoling those who are distressed. All these behaviours can be encouraged by empathy – by understanding what other people are thinking and feeling, and sharing their emotions. Enhance empathy, especially in those who tend to have problems with it – like narcissists – and society as a whole might benefit. So how can it be done?
In fact, the cultivation of empathy is a “presumed benefit” of mindfulness training, note the authors of a new study, published in Self and Identity, designed to investigate this experimentally. People who are “mindfully aware” focus on the present moment, without judgement. So, it’s been argued, they should be better able to resist getting caught up in their own thoughts, freeing them to think more about the mental states of other people. As mindfulness courses are increasingly being offered in schools and workplaces, as well as in mental health settings, it’s important to know what such training can and can’t achieve. The new results suggest it won’t foster empathy – and, worse, it could even backfire.
Most employers like their workers to think of themselves not as employees but as “citizens” of the organisation, proactively engaging in activities like helping others out or coming up with company improvements – activities that aren’t specified in a job description yet help the organisation thrive. But more and more, these supposedly discretionary citizenship behaviours are being demanded overtly by managers – outlined in ‘The Way We Work’ documents, or threatened informally as necessary to get ahead. Now an article in the Academy of Management Journal suggests being forced to be a good citizen has some perverse consequences: when you’re grudgingly good, you become blasé about doing bad.
Fifty years ago, in Connecticut, a series of infamous experiments was taking place. The volunteers believed they were involved in an investigation into learning and memory, and that they would be administering shocks to a test subject whenever he answered questions incorrectly. But despite pretences, the scientist behind the research, Stanley Milgram, wasn’t actually interested in learning. The real topic of study? Obedience.
Milgram recorded how far his participants were willing to go when told to deliver larger and larger shocks. In one version of the study, 26 out of 40 participants continued to the highest shock level – two steps beyond the button labelled “Danger: severe shock”.
But this was 50 years ago – surely the same wouldn’t happen if the experiment were conducted today? That’s what a group of researchers from SWPS University of Social Sciences and Humanities in Poland aimed to find out, in a “partial replication” of Milgram published recently in Social Psychological and Personality Science.
“The believer is not the one who eats when his neighbour beside him is hungry” said the founder of Islam, but many unbelievers see this as the norm: that religious people rarely do the good demanded by their faith. Some evidence seems to support this cynicism. Surveys on tackling inequality and support for welfare often find that the religious show less enthusiasm for helping the poorest in society. This would seem to reflect badly on the faithful, but new research in the Journal of Neuroscience, Psychology, and Economics involving Turkish Muslims offers some redemption. The findings suggest that the religious may be taking a pragmatic approach that expresses their compassion for the needy while remaining consistent with their beliefs about a just deity … and in fact, from a practical perspective, this approach may lead to surprisingly good outcomes.
Most of us have a sense of what it means to be human. Research shows that we agree with each other that traits like friendly, jealous or impatient are more “human” than others like unemotional or selfless. What’s more, we like to see ourselves as human: we care more about human traits and claim to possess them more than other people. In other words, we “self-humanise”, laying claim to the good and the bad as long as they emphasise our own humanity.
But this research on self-humanising presents a conundrum. A different, abundant line of evidence shows that humans fiercely protect a highly positive self-image, supported by cognitive biases that attribute our own failings to circumstances and other people’s to their deficiencies. So, do we really overestimate the bad in ourselves, claiming to be more human, warts and all? According to a critique of the self-humanising field in The Journal of Social Psychology, this is an oversimplification: when it comes to undesirable human traits, we see ourselves as pretty similar to other people.
If you’re in need of some renewed faith in human nature, the research literature on altruism by toddlers is a great place to look. Charming studies have shown that little children will readily go out of their way to help you, such as picking up things you’ve dropped, or passing you stuff you can’t reach. They’re even capable of “paternalistic helping”: ignoring your specific request and instead helping you in a way that you’ll find even more beneficial.
There are some evolutionarily tinged theoretical explanations for why children have these instincts: we’re a highly social species so it makes sense that we’re naturally inclined to help each other as a way to gain status and receive reciprocal favours later. A new paper in Developmental Psychology has taken a slightly different approach, asking: what is it, in the moment, that motivates toddlers to help others? Robert Hepach and his colleagues, including Michael Tomasello who’s conducted a lot of the landmark work on the development of altruism, report that toddlers are helpful, at least in part, because, well, they enjoy it. In fact, based on a new body-language measure of their emotion, they seem to find helping someone else about as pleasurable as they find helping themselves.
Coming up with the perfect recipe for crisps or the ideal marketing strategy for a soft drink used to depend on explicit measures. In focus groups and surveys, consumers were asked which product tasted best or which commercial was most appealing. But these measures are imperfect: consumers may choose to hide their true opinions or they might not be fully aware of their own preferences. Food and drinks companies need more objective measures. Currently their best hope is functional magnetic resonance imaging (fMRI).
The idea is that somewhere in the brain, a “buy button” is hidden away: a region (or combination of regions) that influences your purchase decision. The promise of neuromarketing is that one day, we will be able to find this region, record its activity when you watch an ad or sample a product, and then predict how well this product will sell. So far, the success has been limited. But in a recent study in NeuroImage, Simone Kühn from the University Clinic Hamburg-Eppendorf and her colleagues claim to have found “multiple ‘buy buttons’ in the brain”.
Since she got back from her year abroad, there’s been something different about Sam. Once an avid rule-follower, now she’s breaking them – and when you raise it she explains that these things, after all, are just a matter of perspective.
Can exposure to other countries breed a flexible relationship to the rules, even moral relativism? According to new research in the Journal of Personality and Social Psychology, it can.
Columbia University’s Jackson Lu led an international team to explore this question through a range of studies. They knew living abroad has been associated with positive outcomes such as reduced judgement of other groups and, in particular, cognitive flexibility, which supports creativity. But Lu’s team theorised a possible downside: that this flexibility could extend to the domain of morality. Perhaps experiencing many moral codes can prompt us to question our own.
Work is getting stale, and you’ve recently been courted by an exciting new company for a great role, the one drawback being a slight pay cut. Before you’ve made up your mind, your manager asks you whether you have plans to go elsewhere. If you wanted to avoid showing your hand, you could lie blatantly. You could change the topic. Or, you could palter: use a truthful statement to create a misleading impression.
“Financially, you’re treating me really well and I don’t think there’s anything out there that could match that.”
Paltering is the topic of a new paper in the Journal of Personality and Social Psychology. The authors, Todd Rogers and others at Harvard University, focused on negotiation situations, where access to accurate information had concrete consequences. They found that paltering is fairly common – real-life negotiators reported doing it more frequently than telling a lie, and as commonly as neglecting to share information – and that one reason for this is that they believed it wasn’t as big a deal as lying. In this, they were sadly mistaken.
When we feel ostracised, we’re more likely to behave aggressively. Previous research suggests that vengeance on those who we think have wronged us can be driven by a sense of justice, and may activate neural reward centres. But being ostracised can also lead to generalised aggression, even lashing out at unrelated people, so there seems to be more going on. In new research in the Journal of Personality and Social Psychology, David Chester and C. Nathan DeWall tested the idea that social rejection, by making us feel wounded and unwanted, triggers a need to repair our mood by whatever means available, including through the satisfaction of causing harm to those who have made us suffer. They found that aggression can indeed be a viable method of mood repair.