Category: Morality

Our Feelings Towards People Expressing Empathy Depend On Who They’re Empathising With

By Emily Reynolds

We tend to think of empathy as a wholly positive thing, a trait that’s not only favourable to possess but that we should actively foster. Books and courses promise to reveal secret wells of empathy and ways to channel them; some people even charge for “empathy readings”, a service that seems to sit somewhere between a psychic reading and a therapy session.

It would be easy to assume, therefore, that people who express empathy are generally well-liked. But a new study in the Journal of Personality and Social Psychology finds that our feelings towards “empathisers” depend on who they are empathising with. While empathisers were considered warmer overall, participants judged people who expressed empathy for those with troubling political views more harshly, suggesting that we don’t always interpret empathy as a pure moral virtue.

Continue reading “Our Feelings Towards People Expressing Empathy Depend On Who They’re Empathising With”

We Are Less Likely To Dehumanise Prisoners Who Are Approaching The End Of Their Sentence

By Emma Young

Criminals are often characterised in the popular press as “animals” or “cold-blooded”. Such adjectives effectively dehumanise them, and there’s no end of research finding that if we deny fully human emotional and thinking capacities to other people, we are less likely to treat them in a humane way. But how long does prisoner dehumanisation last? Is it a life sentence? Or, wondered the authors of a new paper, published in Personality and Social Psychology Bulletin, does it depend on how long a prisoner has left to serve?

Continue reading “We Are Less Likely To Dehumanise Prisoners Who Are Approaching The End Of Their Sentence”

People Are More Positive About Hacking When They Feel They’ve Been Treated Unfairly

By Emily Reynolds

Type the word “hacker” into any stock photo search engine and you’ll be greeted with pages and pages of images of someone sitting in the dark, typing threateningly at their laptop, and more often than not wearing a balaclava or Guy Fawkes mask. That Matrix-inspired 1990s aesthetic of green code on black is still prevalent — and still implies that hackers have inherently nefarious ends.

More recently, however, the idea of hacking as a prosocial activity has gained more attention. Earlier this year, one group of hackers made headlines for donating $10,000 in Bitcoin to two charities, the result of what they say was the extortion of millions of dollars from multinational companies.

While the charities declined the donations, social media responses were more mixed, with some praising the hackers. And in a new study, Maria S. Heering and colleagues from the University of Kent argue that our view of hacking is somewhat malleable: when people were treated unfairly and the institutions responsible did nothing to redress their grievances, they felt more positive about hackers who targeted the source of their anger.

Continue reading “People Are More Positive About Hacking When They Feel They’ve Been Treated Unfairly”

After Cheating On A Test, People Claim To Have Known The Answers Anyway

By Emily Reynolds

Cheating is common, ranging from benign instances like looking up an answer on your phone during a pub quiz, to the fairly major, such as using a series of coughs to fraudulently bag yourself a million pounds on a popular TV game show. But wherever we fall on that scale, research suggests, we’re still likely to think of ourselves as honest and trustworthy.

There’s something of a tension here — we’re seemingly both prone to cheating and convinced of our own integrity. Matthew L. Stanley and colleagues from Duke University have one explanation for this apparent contradiction in their latest paper in Psychonomic Bulletin & Review: when we cheat, we claim we knew the answers all along.

Continue reading “After Cheating On A Test, People Claim To Have Known The Answers Anyway”

Leaders Can Feel Licensed To Behave Badly When They Have Morally Upstanding Followers

By Emma Young

Countless studies have investigated how a leader’s behaviour influences their followers. There’s been very little work, though, on the reverse: how followers might influence their leaders. Now a new paper, published in the Journal of Applied Psychology, helps to plug that gap with an alarming finding: good, morally upstanding followers can create less ethical leaders.

Continue reading “Leaders Can Feel Licensed To Behave Badly When They Have Morally Upstanding Followers”

People Who Want To Be More Empathic May Also Develop “Liberal” Moral Values

By Emily Reynolds

No matter how happy you are in yourself, there’s probably something about your personality you’d like to change. Maybe you feel you’re too uptight or want to be more outgoing, or perhaps you’d like to be less moody or more tolerant of other people’s shortcomings.

It’s likely that such a change in personality will have some kind of social consequence, whether that’s in your relationship with your spouse or your ability to get on with your colleagues. But it might also affect which moral values you hold important.

That’s what Ivar R. Hannikainen and colleagues suggest in a new paper in the Journal of Research in Personality. They found that growth in one area, empathy, was associated with a shift in moral foundations to a more classically “liberal” style of morality.

Continue reading “People Who Want To Be More Empathic May Also Develop “Liberal” Moral Values”

When Causing Harm Is Unavoidable, We Prefer To Cause More Harm For More Benefits Rather Than Less Harm For Fewer

By Matthew Warren

Imagine that you’re an official faced with an unenviable decision: you must choose whether to establish a farm on existing land which can produce enough to feed 100 hungry families, or cut down an acre of rainforest to create a larger farm able to feed 500 hungry families. What choice would you make?

If you chose not to cut down the rainforest, you’re in the majority. In a new paper in Psychological Science, participants tended to avoid choosing to harm the rainforest, despite the benefits the larger farm would bring. This isn’t surprising: time and again, researchers have found that we will avoid causing harm if possible.

Now imagine that your choice is made harder. There’s no free land left; you have to cut down some of the rainforest. Would you cut down one acre to feed 100 families, or two acres to feed 500?

It’s an interesting question, because although researchers believe we’re generally averse to causing harm, they hadn’t really studied how we make decisions when some amount of harm is unavoidable. And, perhaps surprisingly, in this second scenario almost 80% of people chose to do more damage, cutting down two acres of forest rather than one. In fact, across five other studies as well, Jonathan Berman from London Business School and Daniella Kupor from Boston University find that in situations where harm is unavoidable, people consistently try to maximise the social benefit rather than minimise the amount of harm caused.

Continue reading “When Causing Harm Is Unavoidable, We Prefer To Cause More Harm For More Benefits Rather Than Less Harm For Fewer”

When We Follow Orders To Hurt Someone, We Feel Their Pain Less Than If We Hurt Them Freely

By Emma Young

It’s one of the best-known and most controversial experiments in psychology: in 1963, Stanley Milgram reported that, when instructed, many people are surprisingly willing to deliver apparently dangerous electrical shocks to others. For some researchers, this finding, along with follow-up studies by the team, reveals how acting “under orders” can undermine our moral compass.

Both Milgram’s interpretation of his findings and his methods have been criticised. However, the results have largely been replicated in experiments run in the US, Poland, and elsewhere. And in 2016, a brain-scanning study revealed that when we perform an act under coercion rather than freely, our brain processes it more like a passive action than a voluntary one.

Now a new study, from a group that specialises in the neuroscience of empathy, takes this further: Emilie Caspar at the Social Brain Lab at the Netherlands Institute for Neuroscience and colleagues report in NeuroImage that when we follow orders to hurt someone, there is reduced activity in brain networks involved in our ability to feel another’s pain. What’s more, this leads us to perceive pain that we inflict as being less severe. This process could, then, help to explain the dark side of obedience.

Continue reading “When We Follow Orders To Hurt Someone, We Feel Their Pain Less Than If We Hurt Them Freely”

Private Good Deeds That Appear To Compensate For Bad Public Behaviour Make People Seem Hypocritical

By Emma Young

It’s hard to find a clearer example of moral hypocrisy than this: in 2015, Josh Duggar, a family values activist and director of a lobby group set up “to champion marriage and family as the foundation of civilisation, the seedbed of virtue, and the wellspring of society”, was outed as holding an account with a dating service for people who are married or in relationships.

As Kieran O’Connor at the University of Virginia and colleagues point out in a new paper in the Journal of Personality and Social Psychology: Attitudes and Social Cognition, Duggar’s apparently virtuous public image was in stark contrast to his private behaviour. This was a classic case, then, of hypocrisy. But as the team now reveal through a compelling series of seven studies, another type of discrepancy is seen as hypocritical too: when individuals are perceived to use private good deeds to assuage their guilt over morally dubious public behaviour.

Continue reading “Private Good Deeds That Appear To Compensate For Bad Public Behaviour Make People Seem Hypocritical”

We Tend To See Acts We Disapprove Of As Deliberate — A Bias That Helps Explain Why Conservatives Believe In Free Will More Than Liberals

By guest blogger Jesse Singal

One of the most important and durable findings in moral and political psychology is that there is a tail-wags-the-dog aspect to human morality. Most of us like to think we have carefully thought-through, coherent moral systems that guide our behaviour and judgements. In reality our behaviour and judgements often stem from gut-level impulses, and only after the fact do we build elaborate moral rationales to justify what we believe and do.

A new paper in the Journal of Personality and Social Psychology examines this issue through a fascinating lens: free will. Or, more specifically, via people’s judgements about how much free will others had when committing various transgressions. The team, led by Jim A. C. Everett of the University of Kent and Cory J. Clark of Durham University, ran 14 studies aimed at evaluating the possibility that, at least some of the time, the moral tail wags the dog: first people decide whether someone is blameworthy, and only then judge how much free will they had, in a way that allows them to justify blaming those they want to blame and excusing those they want to excuse.

Continue reading “We Tend To See Acts We Disapprove Of As Deliberate — A Bias That Helps Explain Why Conservatives Believe In Free Will More Than Liberals”