Cheating is common, ranging from benign instances like looking up an answer on your phone during a pub quiz, to the fairly major, such as using a series of coughs to fraudulently bag yourself a million pounds on a popular TV game show. But wherever we fall on that scale, research suggests, we’re still likely to think of ourselves as honest and trustworthy.
There’s something of a tension here — we’re seemingly both prone to cheating and convinced of our own integrity. Matthew L. Stanley and colleagues from Duke University have one explanation for this apparent contradiction in their latest paper in Psychonomic Bulletin & Review: when we cheat, we claim we knew the answers all along.
Countless studies have investigated how a leader’s behaviour influences their followers. There’s been very little work, though, on the reverse: how followers might influence their leaders. Now a new paper, published in the Journal of Applied Psychology, helps to plug that gap with an alarming finding: good, morally upstanding followers can create less ethical leaders.
No matter how happy you are in yourself, there’s probably something about your personality you’d like to change. Maybe you feel you’re too uptight or want to be more outgoing, or perhaps you’d like to be less moody or more tolerant of other people’s shortcomings.
It’s likely that such a change in personality will have some kind of social consequence, whether that’s in your relationship with your spouse or your ability to get on with your colleagues. But it might also affect which moral values you hold important.
That’s what Ivar R. Hannikainen and colleagues suggest in a new paper in the Journal of Research in Personality. They found that growth in one area, empathy, was associated with a shift in moral foundations to a more classically “liberal” style of morality.
Imagine that you’re an official faced with an unenviable decision: you must choose whether to establish a farm on existing land which can produce enough to feed 100 hungry families, or cut down an acre of rainforest to create a larger farm able to feed 500 hungry families. What choice would you make?
If you chose not to cut down the rainforest, you’re in the majority. In a new paper in Psychological Science, participants tended to avoid choosing to harm the rainforest, despite the benefits the larger farm would bring. This isn’t surprising: time and again, researchers have found that we will avoid causing harm if possible.
Now imagine that your choice is made harder. There’s no free land left; you have to cut down some of the rainforest. Would you cut down one acre to feed 100 families, or two acres to feed 500?
It’s an interesting question, because although researchers believe we’re generally averse to causing harm, they hadn’t really studied how we make decisions when some amount of harm is unavoidable. And, perhaps surprisingly, in this second scenario almost 80% of people chose to do more damage, cutting down two acres of forest rather than one. In fact, across five other studies as well, Jonathan Berman from London Business School and Daniella Kupor from Boston University found that in situations where harm is unavoidable, people consistently try to maximise the social benefit, rather than minimise the amount of harm caused.
It’s one of the best-known and most controversial experiments in psychology: in 1963, Stanley Milgram reported that, when instructed, many people are surprisingly willing to deliver apparently dangerous electrical shocks to others. For some researchers, this — along with follow-up studies by the team — reveals how acting “under orders” can undermine our moral compass.
Milgram’s interpretation of his findings, and his methods too, have been criticised. However, the results have largely been replicated in experiments run in the US, Poland, and elsewhere. And in 2016, a brain-scanning study revealed that when we perform an act under coercion rather than freely, our brain processes it more like a passive action than a voluntary one.
Now a new study, from a group that specialises in the neuroscience of empathy, takes this further: Emilie Caspar at the Social Brain Lab at the Netherlands Institute for Neuroscience and colleagues report in NeuroImage that when we follow orders to hurt someone, there is reduced activity in brain networks involved in our ability to feel another’s pain. What’s more, this leads us to perceive pain that we inflict as being less severe. This process could, then, help to explain the dark side of obedience.
It’s hard to find a clearer example of moral hypocrisy than this: in 2015, Josh Duggar, a family values activist and director of a lobby group set up “to champion marriage and family as the foundation of civilisation, the seedbed of virtue, and the wellspring of society”, was outed as holding an account with a dating service for people who are married or in relationships.
As Kieran O’Connor at the University of Virginia and colleagues point out in a new paper in the Journal of Personality and Social Psychology: Attitudes and Social Cognition, Duggar’s apparently virtuous public image was in stark contrast to his private behaviour. This was a classic case, then, of hypocrisy. But as the team now reveal through a compelling series of seven studies, another type of discrepancy is seen as being hypocritical too. That’s when individuals are perceived to use private good deeds to assuage their guilt over morally dubious public works.
One of the most important and durable findings in moral and political psychology is that there is a tail-wags-the-dog aspect to human morality. Most of us like to think we have carefully thought-through, coherent moral systems that guide our behaviour and judgements. In reality our behaviour and judgements often stem from gut-level impulses, and only after the fact do we build elaborate moral rationales to justify what we believe and do.
A new paper in the Journal of Personality and Social Psychology examines this issue through a fascinating lens: free will. Or, more specifically, via people’s judgments about how much free will others had when committing various transgressions. The team, led by Jim A. C. Everett of the University of Kent and Cory J. Clark of Durham University, ran 14 studies designed to evaluate the possibility that, at least some of the time, the moral tail wags the dog: first people decide whether someone is blameworthy, and only then judge how much free will that person had, in a way that allows them to justify blaming those they want to blame and excusing those they want to excuse.
No matter how high your self-confidence, it’s likely that you have certain traits you’d change given the opportunity: maybe you’d turn down your anxiety, feel more outgoing in company, or be a bit less lazy. One 2016 study found that 78% of people wanted to better embody at least one of the Big Five personality traits (extraversion, emotional stability, conscientiousness, agreeableness, or openness to experience), so the desire to change who you are is not uncommon.
But are we so keen to change how moral we are? That is, how concerned are we really about being a good or bad person? A new study published in Psychological Science suggests that we’d rather spend time improving those parts of us that aren’t morally relevant, with traits like honesty, compassion and fairness taking a back seat.
There are lots of differences between those who express opposing political affiliations — and they may not just be ideological. Liberals and conservatives have different shopping habits, for instance, with one series of studies finding that liberals preferred products that made them feel unique, whilst conservatives picked brands that made them feel better than others. They even view health risks differently when they’re choosing what to eat.
How do you measure the success of a child’s education? Test results are one thing, and according to a recent global survey, British children have risen in the league tables for both maths and reading. However, these same teens reported among the lowest levels of life satisfaction. They may be performing well academically, but they’re not thriving.
This isn’t a problem only in the UK, of course. At a recent conference that I attended, organised by the Templeton World Charity Foundation, research psychologists, education specialists, economists and philosophers from around the world met to discuss how to help individuals and societies flourish in the 21st century. One word hung in the air as key: “character”.