In a post-truth world of alternative facts, there is understandable interest in the psychology behind why people are generally so wedded to their opinions and why it is so difficult to change minds.
We already know a lot about the deliberate mental processes that people engage in to protect their world view, from seeking out confirmatory evidence (the "confirmation bias") to questioning the methods used to marshal contradictory evidence (the scientific impotence excuse).
Now a team led by Anat Maril at the Hebrew University of Jerusalem report in Social Psychological and Personality Science that they have found evidence of rapid and involuntary mental processes that kick in whenever we encounter opinions we agree with, similar to the processes previously described for how we respond to basic facts.
The researchers write that their demonstration "of such a knee-jerk acceptance of opinions may help explain people's remarkable ability to remain entrenched in their convictions".
From the beginning of recorded history, humanity has been fascinated by the figure of the wise person, wending their path through the tribulations of life, and informing those willing to learn. What sets them apart? Maybe that's the wrong question. In a new review in European Psychologist, Igor Grossmann of the University of Waterloo argues that understanding wisdom involves taking the wise off their pedestal, and seeing wisdom as a set of processes that we can all tap into, with the right attitude, and in the right context.
Mental imagery helps us anticipate the future, and vivid mental pictures inject emotion into our thought processes. If operating in a foreign language diminishes our imagination – as reported by a pair of psychologists at the University of Chicago in the journal Cognition – this could dampen the emotionality of our thoughts and our ability to visualise future scenarios. That would help to explain previous findings showing that bilinguals using their second language make more utilitarian moral judgments, are less prone to cognitive bias and superstition, and are less concerned by risks.
Kids who are better at resisting unhelpful impulses and distractions go on later in life to perform better academically, professionally and socially. But how this kind of self-control develops with age has not been so clear. Teenagers show more self-control than children in many ways, but in other respects – think of their propensity for risk-taking – they actually seem to show less.
In a new paper, published in Developmental Science, Ania Aïte at Paris Descartes University, France, led research investigating whether this might be because there are two types of impulse control – “cool” control, in which emotions are not involved, and “hot” control, in which they are – and that they might show different developmental trajectories. If so, this could have implications for educational interventions aimed at reducing teens’ sometimes dangerous behaviour.
Our current bodily states influence our preferences and our behaviour much more than we usually anticipate – as anyone who has gone shopping hungry and come back with bags full of fattening food can attest. "Even when people have previous experience with a powerful visceral state, like pain, they show surprisingly little ability to vividly recall the state or to predict how it affects someone (including themselves) when they are not experiencing it," write Janina Steinmetz at Utrecht University and her colleagues in their paper in Personality and Social Psychology Bulletin.
The good news is their research suggests we can exploit this phenomenon – we can trick ourselves into thinking we’re feeling differently, thereby influencing our preferences in ways that help us. For instance, one potentially important finding from their paper was that people who thought themselves full went on to choose smaller food portion sizes.
Imagine contemplating which treatment to undertake for a health problem. Your specialist explains there are two possibilities, and strongly endorses one as right for you. But when you discuss it with a friend, she suggests that based on what she’s heard, the other would be better. Another friend, the same. And another. Does there come a point where the friends outweigh the expert? Given enough information – the accuracy of the expert in the past, the degree to which the public have any insight on the issue – you can in theory mathematically “solve” this issue with a probabilistic model. In fact, according to new research published in Thinking and Reasoning, that’s exactly what we do intuitively and with a high degree of accuracy.
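The probabilistic "solution" described above can be sketched as a naive-Bayes calculation over log-odds. Everything here – the function name, the conditional-independence assumption, and the accuracy figures – is illustrative, not drawn from the paper itself:

```python
from math import exp, log

def posterior_prob_expert_right(expert_acc, friend_acc, n_friends_disagree, prior=0.5):
    """Probability that the expert's recommended option is correct,
    given that n independent friends (each individually less accurate)
    all recommend the alternative.

    Assumes each source endorses the correct option with probability
    equal to its accuracy, and that sources are conditionally
    independent - a simplifying assumption for illustration.
    """
    # Prior log-odds that the expert's pick is the right one.
    log_odds = log(prior / (1 - prior))
    # The expert's endorsement shifts the odds in its favour...
    log_odds += log(expert_acc / (1 - expert_acc))
    # ...while each dissenting friend shifts them the other way.
    log_odds += n_friends_disagree * log((1 - friend_acc) / friend_acc)
    odds = exp(log_odds)
    return odds / (1 + odds)
```

With a 90 per cent accurate expert and 60 per cent accurate friends, five dissenting friends leave the expert narrowly favoured (about 0.54), but a sixth tips the balance (about 0.44). Under these assumptions, weak dissenters really can outweigh a strong expert once there are enough of them.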
Surveys and opinion polls are notoriously bad at predicting election results, as a chain of rather unexpected events last year demonstrated. These instruments usually ask people about their explicit attitudes and opinions. Often, however, these "external" proxies are not entirely representative of what a person is really thinking. For example, several studies have shown that implicit attitudes – that is, subtle preferences or biases outside the realm of our consciousness – can be more useful in predicting our future choices.
As scary as this may sound, there is also mounting evidence that our physiological responses can be even more accurate in revealing how we're likely to vote. In a new paper in Social Cognitive and Affective Neuroscience, researchers from Kingston University and the University of Essex have taken a closer look at a voting outcome in the UK that, last year, came as a surprise to a lot of people. Their findings suggest that people's brain responses to statements about the EU were a more accurate predictor of the way they went on to vote in the Brexit referendum than their stated intentions.
Unrelenting faith in the face of overwhelming contradictory evidence is a trait of believers in conspiracy theories that has long confounded researchers. For instance, past research has demonstrated how attempting to use evidence to sway believers of anti-vaccine conspiracy theories can backfire, increasing their certainty in the conspiracy. Could it also be the case that knowing that most people doubt a conspiracy actually makes believing in it more appealing, by fostering in the believer a sense of being somehow special? This question was explored recently in the European Journal of Social Psychology by Roland Imhoff and Pia Karoline Lamberty at Johannes Gutenberg University in Mainz, Germany.
The authors of the new paper, led by Amit Bhattacharjee at Erasmus University, believe this anti-profit bias leads many voters and politicians to endorse anti-profit policies that are likely to produce the very opposite outcomes for society that they want to achieve. "Erroneous anti-profit beliefs may lead to systematically worse economic policies for society, even as they help people satisfy their social and expressive needs on an individual level," they write.