When you have a disagreement with your boss, how do you respond? Do you consider that you might be at fault and try to consider the other’s viewpoint? Or do you dig in your heels and demand that other people come around to your way of thinking? In other words, do you make wise, practical decisions, or are you prone to being stubborn and petty in the face of criticism?
There’s a huge amount of research into how people differ in their ability to learn things deliberately and “explicitly” – memorising a list of words or instructions, for example. Far less studied is “implicit learning”. Ask a five-year-old to explain the grammatical rules of their language and they’ll likely have no clue where to start. And yet, they do know them – or at least, well enough to form coherent sentences. This kind of unconscious acquisition of abstract knowledge is an example of “implicit” learning.
Implicit learning may be especially important for young children, but adults depend on it, too. It “is recognised as a core system that underlies learning in multiple domains, including language, music and even learning about the statistical structure of our environments,” note the authors of a new paper, published in Cognition.
Socrates famously declared that “the unexamined life is not worth living” and that “knowing thyself” was the path to true wisdom. But is there a right and a wrong way to go about such self-reflection?
Simple rumination – the process of churning your concerns around in your head – isn’t the answer. It’s likely to cause you to become stuck in the rut of your own thoughts and immersed in the emotions that might be leading you astray. Certainly, research has shown that people who are prone to rumination also often suffer from impaired decision-making under pressure and are at substantially increased risk of depression.
Instead, the scientific research suggests that you should adopt an ancient rhetorical method favoured by the likes of Julius Caesar and known as “illeism” – or speaking about yourself in the third person (the term was coined by Samuel Taylor Coleridge from the Latin ille meaning “he, that”). If I were considering an argument that I’d had with a friend, for instance, I might start by silently thinking to myself “David felt frustrated that…” The idea is that this small change in perspective can clear your emotional fog, allowing you to see past your biases.
A body of research has already shown that this kind of third-person thinking can temporarily improve decision making. Now a preprint at PsyArXiv finds that it can also bring long-term benefits to thinking and emotional regulation. It is, according to the authors, “the first evidence that wisdom-related cognitive and affective processes can be trained in daily life and of how to do so.”
In the era of social media and rolling news there’s a constant pressure to be in the know, always on hand with an aperçu or two. Today intellectual humility therefore feels more important than ever – having the insight and honesty to hold your hands up and say you’re ignorant or inexpert about an issue.
Psychologists are responding by taking an increasing interest in intellectual humility, including investigating its consequences for learning and the thinking styles that support it. For a new paper in The Journal of Positive Psychology a team led by Elizabeth Krumrei-Mancuso has continued this endeavour, showing, among other things, that intellectual humility correlates with superior general knowledge. This is a logical outcome because, as the researchers write, “simply put, learning requires the humility to realise one has something to learn.”
It’s now well known that many of us over-estimate our own brainpower. In one study, more than 90 per cent of US college professors famously claimed to be better than average at teaching, for instance – which would be highly unlikely. Our egos blind us to our own flaws.
But do we have an even more inflated view of our nearest and dearest? It seems we do – that’s the conclusion of a new paper published in Intelligence journal, which has shown that we consistently view our romantic partners as being much smarter than they really are.
Take a moment to consider how old you feel. Not your actual, biological age – but your own subjective feelings.
Abundant research over the past few decades has shown that this “subjective age” can be a powerful predictor of your health – a better predictor than your actual age – including your risk of depression, diabetes, hypertension, dementia, hospitalisation for illness and injury, and even mortality. In each case, the younger you feel, the healthier you are.
The link probably goes in both directions. So while it’s true that ill-health may make you feel older, a higher subjective age could also limit your physical activity and increase feelings of vulnerability that make it hard to cope with stress – both of which could, independently, lead to illness. The result could even be a vicious cycle, where feelings of accelerated ageing lead you to become more inactive, and the resulting ill-health then further confirms your pessimistic views. And as I recently wrote for BBC Future, understanding this process could be essential for designing more effective health programmes.
Yannick Stephan at the University of Montpellier has led much of the work examining this phenomenon, and his latest paper, published with colleagues in the journal Intelligence, extends this understanding by revealing a surprising link with IQ. According to this research, the more intelligent we are in our late teens and early 20s, the younger we will feel in our 70s – and this may also be reflected in various markers of biological ageing.
A systematic survey in the US of people’s beliefs about their own intelligence – the first for 50 years – has shown that what was true then is also the case in the modern era: a majority of people think they are smarter than average.
The research, led by Patrick Heck from the Geisinger Health System and published in PLOS One, combined an online survey and phone survey, with each involving 750 people reflecting a cross-section of the US population, balanced in terms of sex, age, education levels and race.
Across both surveys, 65 per cent of participants agreed with the statement “I am more intelligent than the average person.” Around 70 per cent of men versus 60 per cent of women made the above-average claim, and a similar pattern was found in the young and old. There were also no clear racial differences.
This research accords with both the half-century old study from the Russell Sage Foundation, and more recent research that suggests there is an overconfidence bias in the general population. The trouble is that most of the work following the Foundation’s study has often come from self-selecting or convenience samples that are unlikely to be representative of the wider population, or from students, who may be special cases, struggling, for instance, to envisage what the average person is like from their campus existence.
The new surveys also validate another cornerstone of overconfidence research: that the least intelligent tend to be the most overconfident. While university graduates (who are typically one standard deviation higher than average in intelligence) collectively tended to slightly underestimate their intelligence, those participants whose highest qualification was a high school diploma collectively over-shot in estimating theirs. Heck’s team conclude “that Americans’ self-flattering beliefs about intelligence are alive and well several decades after their discovery was first reported.”
Intelligence is a concept that some people have a hard time buying. It’s too multifaceted, too context-dependent, too Western. The US psychologist Edwin Boring encapsulated this scepticism when he said “measurable intelligence is simply what the tests of intelligence test.” Yet the scientific credentials of the concept are undimmed, partly because intelligence is strongly associated with so many important outcomes in life. Now Utah Valley University researchers Russell Warne and Cassidy Burningham have released evidence that further strengthens the case for intelligence being a valid and useful concept. Their PsyArXiv preprint presents a cross-study analysis suggesting a single intelligence-like factor underpins mental performance across a wide range of non-Western cultures.
Basic neuroscience teaches us how individual brain cells communicate with each other, like neighbours chatting over the garden fence. This is a vital part of brain function. Increasingly, however, neuroscientists are zooming out and studying the information processing that happens within and between neural networks across the entire brain, more akin to the complex flow of digital information constantly pulsing around the globe.
This has led them to realise the importance of what they call “brain entropy” – intense complexity and irregular variability in brain activity from one moment to the next, also marked by greater long-distance correlations in neural activity. Greater entropy, up to a point, is indicative of more information processing capacity, as opposed to low entropy – characterised by orderliness and repetition – which is seen when we are in a deep sleep or coma.
A new study in Scientific Reports is the first to examine whether and how ingesting a psychostimulant – in this case caffeine – affects brain entropy. The results show that caffeine causes a widespread increase in cerebral entropy. This dose of neural anarchy is probably welcome, especially when considered in light of another new paper, in PLOS One, which finds that greater brain entropy correlates with higher verbal IQ and reasoning ability.