Where is the future? The tendency in our culture – and most, but not all, others – is to compare the body’s movement through space with its passage through time: ahead are the things we are on our way to encounter. We intuit that the past is linked to the space behind and the future to that in front. But research in the Journal of Experimental Psychology: General has found that some Western people buck this tendency: those born blind.
Finger counting by young kids has traditionally been frowned upon because it’s seen as babyish and a deterrent to mental calculation. However, a new Swiss study in the Journal of Cognitive Psychology has found that six-year-olds who finger counted performed better at simple addition, especially if they used an efficient finger counting strategy. What’s more, it was the children with higher working memory ability – whom you would expect to have less need for their fingers – who were more inclined to finger count, and to do so in an efficient way. “Our study advocates for the promotion of finger use in arithmetic tasks during the first years of schooling,” said the researchers Justine Dupont-Boime and Catherine Thevenot at the Universities of Geneva and Lausanne.
Mental imagery helps us anticipate the future, and vivid mental pictures inject emotion into our thought processes. If operating in a foreign language diminishes our imagination – as reported by a pair of psychologists at the University of Chicago in the journal Cognition – this could dampen the emotionality of our thoughts and our ability to visualise future scenarios. That, in turn, would help explain previous findings that bilinguals using their second language make more utilitarian moral judgments, are less prone to cognitive bias and superstition, and are less concerned by risks.
The desire to catch people in a lie has led to the development of techniques that are meant to detect the physical markers of dishonesty – from the polygraph to brain scans. However, these methods are often found wanting. The insights of cognitive psychologists have arguably fared better, based on the idea that lying is more mentally demanding than telling the truth – real knowledge is automatically called to mind when we are questioned, and this needs to be inhibited before we answer, leading to slower responses. Unfortunately, new research in the Journal of Experimental Psychology: Applied seems to pour cold water on the idea of using these subtle reaction-time differences to develop objective (and cheap) measures to get at the truth. The findings suggest that all it takes to render this cognitive approach ineffective is a prepared false alibi.
Some researchers hope that focusing on the cognitive, neural, genetic and social processes that contribute to symptom dimensions – like anxiety-depression or social withdrawal – may be more fruitful than trying to understand the causes of different diagnostic categories, like “schizophrenia” or “major depression”. It’s in this vein that a new paper in Biological Psychiatry has used a simple perceptual task to investigate how judgment confidence, judgment accuracy and metacognition (judgment insight) are related to various trans-diagnostic symptom dimensions in the general public.
Ego depletion is the notion that willpower is a fuel that gets burned away by effort, and once it burns low we lose our focus and bow to our immediate desires. However, this once dominant theory has recently come into question, thanks in part to a large-scale replication that failed to find an ego-depletion effect and a meta-analysis that argued that the size of the effect is minimal. Complicating the picture, other recent findings have provided a strong demonstration of the effect. But now researchers from Johannes Gutenberg University in Mainz have released a preprint at PsyArXiv in which they suggest the debates over the size of the ego-depletion effect are missing the point: when you look over the long term, they argue, ego depletion becomes meaningless.
It’s common for psychologists to use the terms “self-control” and “cognitive control” interchangeably. Consider the introduction to a review paper published recently in Trends in Cognitive Sciences on whether our self-control is limited or not (I’ve added the emphases): “Whereas cognitive control relies on at least three separate (yet related) executive functions – task switching, working-memory, and inhibition – at its heart, self-control is most clearly related to inhibitory cognitive control …”
When scholars do make a distinction, they mostly use self-control to refer to the ability to delay immediate gratification in the service of a longer-term goal, whereas they use the term cognitive control to refer to the related ability to ignore distracting information or stimuli. Defined this way, do self-control and cognitive control essentially involve the same mental processes? According to a new study by Stefan Scherbaum at Technische Universität Dresden and his colleagues in Acta Psychologica, they do not.
Video games do not enjoy the best of reputations. Violent games in particular have been linked with aggression, antisocial behaviour and alienation among teens. For example, one study found that playing a mere 10 minutes of a violent video game was enough to reduce helping behaviour in participants.
However, some experts are sceptical about whether games really cause aggression and, even if the games are to blame, it remains unclear what drives their harmful effects. Earlier studies identified empathy as a key trait that may be affected by violent gameplay. Now a study by Laura Stockdale at Loyola University Chicago and her colleagues in Social Cognitive and Affective Neuroscience has taken a closer look at how gamers and non-gamers differ at a neural level, uncovering evidence that suggests chronic violent gameplay may affect emotional brain processing, although more research is needed to confirm this.
Distressing conditions including PTSD, depression and anxiety have something in common: a difficulty in suppressing unwanted thoughts. Negative self-judgments and re-experienced traumas directly impact mental health and make recovery harder by intruding into the new experiences that should provide distance and a mental fresh start. Understanding what’s involved in thought suppression may therefore be one key to helping people with these conditions. Now research in Nature Communications has uncovered an important new brain process that may help explain why some people struggle to control their thoughts.
For years, “ego depletion” has been a dominant theory in the study of self-control. This is the intuitive idea that self-control or willpower is a limited resource, such that the more you use up in one situation, the less you have left over to deploy in another. It makes sense of the everyday experience of coming home after a hard day at the office, abandoning all constructive plans, and instead bingeing on snacks in front of the TV.
The trouble is, the theory has taken some hard knocks lately, including a failed joint replication attempt by 23 separate labs. Critics have pointed out that most supportive studies – and there are over 200 of them – are small and underpowered. A meta-analysis that corrected for a positive bias in the existing literature concluded that ego depletion is not real. A study in India – where there’s a cultural belief that exercising self-control is energising – even found evidence for “reverse ego depletion”.
It’s not easy to weigh the evidence for and against, but perhaps the science is tipping back in favour of ego depletion. Two new studies, made publicly available on the PsyArXiv preprint website, provide what the researchers at Texas A&M University, led by Katie Garrison, describe as “the strongest evidence yet of the ego depletion effect”.