It’s really awkward when you’re chatting to someone whose sense of appropriate interpersonal space is way too close. There’s the option of performing a subtle backward shuffle, but what if they simply close the gap again?
Our judgments about such things obviously vary with individual personality – people with more social anxiety tend to prefer a greater distance – and with the nature of our relationship with the other person. But culture must surely play a part too.
To find out how preferred interpersonal distances vary across the world, the authors of a new study in the Journal of Cross-Cultural Psychology approached – not too close, presumably – nearly 9,000 participants in 42 countries and asked them to indicate on a simple graphic how close another person – either a stranger, friend or more intimate relation – could get to them during a conversation for things to remain comfortable.
In the TV series Better Call Saul, Saul’s brother Chuck believes that electromagnetic signals from mobile phones and other devices make him seriously ill. He lives as a recluse and uses a foil blanket to protect himself. By some estimates, millions of people – around 5 per cent of the population – believe that they too suffer from “electrosensitivity” or “electromagnetic hypersensitivity”. Though they may not suffer as much as Chuck, these individuals claim that wi-fi and other signals make them ill, triggering headaches and other symptoms.
The medical consensus based on double-blind trials (in which neither researcher nor test subject knows when a test device is real or a sham) is that while the experience of electrosensitivity-related symptoms may be real, they are not caused by electromagnetic fields. More likely is that the symptoms arise from a “nocebo effect” – a strong belief that the fields are harmful.
A new study in the Journal of Health Psychology sheds new light on electrosensitivity by suggesting that it is people who feel especially connected to nature – normally considered a positive trait – who may be particularly likely to suffer from electrosensitivity, probably because their love of nature is accompanied by a heightened negative attitude to anything they consider artificial.
Zsuzsanna Dömötör and her colleagues surveyed 510 people online, 74 of whom described themselves as electrosensitive. The electrosensitive participants tended to score higher than the others on modern health worries in general (related to things like pollution and tainted food), on sensitivity to bodily symptoms, and on nature relatedness (measured by agreement with items like “I always think about how my actions affect the environment” and “My ideal vacation spot would be a remote, wilderness area”). What’s more, nature connectedness interacted with the other variables: people prone to modern health worries were especially likely to complain of electrosensitivity if they also felt a connection with nature.
The human mind has been so successful in transforming the material world that it is easy to forget that it too is subject to its own constraints. From biases in our judgment to the imperfection of our memory, psychology has done useful work mapping out many of these limits, yet when it comes to the human imagination, most of us still like to see it as something boundless. But new research in the journal Cognition, on the capacity of our visual imagination, suggests that we soon hit its limits.
If the courts wanted to know if a suspected sex offender was attracted to children, they could ask him or her, or they could ask experts to measure signs of the suspect’s sexual arousal while he or she looked at different images. But a devious suspect would surely lie about their interests, and they could distract themselves to cheat the physical test.
Brain scans offer an alternative strategy: research shows that when we look at images that we find sexually attractive, our brains show distinct patterns of activation. But of course, the same issues of cheating and deliberate distraction could apply.
Unless, that is, you could somehow prevent the suspect from knowing what images they were looking at, by using subliminal stimuli that can’t be seen at a conscious level. Then you could see how their brain responds to different types of image without the suspect even knowing what they were looking at.
This is the essence of a strategy tested in a new paper in Consciousness and Cognition. Martina Wernicke at Asklepios Forensic Psychiatric Hospital of Göttingen and her colleagues have provided a partial proof of principle that it might one day be possible to use subliminally presented images in a brain scanner to provide a fraud-proof test of a person’s sexual interests. It’s a potentially important breakthrough for crime prevention – given that deviant sexual interest is one of the strongest predictors of future offences – but it also raises important ethical questions.
Academically successful children are more likely to drink alcohol and smoke cannabis in their teenage years than their less academic peers. That’s according to a study of over 6,000 young people in England published recently in BMJ Open by researchers at UCL. While the results may sound surprising, they shouldn’t be. The finding is in fact consistent with earlier research that showed a relationship between higher childhood IQ and the use in adolescence of a wide range of illegal drugs.
When you experience frustrations at work – spats with colleagues, or last-minute demands – it’s natural to want to voice your feelings. And surely it’s healthier. After all, better out than in! Not according to new evidence in the European Journal of Work and Organizational Psychology that shows complaining about negative events actually cements their impact. The researchers Evangelia Demerouti and Russell Cropanzano recommend trying instead to meet the slings and arrows of everyday indignity with all the “sportsmanship” you can muster.
Some fortunate people have more “working memory” than others. It’s as if they have an extra pair of hands available for mental juggling; extremely useful for doing arithmetic and similar tasks in your head. These folk with abundant working memory capacity also tend to fare well academically and in their careers. Little surprise that “brain training” games like Lumosity and Cogmed target working memory in pursuit of these knock-on benefits (though the evidence that the training brings such benefits is weak).
What is surprising is the discovery a number of years ago that mentally dextrous people with greater working memory capacity seem to be particularly susceptible to “brain freeze” or choking under pressure.
For a new study in the Journal of Applied Research in Memory and Cognition, researchers at the University of Chicago and Michigan State University attempted to find out more about why this happens. Their results suggest that it’s actually only a subgroup of high-working-memory people who have this problem, and that the culprit is their high distractibility. These high-ability chokers or brain-freeze victims are “typically reliant on their higher working memory resources for advanced problem solving”, but their poor attentional control renders them easily distracted by anxiety, causing their usual mental deftness to break down when the pressure is on.
Most of us have a sense of what it means to be human. Research shows that we agree with each other that traits like friendly, jealous or impatient are more “human” than others like unemotional or selfless. What’s more, we like to see ourselves as human: we care more about human traits and claim to possess them more than other people. In other words, we “self-humanise”, laying claim to the good and the bad as long as they emphasise our own humanity.
But this research on self-humanising presents a conundrum. A different, abundant line of evidence shows that humans fiercely protect a highly positive self-image, supported by cognitive biases that attribute our own failings to circumstances and other people’s to their deficiencies. So, do we really overestimate the bad in ourselves, claiming to be more human, warts and all? According to a critique of the self-humanising field in The Journal of Social Psychology, this is an oversimplification: when it comes to undesirable human traits, we see ourselves as pretty similar to other people.
Most people who undertake psychotherapy seem to benefit from it. How do we know? Arguably, the most important evidence comes from meta-analyses that combine the results from many – sometimes hundreds – of randomised controlled trials. Based on this, it’s been estimated that psychotherapy is effective for about 80 per cent of people (meanwhile, between five and 10 per cent of clients may suffer adverse effects).
But now the more concerning news: a team of researchers led by Evangelos Evangelou at the University of Ioannina, Greece, has assessed the quality of 247 of these psychotherapy meta-analyses, and they report in Acta Psychiatrica Scandinavica that many of them have serious methodological shortcomings.
Coincidentally, a separate research group led by Brent Roberts at the University of Illinois, Urbana-Champaign has just published in the Journal of Personality some of the first observational data on how people’s personalities change after undertaking psychotherapy. In contrast to what’s been found in the clinical literature, they report that people who’ve been in therapy seem to show negative changes in personality and other psychological outcomes.
Racism and prejudice are sometimes blatant, but often manifest in subtle ways. The current emblem of these subtle slights is the “microaggression”, a concept that has generated a large programme of research and launched itself into the popular consciousness – prompting last month’s decision by Merriam-Webster to add it to their dictionary. However, a new review in Perspectives on Psychological Science by Scott Lilienfeld of Emory University argues that core empirical and conceptual questions about microaggressions remain unaddressed, meaning the struggle against them takes place on a confusing battlefield, one where it’s hard to tell friend from foe.