Various visual impairments and abnormalities – such as atypical eye movement patterns and blink rates, and retinal problems – are more common in people diagnosed with schizophrenia than in the general population, suggesting these issues may contribute to the development of the condition. Yet paradoxically, since the 1950s, there have also been intriguing hints that people who are blind from birth or an early age are less likely to develop schizophrenia and other psychoses, suggesting blindness may act as a protective factor against the illness.
Previous findings – mostly from case-study research – suggested that cortical blindness (which results from abnormalities in the occipital cortex of the brain, rather than in the eyes) may even be completely protective. As far as the authors of a new study are aware, not a single case of schizophrenia has ever been reported in someone who is cortically blind.
“Note that most authors are cautious to add that ‘absence of evidence is not evidence of absence’,” Vera Morgan at the University of Western Australia told me. But a total of zero documented cases among such people to date is striking.
Around 30 per cent of British children fail to meet expected targets in reading or maths at age 11. These children face a future of continuing difficulties in education, as well as poorer mental health and employment prospects. Understanding why some kids struggle – and providing them with tailored support as early as possible – is clearly vital. Some will be diagnosed with a specific disorder, such as Attention Deficit Hyperactivity Disorder or dyslexia, and get targeted help. But many will not. And even many conventional diagnostic labels may be misleading, failing to capture the true picture of a child’s problems, according to new work by a team at the MRC Cognition and Brain Sciences Unit at the University of Cambridge, which has come up with a radical alternative approach.
Researchers are getting closer to understanding the neurological basis of personality. For a new paper in the Journal of Personality, Nicola Toschi and Luca Passamonti took advantage of a recent technological breakthrough that makes it possible to use scans to estimate levels of myelination in different brain areas (until fairly recently this could only be done at postmortem).
Myelin is a fatty substance that insulates nerve fibres and speeds up information processing in the brain – it tends to be thicker in parts of the cortex involved in movement and perception, while it is lighter in brain regions that evolved later and that are involved in more abstract thought and decision making.
The new findings, though preliminary, suggest that people with “healthier”, more advantageous personality traits, such as greater emotional stability and conscientiousness, may benefit during development from enhanced myelination in key areas of the brain where the myelination process is particularly prolonged in humans, continuing through adolescence and into the twenties.
If you want to know about the special relationship between human and canine you need only watch a dog owner slavishly feed, cuddle and clean up after her furry companion, day after day after day. But is this unique cross-species relationship also reflected at a deeper level, in the workings of the canine brain? A recent study in Learning and Behavior suggests so, finding that highly trained dogs have a dedicated neural area for processing human faces, separate from the area involved in processing the faces of other dogs.
The researchers, led by Andie Thompkins at Auburn University, say their results are of theoretical importance (in relation to the evolutionary origin of cognitive abilities) and could have practical use too, potentially paving the way to using brain scans to validate the expertise of trained dogs.
While there’s still a debate about whether we have free will or not, most researchers at least agree that we feel as if we do. That perception is often considered to have two elements: a sense of having decided to act – called “volition”; and feeling that that decision was our own – having “agency”.
Now in a paper in PNAS, Ryan Darby at Vanderbilt University Medical Center and colleagues have used a new technique – lesion network mapping – to identify for the first time the brain networks that underlie our feelings of volition and agency. “Together, these networks may underlie our perception of free will, with implications for neuropsychiatric diseases in which these processes are impaired,” the researchers write.
Curiosity is a welcome trait in many respects and is the fuel that powers science. Yet literature is filled with fables that warn of the seductive danger of curiosity (think of how Orpheus loses his wife Eurydice forever after he succumbs to the temptation to glimpse at the underworld). In real life too, we all know the regret that can follow if we give in to curiosity – glancing at a private message that we shouldn’t have, for instance; reading a TV review when we know it contains spoilers; or trying out what happens if you put metal in a microwave (tip: don’t).
Whence does curiosity derive such power over us? One answer lies in the brain. In a pair of brain-imaging studies published as a preprint at bioRxiv – aptly titled Hunger For Knowledge: How The Irresistible Lure of Curiosity Is Generated In the Brain – Johnny King Lau and his colleagues have shown that curiosity appears to be driven by the same neurobiological process as physical hunger.
The head of a brown lion. Multiple tiny, green, spinning Catherine wheels with red edges. Colourful fragments of artillery soldiers and figures in uniform and action. Unfamiliar faces of well-groomed men… These are just a few of the hallucinations reported by a group of people with macular degeneration (MD), a common cause of vision loss in people aged over 40.
About 40 per cent of people with MD – who lose vision in the centre of their visual field but whose peripheral vision is generally unaffected – develop Charles Bonnet syndrome (CBS), reporting hallucinations that vary from simple flashes of light and shapes to faces, animals and even complex scenes.
It has been suggested that CBS might arise as a result of over-responsiveness – “hyper-excitability” – of certain visual regions of the cortex after they are deprived of normal retinal input. But whether this really is the case – and why some people with reduced vision or blindness develop hallucinations, while others don’t – has not been clear. Now new work by a team of psychologists at the University of Queensland, Australia, led by David Painter, and published in Current Biology, offers some answers.
In recent years, researchers have sought to look under the hood to understand the neural correlates of the changes brought about by psychotherapy. Not only can such understanding help us home in on the precise processes that are being acted upon in therapy, and so help us focus on these gains, it could also show where pharmacological interventions might be complementary, and where they could directly obstruct the therapeutic work. Now a systematic review and meta-analysis in Psychiatry Research: Neuroimaging has outlined all we know so far about how therapy changes the depressed brain, and it suggests key changes occur in emotional processing areas.
We all differ in how much empathic brain activity we experience in response to witnessing somebody else in pain. For instance, hospital physicians, who are regularly exposed to other people’s suffering, tend to show a dampened response – perhaps a pragmatic necessity for coping with the job, and one that might also explain the blasé gallows humour seen in the profession. If these differences are found within a job, perhaps they also occur within a lifestyle choice, such as one that involves consensually engaging in painful activities – bondage, discipline, dominance, submission, sadism and masochism, typically abbreviated to BDSM.
As they report in Neuropsychologia, Siyang Luo at Sun Yat-Sen University and Xiao Zhang at Jinan University explored this issue by first running a preliminary online study on a Chinese BDSM web forum, finding that across genders and BDSM roles, female submissives showed the clearest differences from controls in terms of their having a diminished response to other people’s pain and lower scores on aspects of an empathy questionnaire. (Female doms didn’t show a reliably different response to pain, and male BDSM practitioners barely differed from controls.)
Educational neuromyths include the idea that we learn more effectively when taught via our preferred “learning style”, such as auditory or visual or kinesthetic (hear more about this in our recent podcast); the claim that we use only 10 per cent of our brains; and the idea we can be categorised into left-brain and right-brain learners. Belief in such myths is rife among teachers around the world, according to several surveys published over the last ten years. But does this matter? Are the myths actually harmful to teaching? The researchers who conducted the surveys believe so. For instance, reporting their survey results in 2012, Sanne Dekker and her colleagues concluded that “This [belief in neuromyths] is troublesome, as these teachers in particular may implement wrong brain-based ideas in educational practice”. (Full disclosure: I’ve made similar arguments myself.)
But now this view has been challenged by a team at the University of Melbourne, led by Jared Horvath, who have pointed out that this is merely an assumption: “Put simply,” they write in their new paper in Frontiers in Psychology, “there is no evidence to suggest neuromyths have any impact whatsoever on teacher efficacy or practice”.
Horvath’s team tested the assumption that belief in neuromyths harms teaching by comparing belief in the neuromyths among 50 award-winning teachers from the UK, USA and Australia with the belief in these same myths shown by hundreds of trainee and non-award-winning teachers (as recorded in the earlier surveys). The logic: if belief in neuromyths has an adverse effect on teaching, then presumably the award-winning teachers will endorse the myths at significantly lower rates than their less celebrated counterparts.