We often think of sleep as a chance to switch off from the outside world, leaving us blissfully ignorant of anything going on around us. But neuroscience research has shown this is a fantasy – we still monitor the environment and respond to particular sounds while we’re sleeping (at least in some stages of sleep) – a fact that will be unsurprising to anyone who has woken up after hearing someone say their name.
Now a study published in Nature Human Behaviour has revealed more about the brain’s surprisingly sophisticated levels of engagement with the outside world during sleep. Not only does the sleeping brain respond to certain words or sounds – it can even select between competing signals, prioritising the one that is more informative.
For most of us, it is difficult to imagine what it must be like to be a synesthete – that is, someone who experiences a crossing over of their senses, such as seeing sounds as colours, or perceiving shapes as having tastes. However, according to a new study in Consciousness and Cognition, it is actually relatively easy for people with normal perception to have a synesthetic experience (of the sound-to-vision variety). It merely takes a few minutes of visual deprivation, followed by a visual imagery task. The findings are not merely intriguing – and a fun idea for a psychology class experiment – they also have a bearing on the main theories for how synesthesia occurs.
Think about the concepts of “red” and “justice” and you’ll notice a key difference. If you’re sighted, you’ll associate “red” most strongly with the sensory experience, which relates to signals from cone cells in your eyes. “Justice”, in contrast, doesn’t have any associated sensory qualities – as an abstract concept, you’ll think about its meaning, which you learnt via language, understanding it to be related to other abstract concepts like “fairness” or “accountability”, perhaps. But what about blind people – how do they think about “red”?
A brain-imaging study of 12 people who had been blind from birth, and 14 sighted people, published recently in Nature Communications, shows that while for sighted people, sensory and abstract concepts like “red” and “justice” are represented in different brain regions, for blind people, they’re represented in the same “abstract concept” region.
We’re taught from an early age that it is polite and assertive to look people in the eyes when we’re talking to them. Psychology research backs this up – people who make plenty of eye contact – as long as it’s not excessive – are usually perceived as more competent, trustworthy and intelligent. If you want to make a good impression, then, it’s probably a good idea to meet the gaze of the person you’re talking to. However, following this advice is not necessarily straightforward for everyone. It’s well-documented that mutual gaze can be emotionally intense and distracting, even uncomfortably so for some.
If this is your experience, you may welcome a study published recently in the journal Perception that documents a phenomenon known as the “eye contact illusion” – put simply, we are not that good at telling whether an interlocutor is looking us in the eye or not. In fact, we tend to think they are, even when they’re not (a bias that is magnified after we’ve been rejected). Thanks to this illusion, you can give the impression of making eye contact simply by ensuring you are looking in the general direction of your conversant’s face.
How accurately we are able to judge the size of our own bodies and specific body parts is an important topic in clinical psychology, because a distorted body image is thought to play a part in eating disorders, body dysmorphia and other related conditions. However, research has until now been limited to one- or two-dimensional judgments, with volunteers asked to estimate the length of various body parts, for instance, or to judge which of several two-dimensional visual depictions of their body is most accurate. In reality, of course, we don’t just have a sense of how our body looks in two dimensions from the outside but also of how it feels from the inside, including how much space it occupies.
A new study published in Cortex is the first to examine how accurately people of healthy weight can estimate the volume of their entire body and specific body parts. Renata Sadibolova at Goldsmiths, University of London, and her colleagues write that “these findings … highlight the importance of studying the perceptual distortions ‘at the baseline’, i.e., in healthy population, given their potential to further elucidate the nature of perceptual distortions in clinical conditions.”
The head of a brown lion. Multiple tiny, green, spinning Catherine wheels with red edges. Colourful fragments of artillery soldiers and figures in uniform and action. Unfamiliar faces of well-groomed men… These are just a few of the hallucinations reported by a group of people with macular degeneration (MD), a common cause of vision loss in people aged over 40.
About 40 per cent of people with MD – who lose vision in the centre of their visual field but whose peripheral vision is generally unaffected – develop Charles Bonnet syndrome (CBS), reporting hallucinations that vary from simple flashes of light and shapes to faces, animals and even complex scenes.
It has been suggested that CBS might arise as a result of over-responsiveness – “hyper-excitability” – of certain visual regions of the cortex, after they are deprived of normal retinal input. But whether this really is the case – and why some people with reduced vision or blindness develop these hallucinations, while others don’t – has not been clear. Now new work by a team of psychologists at the University of Queensland, Australia, led by David Painter, and published in Current Biology, offers some answers.
This is Episode 14 of PsychCrunch, the podcast from the British Psychological Society’s Research Digest, sponsored by Routledge Psychology. Download here.
Can psychology help your cooking taste better? Our presenter Ginny Smith hears about the importance of food presentation, pairing and sequencing, and how our perception of food is a multi-sensory experience. She and her friends conduct a taste test using “sonic seasonings” that you can also try at home.
The idea that the language that you speak influences how you think about and experience the world (the so-called Sapir-Whorf hypothesis) has a long and storied history. A lot of research into the issue has focused on colour perception, and evidence has accumulated that people whose native languages have different colour categories don’t see the world in quite the same way.
Now in a new paper, published in Psychological Science, Martin Maier and Rasha Abdel Rahman at the Humboldt University of Berlin report that by affecting visual processing at an early stage, such linguistic differences can even determine whether or not someone will consciously see a coloured shape. “Our native language is thus one of the forces that determine what we consciously perceive,” they write.
It’s well-known that we can miss apparently obvious objects in our visual field if other events are hogging our limited attention. The same has been shown for sounds: in a nod to Daniel Simons and Christopher Chabris’s famous gorilla/basketball study that demonstrated “inattentional blindness”, distracted participants in the first “inattentional deafness” study failed to hear a man walking through an auditory scene for 19 seconds repeatedly saying “I am a gorilla”. Now, two new studies separately show that a very similar effect occurs in relation to touch (inattentional numbness) and to smell (inattentional anosmia).
Now a team led by Sarah Ketay at the University of Hartford has shown how this absorption of friends into our self-concept can manifest at a visual level, affecting our ability to distinguish their faces from our own. Writing in the Journal of Social and Personal Relationships, Ketay’s team said, “The present research supports the idea that close others are processed preferentially and may overlap with the self.”