This Sunday marks the official end of British Summer Time, and once the clocks have gone back, it will of course begin to get dark even earlier in the afternoon. Now new research suggests that if you find yourself feeling uncomfortably cold as you head home from work through dimmer light, the light change itself could have something to do with it. The study, led by Giorgia Chinazzo at the Swiss Federal Institute of Technology in Lausanne and published in Scientific Reports, shows for the first time that levels of daylight affect our perceptions of temperature.
All human cultures feature music. But most studies of music perception have been conducted on Western university students. This can make it hard to know whether the findings are biologically driven, and common to all people, or the result of cultural influences.
To disentangle these two possibilities, you need, for comparison, a society that hasn’t really been exposed to Western music. Such societies are not easy to find. But in 2016, a team led by Josh McDermott at MIT reported that the Tsimane’, a group of people living in the remote Bolivian rainforest, showed some unexpected differences in their musical perceptions compared with Western listeners. For example, while a chord made up of an A and an F sharp sounded horribly grating to Western ears, for the Tsimane’ it was just as pleasant as a C with a G, which Westerners also enjoyed. Culture had to explain these differences.
Now a new study, led by Nori Jacoby at the Max Planck Institute for Empirical Aesthetics, Germany, has found that the Tsimane’ don’t perceive pitch in the same way as Americans, either. This work adds to other research finding cultural variations in perceptions that had once been assumed to be universal, such as colour perception.
You see a pedestrian about to step out in front of an oncoming car. Is it better to calmly call out a warning, or to scream?
Of course, it’s better to scream — but not just because a scream is loud. Car alarms, police sirens and smoke alarms are all loud, too. But, like screams, they also feature fast but perceptible fluctuations in loudness, usually at frequencies between 40 and 80 Hz, making them acoustically “rough”. Quite why such sounds should be so attention-grabbing, and even unbearable, hasn’t been clear. Now a team led by Luc Arnal at the University of Geneva has found that these rough sounds trigger activity in brain areas related not just to hearing but also to aversion and to pain, which makes them impossible to ignore.
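In signal-processing terms, the acoustic “roughness” described above is amplitude modulation at a rate the ear can track. A minimal sketch of what such a sound looks like, assuming an illustrative 440 Hz tone whose loudness fluctuates 60 times per second (these specific values are our own choices, not figures from the study):

```python
import math

SAMPLE_RATE = 44_100   # CD-quality sampling rate
CARRIER_HZ = 440       # pitch of the tone (illustrative choice)
MODULATION_HZ = 60     # loudness fluctuation rate, inside the 40-80 Hz "rough" band

def rough_sample(i):
    """One audio sample of a tone whose loudness rises and falls 60 times a second."""
    t = i / SAMPLE_RATE
    envelope = 0.5 * (1 + math.sin(2 * math.pi * MODULATION_HZ * t))  # varies 0..1
    return envelope * math.sin(2 * math.pi * CARRIER_HZ * t)

# One second of audio: a steady tone made "rough" by its fluctuating envelope.
rough_tone = [rough_sample(i) for i in range(SAMPLE_RATE)]
```

Played back, the modulated tone buzzes unpleasantly where the plain 440 Hz carrier sounds smooth; slow the modulation to a few hertz and it is instead heard as separate pulses, a tremolo rather than roughness.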
Picture yourself sitting in a Zen garden, surrounded by low, rounded bushes and gravel raked into rippling swirls. Now imagine standing in front of a brutalist building, all straight lines and sharp edges. If you think you’d feel more relaxed in the Zen garden, there could be a low-level perceptual reason — one that could explain everything from why you’re far more likely to find a jagged script on the cover of a death metal album than on a romance novel, to why clouds and lullabies seem to go together.
The new study, published in Proceedings of the Royal Society B, suggests that we automatically associate variations in one particular property of images or sounds with variations in levels of emotional arousal. This gives us an instinctive understanding, just from the tone of someone’s voice or watching their movements, of whether they are angry or sad, excited or calm. But it seems that these associations between perception and emotion are so automatic and fundamental that we apply them to inanimate objects, as well.
A newborn baby knows almost nothing about the world she or he comes into. To make sense of the onslaught of incoming sensory information, the baby must start to notice meaningful patterns and categorise them: that particular combination of visual data signifies a “face”, for example, while that noise is a “voice”. As the authors of a new paper in Developmental Science point out, “without this fundamental categorisation function, our nervous systems would be overwhelmed by the sheer diversity of our experience.”
It had been thought that infants form these categories using information from just one sense, whichever is the most relevant. On this account, the category of “faces” results from an accumulation of visual information about what faces look like. However, an intriguing new study, involving four-month-old infants and their mothers’ smelly t-shirts, suggests that babies’ early acquisition of the faces category is a truly multi-sensory process.
Philosophers and mathematicians have long held that maths can be aesthetically pleasing. “Mathematics, rightly viewed, possesses not only truth, but supreme beauty,” wrote Bertrand Russell, while Carl Friedrich Gauss proclaimed that “The enchanting charms of this sublime science reveal themselves in all their beauty only to those who have the courage to go deeply into it”.
But a study published recently in Cognition suggests that even those whose lives don’t revolve around logic and numbers have an appreciation for mathematical “beauty”. People tend to see similarities between mathematical proofs and certain paintings or pieces of music, the study finds, suggesting we all share an intuition for the aesthetics of mathematics.
You spend about 10 per cent of your waking hours with your eyes shut, simply because of blinking. Every few seconds, each time you blink, your retinas are deprived of visual input for anywhere from tens to hundreds of milliseconds. You don’t usually notice this because your brain suppresses the dark spells and stitches the bursts of visual information together seamlessly. But these dips in visual processing in the brain do have an impact: a new study in Psychological Science finds that, in an important way, they cause your sense of the passing of time to stop temporarily.
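The 10 per cent figure is easy to sanity-check. A back-of-envelope sketch, assuming one blink every four seconds lasting 400 milliseconds (illustrative values consistent with the ranges above, not figures from the study):

```python
BLINK_INTERVAL_S = 4.0   # one blink every few seconds (assumed)
BLINK_DURATION_S = 0.4   # hundreds of milliseconds per blink (assumed)

# Fraction of waking time the eyes spend shut, under these assumptions.
fraction_eyes_shut = BLINK_DURATION_S / BLINK_INTERVAL_S
print(f"{fraction_eyes_shut:.0%} of waking time spent with eyes shut")
# prints "10% of waking time spent with eyes shut"
```

Shorter or less frequent blinks give a smaller figure, but staying anywhere within the stated ranges keeps the total in the region of a tenth of waking life.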
We often think of sleep as a chance to switch off from the outside world, leaving us blissfully ignorant of anything going on around us. But neuroscience research has shown this is a fantasy – we still monitor the environment and respond to particular sounds while we’re sleeping (at least in some stages of sleep) – a fact that will be unsurprising to anyone who has woken up after hearing someone say their name.
Now a study published in Nature Human Behaviour has revealed more about the brain’s surprisingly sophisticated levels of engagement with the outside world during sleep. Not only does the sleeping brain respond to certain words or sounds – it can even select between competing signals, prioritising the one that is more informative.
For most of us, it is difficult to imagine what it must be like to be a synesthete – that is, someone who experiences a crossing over of their senses, such as seeing sounds as colours, or perceiving shapes as having tastes. However, according to a new study in Consciousness and Cognition, it is actually relatively easy for people with normal perception to have a synesthetic experience (of the sound-to-vision variety). It takes just a few minutes of visual deprivation, followed by a visual imagery task. The findings are not merely intriguing – and a fun idea for a psychology class experiment – they also have a bearing on the main theories of how synesthesia occurs.