Photo: The serif font Jubilat was used on signs for Bernie Sanders’ 2016 presidential bid — though a new study suggests that sans serifs are generally seen as more liberal. Credit: Brett Carlsen/Getty Images.
Fonts can be highly distinctive. Even stripped of their original context, many are easy to identify: the one on the front of a Harry Potter book, say, or adorning a Star Wars poster, or on the side of a Coca-Cola can, to name a few examples.
But particular fonts can also leave us with other impressions: the font used to brand a beloved book, for example, carries different emotional connotations from the one you use to type emails. And according to new research in Communication Studies from Katherine Haenschen and Daniel Tamul at Virginia Tech, particular fonts may carry political connotations, too.
From our earliest moments, our awareness of being physically close to someone else is tied up with perceptions of actual warmth. It’s been suggested that this relationship becomes deeply ingrained, with temperature in turn affecting our social perceptions on into adulthood. However, some of the most-publicised results in this field have failed to replicate, leading critics to query whether the relationship really exists.
Now a new paper, published in Social Psychology, provides an apparently compelling explanation for at least some inconsistencies in the results, and supports the idea that our temperature does indeed affect our social judgements.
It’s been known for centuries that we experience all kinds of optical illusions, and in the past few decades, researchers have shown that some animals, including monkeys, pigeons, and dogs, do too. Now the first ever study of this kind in reptiles has found that even the bearded dragon falls for an optical illusion that we humans succumb to.
Perceptual illusions — subjective interpretations of physical information — are interesting to psychologists because they reveal important insights into how we construct our representations of the world. This new work, published in the Journal of Comparative Psychology, provides evidence that at least one reptile can be counted among the animals that don’t simply passively process retinal signals, but actively interpret visual data, too.
In 2017, in my first ever post for the Digest, I wrote about a paper that challenged the popular idea that “now” — also known as the “subjective present” — is three seconds long. It’s just not possible to define the present so strictly, this review concluded.
Instead of trying to explore what constitutes “right now”, another way to get at our conceptions of time is to ask: when does the present end and the future begin? And precisely this question has now been explored in a series of studies by Hal Hershfield at UCLA and Sam Maglio at the University of Toronto. In their paper, published in the Journal of Experimental Psychology, the pair report that these perceptions can vary substantially between people — and can affect the kinds of choices that we make, with potentially significant implications for our future lives.
This Sunday marks the official end of British Summer Time, and once the clocks have gone back, it will of course begin to get dark even earlier in the afternoon. Now new research suggests that if you find yourself feeling uncomfortably cold as you head home from work through dimmer light, the light change itself could have something to do with it. The study, led by Giorgia Chinazzo at the Swiss Federal Institute of Technology in Lausanne and published in Scientific Reports, shows for the first time that levels of daylight affect our perceptions of temperature.
All human cultures feature music. But most studies of music perception have been conducted on Western university students. This can make it hard to know whether the findings are biologically driven, and common to all people, or the result of cultural influences.
To disentangle these two possibilities, you need a society that hasn’t really been exposed to Western music, for comparison. Such societies are not easy to find. But in 2016, a team led by Josh McDermott at MIT reported that the Tsimane’, a group of people living in the remote Bolivian rainforest, showed some unexpected differences in their musical perceptions compared with Western listeners. For example, while a chord made up of an A and an F sharp sounded horribly grating to Western ears, for the Tsimane’ it was just as pleasant as a C with a G, which Westerners also enjoyed. Culture had to explain these differences.
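As a point of reference (not taken from the study itself), the notes named above can be pinned down numerically. A minimal sketch, assuming twelve-tone equal temperament with the standard A4 = 440 Hz reference and all notes taken from the same octave:

```python
# Equal-temperament frequencies for the chords mentioned above.
# Assumptions (not from the article): A4 = 440 Hz, 12-tone equal
# temperament, notes drawn from the fourth octave.

A4_HZ = 440.0

def note_freq(midi_number):
    """Frequency in Hz of a MIDI note number in 12-TET (A4 = MIDI 69)."""
    return A4_HZ * 2 ** ((midi_number - 69) / 12)

# MIDI numbers: C4 = 60, F#4 = 66, G4 = 67, A4 = 69
chords = {
    "A + F#": (note_freq(69), note_freq(66)),
    "C + G": (note_freq(60), note_freq(67)),
}
for name, (f1, f2) in chords.items():
    ratio = max(f1, f2) / min(f1, f2)
    print(f"{name}: {f1:.2f} Hz and {f2:.2f} Hz (ratio {ratio:.3f})")
```

The C–G ratio comes out at about 1.498, very close to the simple 3:2 of a perfect fifth; the exact pairing and octave placement of the stimuli in the actual experiment are not specified in the text above.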
Now a new study, led by Nori Jacoby at the Max Planck Institute for Empirical Aesthetics, Germany, has found that the Tsimane’ don’t perceive pitch in the same way as Americans, either. This work adds to other research finding cultural variations in perceptions that had once been assumed to be universal, such as colour perception.
You see a pedestrian about to step out in front of an oncoming car. Is it better to calmly call out a warning, or to scream?
Of course, it’s better to scream — but not just because a scream is loud. Car alarms, police sirens and smoke alarms are all loud, too. But, like screams, they also feature fast but perceptible fluctuations in loudness, usually at frequencies of between 40 and 80 Hz, making them acoustically “rough”. Quite why such sounds should be so attention-grabbing, and even unbearable, hasn’t been clear. Now a team led by Luc Arnal at the University of Geneva has found that this type of sound triggers activity in brain areas related not just to hearing but also to aversion and to pain, making such sounds impossible to ignore.
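To make the idea of acoustic “roughness” concrete, here is a minimal sketch — not the researchers’ actual stimuli — that amplitude-modulates a pure tone at a rate inside the 40–80 Hz band mentioned above. The function name, carrier frequency and sample rate are all illustrative assumptions:

```python
import math

SAMPLE_RATE = 44100  # samples per second (CD-quality audio)

def rough_tone(carrier_hz=440.0, mod_hz=60.0, seconds=1.0):
    """Return samples of a tone whose loudness fluctuates mod_hz times
    per second. At 40-80 Hz the fluctuation is too fast to hear as
    separate pulses but still perceptible -- the acoustically "rough"
    quality described above."""
    n = int(SAMPLE_RATE * seconds)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        # Loudness envelope sweeps between 0 and 1 at mod_hz
        envelope = 0.5 * (1.0 + math.sin(2 * math.pi * mod_hz * t))
        samples.append(envelope * math.sin(2 * math.pi * carrier_hz * t))
    return samples

# 60 Hz modulation sits in the middle of the "rough" band; a rate of
# only a few Hz would instead sound like a gentle tremolo.
siren_like = rough_tone(mod_hz=60.0, seconds=0.5)
```

Writing the samples to a WAV file and comparing a 60 Hz modulation rate against, say, 4 Hz gives a rough-and-ready (hypothetical) demonstration of the difference between an alarming sound and a merely wobbly one.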
Picture yourself sitting in a Zen garden, surrounded by low, rounded bushes and gravel raked into rippling swirls. Now imagine standing in front of a brutalist building, all straight lines and sharp edges. If you think you’d feel more relaxed in the Zen garden, there could be a low-level perceptual reason — one that could explain everything from why you’re far more likely to find a jagged script on the cover of a death metal album than on a romance novel, to why clouds and lullabies seem to go together.
The new study, published in Proceedings of the Royal Society B, suggests that we automatically associate variations in one particular property of images or sounds with variations in levels of emotional arousal. This gives us an instinctive understanding, just from the tone of someone’s voice or watching their movements, of whether they are angry or sad, excited or calm. But it seems that these associations between perception and emotion are so automatic and fundamental that we apply them to inanimate objects, as well.
A newborn baby knows almost nothing about the world she or he comes into. To make sense of the onslaught of incoming sensory information, she or he must start to notice meaningful patterns and categorise them: that particular combination of visual data signifies a “face”, for example, while that noise is a “voice”. As the authors of a new paper in Developmental Science point out, “without this fundamental categorisation function, our nervous systems would be overwhelmed by the sheer diversity of our experience.”
It had been thought that infants form these categories using information from just one sense, whichever is the most relevant. Following this account, the category of “faces” results from an accumulation of visual information about what faces look like. However, an intriguing new study, involving four-month-old infants and their mothers’ smelly t-shirts, suggests that babies’ early acquisition of the faces category is a truly multi-sensory process.