Category: Perception

Psychologists are figuring out why some of us find echolocation easier than others

By Christian Jarrett

Daniel Kish’s life reads like the origin story from a superhero comic book. To treat his cancer, doctors removed both of Kish’s eyes when he was a year old. Later, as a child, he taught himself to echolocate like a bat: using the echoes from his own clicking sounds, he detects the world around him. He can even cycle along busy streets, and it’s his life’s mission to empower other blind children by teaching them echolocation, or what he calls FlashSonar.

Psychologists studying the skill have found that it is eminently teachable, for blind and sighted people alike. But what’s also become clear is that there is a huge amount of variation between individuals: some people, blind or sighted, seem to pick it up easily while others struggle. A new study in Experimental Brain Research is among the first to try to find out which mental abilities, if any, correlate with echolocation aptitude. The findings could help with screening to identify who is likely to benefit from echolocation training, and offer clues about how to help those who struggle.

Continue reading “Psychologists are figuring out why some of us find echolocation easier than others”

Bad news for passport control: face-matching is harder than we realised

By Alex Fradera

Experiments suggest that telling whether two unfamiliar faces are the same or different is no easy task. Such research has sometimes presented participants with full-body shots, and more commonly with cropped shots of people’s heads, but it has almost never placed the faces in a formal context, such as on a photographic ID card. Yet these are the situations in which face-to-photo matching is most relevant: when a shop assistant squints at a driving licence before selling alcohol to a twitchy youth, or an emigration official scrutinises passports before their holders pass ports. Moreover, it’s plausible that the task is harder when juggling extra information, something already found in the realm of fingerprint matching, where biographical information can lead to more erroneous matches because it triggers observer prejudices. A new article in Applied Cognitive Psychology confirms these fears, suggesting that our real-world capacity to spot fakes in their natural setting is even worse than imagined.

Continue reading “Bad news for passport control: face-matching is harder than we realised”

Autistic people’s social difficulties linked to unusual processing of touch

By guest blogger Helge Hasselmann

Besides problems with social interaction, it has long been known that many people with autism experience sensory difficulties, such as hypersensitivity to sounds, light or touch. With sensory impairment now officially included in diagnostic manuals, researchers have been trying to establish whether there is a link between the sensory and social symptoms. Such a link would make intuitive sense: for instance, it is easy to imagine that someone who experiences sensory stimuli more strongly would shun social interactions because of their complexity. More specifically, you would expect them to struggle to filter out and make sense of social cues against a backdrop of sensory overload.

Past research has suggested that tactile hyper-responsiveness in particular may be relevant. The correct processing of tactile information plays an important role in differentiating yourself from others (so-called “self-other discrimination”), a crucial requirement for social cognition. In fact, touch may be unique among the senses because there is a clear difference in the tactile feedback received when you touch something compared to when you see someone else touch something. Now a study in Social Cognitive and Affective Neuroscience has used recordings of participants’ brain waves to provide more evidence that tactile sensations are processed differently in people with autism and that this may contribute to their social difficulties.

Continue reading “Autistic people’s social difficulties linked to unusual processing of touch”

It’s surprisingly difficult to introspect about your own eye movements

By Christian Jarrett

Eye movements have a profound influence on our conscious experience. Our vision is high-acuity only at the centre, so we see in detail only those things we shift our eyes to focus on. Also, each move of the eyes – known as a saccade – has massive consequences for visual processing in the brain: incoming information is suppressed during the eye movement (to prevent the experience of blurring), and, on settling gaze on a new location, millions of neurons in our visual cortex must update to reflect the new slice of the visual world they are now responsible for processing. Given all this, you’d think we’d have a good idea of where we’ve been pointing our eyes. In fact, as shown across three experiments published in The Quarterly Journal of Experimental Psychology, our insight into our own eye movements is virtually non-existent. Continue reading “It’s surprisingly difficult to introspect about your own eye movements”

Investigating the weird effects treadmills have on our perception

By Christian Jarrett

Anyone who’s been on a treadmill at the gym has probably had that strange perceptual experience afterwards – once you start to walk on stable ground again, it feels for a time as though you’re moving forward more quickly than you really are. The illusion, which is especially striking for treadmill newbies, was first documented scientifically in a Nature paper 20 years ago. Since then psychologists have come to better understand what’s going on and the ways the effects can manifest. Continue reading “Investigating the weird effects treadmills have on our perception”

A surprising number of people are born with a problem recognising familiar voices

By Christian Jarrett

You may have heard of face-blindness (known formally as prosopagnosia), which is when someone has particular difficulty recognising familiar faces. The condition was first noticed in brain-damaged soldiers, and for a long time psychologists thought it was extremely rare and primarily caused by brain damage. But in recent years they’ve discovered that it’s actually a relatively common condition, one that approximately two per cent of otherwise healthy people are born with. Now research on the related condition of phonagnosia – an impairment in recognising familiar voices – is catching up. A new survey reported in Brain and Language, the largest of its kind published to date, estimates that just over three per cent of the population are born with phonagnosia, many of them probably without even realising it. Continue reading “A surprising number of people are born with a problem recognising familiar voices”

After rejection, your brain performs this subtle trick to help you make friends

Immediately after we’ve been shunned, a new study shows, our brains engage a subtle mechanism that alters our sense of whether other people are making eye contact with us, so that we judge it more likely that they are looking our way. As friendly encounters often begin with a moment of joint eye contact, the researchers, writing in The Quarterly Journal of Experimental Psychology, think this “widening of the cone of gaze”, as they call it, could help the ostracised spot opportunities for forging new relationships. Continue reading “After rejection, your brain performs this subtle trick to help you make friends”

Psychologists still don’t know how the brain deals with blinks

If you were sitting in a dark room and the lights flickered off every few seconds, you’d definitely notice. Yet when your blinks make the world go momentarily dark – and bear in mind most of us blink around 12 to 15 times every minute – you are mostly oblivious. It certainly doesn’t feel like someone is flicking the lights on and off. How can this be?

A new study in Journal of Experimental Psychology: Human Perception and Performance has tested two possibilities – one is that after each blink your brain “backdates” the visual world by the duration of the blink (just as it does for saccadic eye movements, giving rise to the stopped clock illusion); the other is that it “fills in” the blanks created by blinks using a kind of perceptual memory of the visual scene. Neither explanation was supported by the findings, which means that the illusion of visual continuity that we experience through our blinks remains a mystery.

One experiment involved students making several judgments about how long a letter ‘A’ was presented on a computer screen (the actual durations were between 200ms and 1600ms; 1000ms equals 1 second). Sometimes the ‘A’ appeared at the beginning or end of a voluntary eye blink; other times it appeared during a period when the participant did not blink. If we backdate visual events that occur during blinks, then the ‘A’s that appeared at the beginning or end of a blink should have been backdated to the onset of the blink, giving the illusion that they’d been presented longer than they actually had, compared with ‘A’s that appeared when there was no blink. In fact, the researchers found no evidence that the students overestimated the duration of ‘A’s that appeared during blinks.

Figure one from Irwin and Robinson 2016

Another experiment involved students making a voluntary blink while a letter ‘A’ was already onscreen and judging how long the ‘A’ was visible, as well as judging the duration of other ‘A’s that were onscreen during non-blink periods. If backdating or perceptual “filling in” occurs during blinks, then the students should have judged the time onscreen of an ‘A’ of a given duration to be the same whether or not they blinked during its appearance. But this isn’t what the researchers found – rather, the students consistently underestimated the duration of ‘A’s if they blinked during their appearance.

We do know from past research that the brain shuts down visual processing to some extent during blinks – a study from the 80s shone a light up through people’s mouths and found their ability to detect changes in its brightness was reduced during blinks, even though the blinks obviously didn’t impede the light source. But what remains unclear, even after the new research, is how the brain weaves the loss of visual input during blinks into a seamless perceptual experience.

Summing up, the University of Illinois researchers David Irwin and Maria Robinson said the brain seems to ignore the perceptual consequences of blinks, but they’re not sure how this is done. “Having ruled out the temporal antedating and perceptual maintenance hypotheses,” they said, “the question still remains: Why does the visual world appear continuous across eye blinks?”

_________________________________

Irwin, D., & Robinson, M. (2016). Perceiving a continuous visual world across voluntary eye blinks. Journal of Experimental Psychology: Human Perception and Performance. DOI: 10.1037/xhp0000267


Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Our free weekly email will keep you up-to-date with all the psychology research we digest: Sign up!

When we draw a face, why do most of us put the eyes in the wrong place?

Go ahead, sketch a face on your note paper. Use a photo of someone as a guide if you want. Unless you’re a trained artist, the chances are that you’ve made an elementary error, placing the eyes too far up the head, when in fact they should be halfway down. Research suggests about 95 per cent of us non-artists tend to make this mistake, and in a new study in Psychology of Aesthetics, Creativity, and the Arts, psychologists in America have attempted to find out why. The answer, it turns out, is rather complicated and concerns both our lack of knowledge and basic biases in the way we pay attention to faces and to space in general.

Justin Ostrofsky and his colleagues asked 75 psychology undergrads to draw two faces shown on a computer screen – both were identical except one had hair and one was bald. Crucially, half the participants were told that the eyes on a human face typically appear halfway down the head, whereas the other participants weren’t given this information.

Overall the participants made the usual error that people make when drawing faces and placed the eyes too far up the head, even though they had the model faces to guide them. But this error wasn’t as extreme in the participants who were given the specific guidance about eye position. This tells us that at least part of the reason that non-artists place the eyes too high is because we don’t know (or we’ve never noticed) their precise schematic location in a face.

However, the fact that the participants given this information still placed the eyes too high suggests that there is more to this than a lack of schematic knowledge. Another factor seems to be that when looking at faces, we tend to ignore the forehead region (this has been shown by prior research that’s tracked people’s gaze while they look at faces). Instead, we pay more attention to the parts of the face that contain features. The relevance of this to drawing was shown by the fact the participants made a smaller error with eye position when drawing the face that had hair than the face that was bald. The researchers explained: “When drawing the bald model, the absence of the hair line creates a larger forehead region to ignore and attenuate, resulting in the eyes drawn even further up the head in the bald model.”

Yet another relevant factor seems to be our natural bias towards ignoring the upper end of vertical space. This is easy to demonstrate by asking people to mark the mid-point of a vertical line – most of us place the mid-point too high, which in neuropsychological jargon is a sign of “altitudinal neglect”, meaning that we neglect to attend to higher space.

In the current study, the researchers asked their participants to perform a vertical line bisection and they found that the greater their altitudinal neglect (marking the line midpoint higher), the higher they tended to place the eyes on the faces they drew. But intriguingly this association was only true for the participants who were given the factual information about the eyes being midway down a human face. It seems being given this schematic knowledge improves our drawing, but only to a point – ultimately we’re still led astray by a basic attentional bias (presumably artists learn to overcome this bias).

It’s amazing that a simple drawing task can reveal so many quirks of the human mind, but it’s not the first time. For instance, last year researchers exposed the foibles of human memory by demonstrating that most people are poor at drawing the Apple logo, even though many of us are exposed to it every day.

_________________________________

Ostrofsky, J., Kozbelt, A., Tumminia, M., & Cipriano, M. (2016). Why do non-artists draw the eyes too far up the head? How vertical eye-drawing errors relate to schematic knowledge, pseudoneglect, and context-based perceptual biases. Psychology of Aesthetics, Creativity, and the Arts. DOI: 10.1037/a0040368

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


The bigger you get, the harder it is to tell whether you’ve gained or lost weight

When your waistband feels tighter than usual, or the scales say you’ve put on a few pounds, it’s easy to blame the news on clothes shrinkage or an uneven carpet, especially if your body looks just the same in the mirror. And that lack of visual evidence for weight gain (or loss) is a particular problem for more obese people, according to a new paper in the British Journal of Health Psychology.

The researchers asked female participants to estimate the weight of 120 differently sized women (their weights ranged from 28.2 to 104.9 kg; roughly 4.5 to 16.5 stones). The heavier the women in the photos were, the more the participants tended to underestimate their weight – on average, “an observer who judges the weight of a 100 kg woman will underestimate her weight by ~10 kg” the researchers said.

In a second study, participants had to judge whether pairs of real or CGI women had the same or a different BMI (body mass index). When the women in the pictures had a higher BMI, the difference in their respective BMIs had to be greater for participants to notice a difference (this is actually an example of a basic perceptual phenomenon known as Weber’s law).
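The Weber’s law footnote can be made concrete with a little arithmetic: the smallest detectable change in a stimulus (the “just-noticeable difference”) is roughly a constant fraction of the stimulus’s magnitude. Here is a minimal sketch in Python, where the Weber fraction of 0.1 is an arbitrary illustrative value, not a figure estimated in the study:

```python
# Weber's law: the just-noticeable difference (JND) in a stimulus grows
# in proportion to the baseline magnitude of that stimulus.
# The Weber fraction below is a made-up value for illustration only.

def just_noticeable_difference(baseline, weber_fraction=0.1):
    """Smallest change in a stimulus an observer can reliably detect."""
    return weber_fraction * baseline

# A heavier baseline BMI needs a bigger BMI difference before it is noticed.
for bmi in (20, 30, 40):
    jnd = just_noticeable_difference(bmi)
    print(f"baseline BMI {bmi}: detectable difference ~ {jnd:.1f} BMI points")
```

On this account, the same Weber fraction that lets an observer notice a 2-point BMI difference at a baseline of 20 would demand a 4-point difference at a baseline of 40, which is the pattern the participants showed.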

“Our results clearly point to the potential for perceptual factors contributing to problems with detecting obesity and weight increase,” concluded the researchers led by Katri Cornelissen at Northumbria University. As people get heavier they will find it more difficult to detect extra weight gain, and conversely they will also struggle to detect when they have lost weight, which may undermine dieting efforts.

Visual biases in judging body weight

_________________________________
Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.
