Our Brains Represent The Meaning Of Words the Same Way Whether We Read Them Or Hear Them

By Emma Young

In an era of TED talks, podcasts, and audiobooks, it’s easy to choose to listen to factual information or fiction, rather than to read it. But is that a good thing? Are there any differences in the way the brain processes the meaning of words that are heard rather than read? According to the researchers behind a thorough new study, published in the Journal of Neuroscience, the answer to this last question is “no”. But it may still be too soon to conclude that listening to an audiobook is effectively the same as reading it.

Fatma Deniz at the University of California, Berkeley, and colleagues recruited six men and three women, all aged in their twenties and thirties. Their brains were scanned using fMRI while they listened to stories from a popular podcast, The Moth Radio Hour, and, separately, while they read those same stories.

The researchers then looked at detailed maps of activity in the parts of the brain’s cortex that processed semantic information (that is, meaning) as participants read or listened to each word. They found that it didn’t matter which way the words were presented: both reading and listening produced virtually identical patterns. In addition, the locations of the discrete cortical regions that processed the meaning of different categories of word (for example, “animals” or “emotional” words) were similar from person to person.

Based on previous findings, the team had expected some differences in how the participants handled the meaning of words that were heard rather than read, so this was a surprise. “We knew that a few brain regions were activated similarly when you hear a word and read the same word, but I was not expecting such strong similarities in the meaning representation across a large network of brain regions in both these sensory modalities,” Deniz commented.

In terms of understanding how the brain makes sense of either speech sounds or letter squiggles, it’s an important finding. But the “natural reading stimuli” weren’t entirely naturalistic. So that the team would know precisely when a participant was reading or listening to a given word, they used what’s known as “rapid serial visual presentation”: each word was presented on a screen, one at a time, at the same rate and for the same duration as it occurred in the spoken version. This isn’t a problem for the key results. But it does mean we can’t yet conclude that reading a print novel and listening to an audiobook version are processed in the same way.

A 2016 study did find that participants who had listened to sections of a non-fiction book showed the same levels of comprehension as those who’d read it on an e-reader. But perhaps the readers would have done better if they’d been given print. Anne Mangen at the University of Stavanger, Norway, has led various studies comparing comprehension and memory for texts read in booklet or book form with texts read on a screen. In one study, 16-year-old participants who read texts in print scored significantly better on comprehension tests than those who read the same texts in digital form. In another, adult participants were almost twice as good at ordering key plot developments in a mystery story if they’d read it in print rather than on a Kindle. It’s thought that this is because a physical book provides more cues about where in the text (how far through the book, and where on a page spread) you encountered a given fact or event.

I’d suspect that whether people find listening to a book better or worse than reading it will also depend on various individual factors (not least: do you actually like reading?). Personally, perhaps because my education was print-based, I know I remember words and information far better if I’ve read them than if I’ve heard them. And reading a text gives me instant control over it, allowing me to skip or pause with ease. But for some people (perhaps people with dyslexia, for example) audio may lead to richer meaning-maps in the brain than text. Deniz would now like to see studies investigating this.

– The representation of semantic information across human cerebral cortex during listening versus reading is invariant to stimulus modality

Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest