Video Games And Computer-Like Brains: The Week’s Best Psychology Links


Our weekly round-up of the best psychology coverage from elsewhere on the web

The idea that the brain operates like a computer is the latest in a long line of metaphors that scientists have used to try to understand how the organ works. But could that comparison actually be hindering our understanding? In a longread in The Guardian, Matthew Cobb explores the limits of the brain-as-computer metaphor.


What does it mean to “recover” from a mental illness? Although common scales used to measure recovery can determine whether someone’s symptoms have improved, research suggests that they’re not that great at gauging more general improvement in people’s everyday lives, writes Benedict Carey at The New York Times.


More on the problem of AI emotion recognition this week, with a feature by Douglas Heaven at Nature. Psychologists argue over whether particular emotional expressions are universal, and debate how much can be read into someone’s internal state from their face alone. But there’s one thing that many agree on: there are still big question marks around the use of automated algorithms to detect people’s emotions.


Automated surveillance and facial recognition may also affect the way we think about ourselves. When people are observed, they see themselves "as if under a magnifying glass", writes Janina Steinmetz at The Conversation. People who are observed by cameras while eating feel as though they have consumed larger portions, for instance, while people taking tests in front of others believe they make more mistakes than unobserved test-takers. Cameras are becoming more and more common in public spaces, Steinmetz writes, but "we are only beginning to understand some of the psychological consequences of increased observation".


A new study has found that the brain processes the lyrics and the melody of songs in opposite hemispheres, reports Jon Hamilton for NPR. Researchers asked participants to listen to a cappella songs while having their brains scanned. Some songs were manipulated to remove frequency information, so that the melody was not recognisable but the speech remained; others were changed to remove information about the way sounds changed over time, so that the melody remained but the speech was no longer recognisable. The team found that speech was processed in the left auditory cortex and melody in the right.


What does the replication crisis mean for psychotherapy? At Aeon, Alexander Williams and John Sakaluk argue that the evidence behind many so-called "empirically supported treatments" may not be as strong as we thought.


Finally, when you play video games, do you invert the Y-axis? That is, when you push the stick on a controller up, do you expect your character’s head to go up or down? Players have long been divided into two camps — and psychology might be able to help explain why, writes Keith Stuart at The Guardian.

Compiled by Matthew Warren (@MattbWarren), Editor of BPS Research Digest