Monday, 6 July 2015

How Do Horror Video Games Work, and Why Do People Play Them?

Horror video games target evolved defence mechanisms
by confronting the player with fright-inducing stimuli
such as darkness and hostile entities. 
By guest blogger Mathias Clasen

The video game industry overtook the movie industry in revenue several years ago, and video games remain a rapidly growing market. In 2014, US consumers spent more than $22 billion on game content, hardware, and accessories. While researchers in media psychology have been busy investigating and discussing the effects of violent video games, another peculiar and persistent game genre—horror—has attracted very little empirical research. What are the effects of horror video games like Amnesia: The Dark Descent and Resident Evil, how do they work, and why do people play them?

A new study addresses these questions. Teresa Lynch and Nicole Martins of Indiana University looked at college students’ experiences with horror video games and found that about half of their sample (53 per cent) had tried playing such games and been frightened by them. They also found that horror games produce these fright responses by targeting our evolved defence system (evolution has shaped us to be easily scared by the dangers that threatened our ancestors); that there are predictable individual differences in how likely people are to seek out and be scared by horror video games; and that interactivity is crucial to these effects. Moreover, the researchers found that horror video games can have strong spill-over effects, causing disrupted sleep and increased fearfulness after playing.

The researchers had 269 undergrad students complete online forms on their experiences with frightening video games. They were asked to indicate which games had scared them, identify the game stimuli that scared them, and list the kinds of fright reactions they experienced during and after gameplay. Most of the respondents (97 per cent) were 18-24 years old. The researchers used a combination of forced-choice and open-ended questions.

The list of games that had produced fright reactions in players is dominated by so-called survival horror games such as Slender: The Eight Pages. These games typically use a first-person perspective to situate the player in a game world that teems with danger, usually from hostile non-player characters (monsters, more often than not). The game objective is to survive while overcoming a number of challenges, such as finding concealed resources necessary for progressing in the game. The game features that participants identified as particularly scary included darkness, the unknown, and disfigured humans (including zombies). This makes psychological sense because all these features target our evolved defence mechanisms.

Our fearful instincts evolved to protect us from dangers in the real world, so why do horror video games use patently unrealistic stimuli such as zombies and other supernatural monsters? The researchers found that perceived realism in horror video games is important in producing fright responses. Strikingly, though, they found that graphic realism (the quality of a visual representation) is more important in scaring players than is manifest realism (how likely something is to occur in the real world). Even though zombies don’t exist in reality, a realistically rendered representation of a walking, rotting, infectious, homicidal corpse combines stimuli that evoke strong fear-and-disgust emotions in players.

The study found some reliable individual differences in horror video game susceptibility and consumption. Men play more horror video games, and enjoy playing them more, than do women. Contrary to expectations, however, the study found no gender-mediated difference in the frequency of experienced fright. Guys may be more drawn to horror video games, but they are just as spooked by those games, it appears. The study found a weak correlation between sensation-seeking—a personality trait that makes people susceptible to boredom and eager to seek out stimulating experiences—and enjoyment of horror video games. Sensation-seekers enjoy horror video games more and experience less fright while playing, but curiously they don’t seem to spend more time playing horror video games than do non-sensation-seekers.

The researchers also found that player perceptions of interactivity were crucial to the games’ function of producing fright responses. This is perhaps unsurprising, given that horror in whatever medium works by transporting the audience into a fictional world teeming with danger. Horror video games are particularly effective because they ease such imaginative transportation via the illusion of agency—the player interfaces with the game and interacts with the game world, using, for example, keyboard keys to control the avatar’s movements and actions. You may have noticed that when people recount their game experiences, they tend to use a first-person narrator: “I went into the warehouse and ganked all the zombies with my shotgun.” This suggests that horror video games foster immersion much more strongly than do films and fiction, even those stories that are told from a first-person point-of-view.

This study, however, did not operationalise interactivity. What makes some games feel more interactive than others, and does higher interactivity—for example, having more in-game behavioural options—produce stronger fright responses? The nascent technology of virtual reality suggests that interactivity is not the only route to immersion. Many of the horror video games designed for the Oculus Rift, a virtual reality headset, have very little interactivity, but they are still notoriously immersive. There’s a whole YouTube industry of Oculus Rift players filming themselves reacting strongly to primitive horror simulations.

Horror video games are here to stay, but we still know little about their short- and long-term effects, and while the present study makes important inroads, it does not tell us why so many people are attracted to the kinds of video games that are designed to make them feel bad. The researchers suggest that the games may function as a kind of training for real-life emergencies, but that hypothesis awaits experimental investigation.

_________________________________

Lynch, T., & Martins, N. (2015). Nothing to Fear? An Analysis of College Students' Fear Experiences With Video Games. Journal of Broadcasting & Electronic Media, 59(2), 298-317. DOI: 10.1080/08838151.2015.1029128

--further reading--
The Lure of Horror
Spook Me, Please: What Psychology Tells Us About the Appeal of Halloween
Our jumpiness at nighttime is not just because it's dark
Could violent video games make players more moral in the real world?

Post written by Dr Mathias Clasen (@MathiasClasen) for the BPS Research Digest. Clasen, an assistant professor of literature and media at the English Department, Aarhus University, has published on horror and evil monsters and is currently working on a book about the biological underpinnings of modern American horror in literature, films, and video games.

Saturday, 4 July 2015

Link Feast

We trawled the web for the week's 10 best psychology and neuroscience links:

How I Overcame the Fear of Public Speaking
Organisational Psychologist Adam Grant shares his experience for the Quiet Revolution website.

It’s Not Just Cricket
As the 2015 Ashes series comes to England, Jamie Barker and Matt Slater consider the psychology at play in an article for The Psychologist.

Why Is It So Hard to Take Your Own Advice?
Melissa Dahl investigates for The Cut and finds psychologists are just as guilty of this as everyone else.

The Data or the Hunch?
"More and more decisions, from the music business to the sports field, are being delegated to data," writes Ian Leslie in a feature for Intelligent Life. "But where does that leave our intuition?"

The Marketing Industry Has Started Using Neuroscience, But the Results Are More Glitter Than Gold
The idea behind neuromarketing is that the brain can reveal hidden and profitable truths, but this is misleading, writes Vaughan Bell for The Observer.

Face It, Your Brain Is a Computer
"Airplanes may not fly like birds, but they are subject to the same forces of lift and drag," writes Gary Marcus in the New York Times. "Likewise, there is no reason to think that brains are exempt from the laws of computation."

New Proof That We're Not As Busy As We Think
Fast Company reports the results of a new US time-use survey, which found that on an average day, 96 per cent of Americans had some time for leisure activities such as watching TV.

Welcome To the Empathy Wars
The "anti-empathy brigade" led by psychologist Paul Bloom "is badly mistaken" argues Roman Krznaric at OpenDemocracy.net

Stanford Center for Reproducible Neuroscience
Russell Poldrack and his colleagues launch their new Center, which aims to "... harness high-performance computing to make neuroscience research more reliable."

Brainstorm
This new play at the National Theatre's Temporary Theatre is about the adolescent brain and was created in consultation with cognitive neuroscientist Sarah-Jayne Blakemore. Advance tickets are sold out but a limited number will be available on each performance day starting July 21.
_________________________________
   
Post compiled by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Friday, 3 July 2015

Smile at a party and people are more likely to remember seeing your face there

When you smile at a party, your facial expression is emotionally consistent with the happy context, and as a consequence other guests will in future be more likely to remember that they've seen your face before, and where you were when they saw you. That's according to a team of Italian researchers, led by Stefania Righi, who explored how memory for a face is affected by the emotion shown on that face and by the congruence between that emotional expression and its surrounding context.

The researchers first presented 30 participants (11 men) with 64 unfamiliar face and scene pairings. The faces were either smiling or fearful and they were either presented alongside an image of a happy scene (e.g. a party) or a fear-inducing scene (e.g. a car crash). The participants' task at this stage was simply to indicate whether each face-scene pairing was emotionally congruent or not.

Next came the memory test. Different faces (some previously seen, some new) were flashed up on-screen against a black background and the participants had to say whether they'd seen the face before or if it was entirely new. After each face, three scenes appeared of the same genre (e.g. three party scenes), and the participants had to say which specific scene the face had previously appeared alongside.

Previously seen happy faces were better remembered than fearful faces, but only when they appeared alongside a happy scene. Memory for fearful faces, by contrast, was unaffected by the congruence of the accompanying scene. Why should a smiling face in a happy context such as a party be better remembered than a fearful face? The researchers think the combination of a smiling face and happy scene has a broadening effect on observers' attention, enhancing their memories for the face. From a methodological point of view, it's a shame the study didn't also feature neutral faces: without these, we can't be certain whether smiling faces in a happy context were enhancing memory, or fearful faces in that context were harming memory, or a bit of both.

Figure 3 from Righi et al, 2015.
The researchers also propose that smiling faces have a "unitising effect" whereby the face and its context are bound together in memory. This idea also appeared to be supported by the results: participants were better at remembering the accompanying scenes (happy and fearful) for smiling faces than fearful faces.

Put these two key results together and it means that we're particularly likely to remember a smiling face we saw at a party, and the specific context we saw it in. Righi and her colleagues said it made sense for memory to work this way. "A smiling person communicates a social bond and the ability to remember, not only the face identity, but also the context of the first encounter with that 'potential friend', could reflect an adaptive behaviour in view of future social relations." The new results also complement past research on memory for face-name pairings: presented with a name, participants were better at remembering when it was earlier paired with a happy face than a neutral one.

_________________________________

Righi, S., Gronchi, G., Marzi, T., Rebai, M., & Viggiano, M. (2015). You are that smiling guy I met at the party! Socially positive signals foster memory for identities and contexts. Acta Psychologica, 159, 1-7. DOI: 10.1016/j.actpsy.2015.05.001

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Thursday, 2 July 2015

How social anxiety manifests on Facebook

For many shy people, online social networking sites have an obvious appeal – a way to socialise without the unpredictable immediacy of a face-to-face encounter. However, a new study finds that people who are socially anxious betray their awkwardness on Facebook, much as they do in the offline world. The researchers Aaron Weidman and Cheri Levinson said their findings could hint at ways for socially anxious people to conceal their nervousness and attract more online friends.

Seventy-seven students (average age 19; 77 per cent of them female) completed a measure of social anxiety. High scores were given to those who agreed with statements like "I have difficulty talking with other people" and "I am tense mixing in a group". Before the students left the psych lab, the researchers took screen grabs of their Facebook pages.

Several aspects of the students' Facebook pages correlated with their social anxiety scores. Unsurprisingly perhaps, those with fewer Facebook friends had higher social anxiety scores; so too did those who showed their relationship status as "single" (versus married or status not shown) and those whose page did not show a status update or quote (a sign of self-disclosure). These markers largely reflect offline signs of social anxiety – it's well established, for example, that people who are socially anxious share less information and tell fewer stories in conversation.

So, socially anxious people betray signs of their personalities on their Facebook pages, but would a stranger looking at their page pick up on these cues? Next, the researchers showed the students' Facebook pages to six other students and asked them to rate the social anxiety of the owners of the pages (if the observing students recognised any of the people in the Facebook pages, they didn't rate those pages).

There was a modest correlation between the observers' ratings of the Facebook owners' social anxiety and the owners' actual (self-reported) social anxiety scores. The observers picked up on some cues correctly, including lack of Facebook friends. But other cues they misread. Observers tended to rate Facebook pages with fewer photos and fewer people in the profile photo as more socially anxious, even though neither of these factors actually correlated with the owners' social anxiety scores. Observers also failed to pick up on the significance of a lack of self-disclosure or the owners' relationship status.

Unfortunately, some of the tell-tale signs of social anxiety can lead shy people to appear awkward and to make a negative impression on people they meet – the very outcome that they fear. These new results suggest the same problem may apply on Facebook, but they also point at a way to help by addressing the signs that strangers will read as evidence of social awkwardness.

The researchers said people high in social anxiety "may benefit from interventions aimed at forcing them to befriend more individuals on Facebook, post more photos of themselves, and to choose a profile picture that depicts them in the presence of others, all of which might cause observers to view [them] more positively as potential friends."

_________________________________

Weidman, A., & Levinson, C. (2015). I’m still socially anxious online: Offline relationship impairment characterizing social anxiety manifests and is accurately perceived in online social networking profiles. Computers in Human Behavior, 49, 12-19. DOI: 10.1016/j.chb.2014.12.045

--further reading--
The Psychology of Facebook, Digested

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Wednesday, 1 July 2015

What kind of a person volunteers for a free brain scan?

When psychologists scan the brains of a group of people, they usually do so in the hope that the findings will generalise more widely. For example, if they find that there are correlations between localised brain shrinkage and mental performance in a group of healthy older participants, they will usually infer that such correlations apply in healthy older people more generally. But there's an important problem with this logic (one that applies to other fields of psychology): what if the people who volunteer for brain scans are systematically different from those who don't?

To explore this issue, Mary Ganguli and her colleagues turned to 1,982 older adults (aged 65+) who were taking part in a large, long-running study of ageing. The study excluded people who were severely mentally impaired or in long-term care.

Helpfully, the researchers already had a good deal of data from all the participants, including their demographics, health and mental skills. Next they asked the participants if they'd be interested in taking part in a free brain scan study at their local hospital in return for a cash incentive.

Nearly half the sample (46.2 per cent) stated flat out that they would not be interested. The others gave answers ranging from definitely to maybe. Those who expressed an interest in volunteering for a brain scan differed from those who were definitely not keen in many ways: the willing were more likely to be younger, male, better educated, married, employed, free from depressive symptoms, mentally fitter, subjectively healthier, on fewer meds and living unsupervised. There were no differences between the groups in terms of subjective memory concerns or ethnicity.

Next, the researchers actually scanned the brains of 48 of the participants who'd expressed an interest. This revealed the expected correlations between grey matter volume in specific brain areas and cognitive performance.

Now the researchers made some adjustments so that the results from each brain scan participant were weighted according to how similar they were to the averaged group of 1,982 participants involved in the larger ageing study. This was a proof of principle, to see if it's possible to correct for the bias introduced by relying on volunteers rather than truly random samples. The adjustment certainly made a difference to the findings – now grey matter volume in fewer regions showed correlations with cognitive test scores, which the researchers attributed to a reduction in bias.
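The paper's exact weighting procedure isn't spelled out here, but the general idea resembles inverse-probability-of-participation weighting: model each cohort member's probability of ending up in the scan subsample from their measured characteristics, then weight each scanned participant by the inverse of that probability. Below is a minimal sketch in Python under that assumption; the data and variable names are synthetic, not the authors' own.

```python
# A sketch of the general idea (inverse-probability weighting), not the
# authors' exact procedure; all data and variable names here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Characteristics measured on the full ageing cohort (e.g. age, education,
# self-rated health), here just random numbers for illustration.
n_cohort = 1982
X = rng.normal(size=(n_cohort, 3))
# Whether each cohort member ended up in the scanned subsample.
scanned = rng.random(n_cohort) < 0.03

# Model the probability of being scanned from the measured characteristics...
model = LogisticRegression().fit(X, scanned)
p_scan = model.predict_proba(X[scanned])[:, 1]

# ...and weight each scanned participant by the inverse of that probability,
# so people under-represented in the scan subsample count for more.
weights = 1.0 / p_scan

# A downstream brain-behaviour analysis can then use the weights, e.g. a
# weighted correlation between grey-matter volume and a cognitive score
# (both synthetic here).
gm_volume = rng.normal(size=scanned.sum())
cognition = 0.3 * gm_volume + rng.normal(size=scanned.sum())
cov = np.cov(np.vstack([gm_volume, cognition]), aweights=weights)
print(f"Weighted correlation: {cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1]):.2f}")
```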

This isn't the most exciting brain scan study you'll read about this year, and the specific findings might only apply to older adults, but it addresses an important issue in neuroimaging and contributes to the gradual refining of psychological methods, helping our science become more reliable by avoiding biased results.

_________________________________

Ganguli, M., Lee, C., Hughes, T., Snitz, B., Jakubcak, J., Duara, R., & Chang, C. (2015). Who wants a free brain scan? Assessing and correcting for recruitment biases in a population-based sMRI pilot study. Brain Imaging and Behavior, 9(2), 204-212. DOI: 10.1007/s11682-014-9297-9

--further reading--
Just how representative are the people who volunteer for psychology experiments?
Beware the "super well" - why the controls in psychology research are often too healthy
How burnt-out students could be skewing psychology research

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Tuesday, 30 June 2015

What the textbooks don't tell you about psychology's most famous case study

Image: Photograph by Jack Wilgus of
a daguerreotype of Phineas Gage
in the collection of Jack and Beverly Wilgus.
It's a remarkable, mythical tale with lashings of gore – no wonder it's a favourite of psychology students the world over. I'm talking about Phineas Gage, the nineteenth century railway worker who somehow survived the passing of a three-foot long tamping iron through the front of his brain and out the top of his head. What happened to him next?

If you turn to many of the leading introductory psychology textbooks (American ones, at least), you'll find the wrong answer, or a misleading account. Richard Griggs, Emeritus Professor of Psychology at the University of Florida, has just analysed the content of 23 contemporary textbooks (either released or updated within the last couple of years), and he finds most of them contain distortions, omissions and inaccuracies.

It needn't be so. Thanks to painstaking historical analysis of primary sources (by Malcolm Macmillan and Matthew Lena) – much of it published between 2000 and 2010 – and the discovery during the same time period of new photographic evidence of post-accident Gage (see image, right), it is now believed that Gage made a remarkable recovery from his terrible injuries. He ultimately emigrated to Chile where he worked as a horse-coach driver, controlling six horses at once and dealing politely with non-English speaking passengers. The latest simulations of his injury help explain his rehabilitation – it's thought the iron rod passed through his left frontal lobe only, leaving his right lobe fully intact.

Image: From Van Horn et al 2012
Yet, the textbooks mostly tell a different story. Of the 21 that cover Gage, only four mention the years he worked in Chile. Only three detail his mental recovery. Fourteen of the books tell you about the first research that attempted to identify the extent of his brain injuries, but just four of the books give you the results from the most technically advanced effort, published in 2004, that first suggested his brain damage was limited to the left frontal lobe (watch video). Only nine of the books feature either of the two photos to have emerged of Gage in recent times.

So the textbooks mostly won't tell you about Gage's rehabilitation, or provide you with the latest evidence on his injuries. Instead, you might hear how he never worked again and became a vagrant, or that he became a circus freak for the rest of his life, showing off the holes in his head. "The most egregious error," says Griggs, "seems to be that Gage survived for 20 years with the tamping iron embedded in his head!"

Does any of this matter? Griggs argues strongly that it does. There are over one and a half million students enrolled in introductory psychology courses in the US alone, and most of them are introduced to the subject via textbooks. We know from past work that psychology textbook coverage of other key cases and studies is also often distorted and inaccurate. Now we learn that psychology's most famous case study is also misrepresented, potentially giving a misleading, overly simplistic impression about the effects of Gage's brain damage. "It is important to the psychological teaching community to identify inaccuracies in our textbooks so that they can be corrected, and we as textbook authors and teachers do not continue to 'give away' false information about our discipline," Griggs concludes.

_________________________________

Griggs, R. (2015). Coverage of the Phineas Gage Story in Introductory Psychology Textbooks: Was Gage No Longer Gage? Teaching of Psychology, 42(3), 195-202. DOI: 10.1177/0098628315587614

--further reading--
Phineas Gage - Unravelling the Myth
Coverage of Phineas Gage in "Great Myths of the Brain"
Foundations of Sand - the lure of academic myths and their place in classic psychology.

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Monday, 29 June 2015

We're more likely to cheat when we think it's our last chance to do so

Imagine spending your school half-term week with a forgetful relative who always leaves money scattered around the house. Would you pinch any? If so, when, and why? A new paper suggests that we are most likely to “cheat at the end”, and uses a neat method to find out why.

A number of theories predict we are likelier to cheat later than earlier. Perhaps we award ourselves moral credits for being good earlier, and later spend them like Catholic indulgences for guilt-free sin. Or maybe the struggle with temptation wears down our self-control, or we become desensitised to the thought of cheating. The job of psychological science is to distinguish between explanations, and Daniel Effron’s team developed a method that argues against these, in favour of an alternative that’s based on “anticipatory regret” or the fear of missing out.

Participants sat alone in a room and tossed a coin 13 times, supposedly as part of an experiment on psychokinesis (the ability to control objects with the mind). Before each toss, participants predicted whether the coin would land heads or tails and then they recorded the outcome themselves using a computer. They were told each “correct” toss would earn them 10 cents. Crucially, the experimenters made it clear that they were depending on the participants to honestly report their successes. So, cheating was both possible and profitable.

The researchers looked for signs of cheating, not in any one individual, but by examining average performance across all 847 participants. For any given toss, when the group’s average success rate exceeded the 50/50 success rate you’d expect based on chance, this was taken as a sign that cheating was at play.
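To make that logic concrete, here is a minimal sketch in Python of such a group-level test, using a one-sided binomial test. The success count below is invented for illustration; the study's raw data aren't reproduced here.

```python
# A minimal sketch (not the authors' analysis code) of the group-level logic:
# with honest reporting, each prediction succeeds with probability 0.5, so the
# number of reported successes across n participants should follow a
# Binomial(n, 0.5) distribution. A significant surplus of successes is the
# signature of cheating. The count below is hypothetical.
from scipy.stats import binomtest

n_participants = 847
reported_successes = 460  # hypothetical number of "correct" seventh tosses

result = binomtest(reported_successes, n=n_participants, p=0.5,
                   alternative="greater")
print(f"Reported success rate: {reported_successes / n_participants:.3f}")
print(f"One-sided p-value against chance: {result.pvalue:.4f}")
```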

Effron’s team were particularly interested in the success rate on the seventh coin toss. They’d told some participants they would have 13 tosses in total: their seventh toss had a similar success rate to the previous six, at around chance. This suggests that the first six tosses hadn’t eroded willpower, or built up moral credits ready to be cashed in. By contrast, the researchers had told other participants they would only have seven tosses of the coin. What was striking was that these participants appeared to cheat more on the seventh toss, collectively achieving significantly more successes than would be expected based on chance. This result suggests it wasn’t the build-up of prior events that mattered, but the fact that this seemed to be the final opportunity… and if they didn’t act now, they never could.

Indeed, when these cheating participants were informed there would in fact be more tosses to follow, their honesty suddenly popped back up on toss eight and onwards. This suggests their willpower hadn’t been used up, nor were they desensitised to cheating. The researchers also conducted a meta-analysis taking in data from this experiment, a further replication, and other work, with the overall results suggesting that we are three times more likely to cheat at what we believe to be the final opportunity than at any other time.

This may remind some of you of research using the “Prisoner’s Dilemma” economic game, which shows that “defection” or mistreatment of others rises towards the end of a period of interaction. But as Effron’s team notes, that pattern is due to the to-and-fro of the game: if I swindle you early in our interaction, I can expect you to retaliate in later rounds, whereas a final-round swindle carries no such risk. Here, there was no ongoing interaction, so there was no reason why cheating on trial one or trial seven should have any different consequences. So this "cheating at the end" effect isn’t about how others treat you, but how you expect to feel about yourself.

The authors conclude that knowing when cheating is likely to occur – on the last day of a period where a work supervisor is absent, for instance – could be useful in organising the timing and targeting of anti-cheating strategies, such as reminding people of moral standards just before a “peak time” period.

_________________________________

Effron, D., Bryan, C., & Murnighan, J. (2015). Cheating at the End to Avoid Regret. Journal of Personality and Social Psychology. DOI: 10.1037/pspa0000026

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
