Category: evolutionary psych

Men who can tell a good story are seen as more attractive and higher status

The results fit with evolutionary theory

Stories can change how we think about the world, about the people they describe, and even about ourselves. According to new research, they also influence our attitude to the storyteller. An article published in the journal Personal Relationships suggests that people portrayed as stronger storytellers are considered higher status than those who aren’t – and this status can make them more romantically attractive, at least in the eyes of women. Cue editing of Tinder bios across the globe.

John Donahue and Melanie Green ran experiments with US undergraduate samples (388 in total, 55 per cent women, two-thirds Caucasian, average age 20) who were asked to rate the attractiveness of a potential partner of the opposite sex based on basic printed information. In the first experiment, participants received a photo and a short biography of a would-be partner which included information on their storytelling abilities. Participants in the strong storytelling condition, for example, read that the person “often tells really good stories…he makes the characters and settings come alive.” Other conditions emphasised the mediocrity of the person’s storytelling or did not mention it at all. Stronger female storytellers did not tempt male participants, nor did male raconteurs foster extra female interest in short-term dating. But women were more interested in talented male storytellers as long-term partners.

A further experiment used the same design but added another category of attraction – “Do you think this person would make a good spouse?” – and a measure of the person’s perceived status. Both male and female participants considered storytellers to have higher status than non-storytellers. But for men, that didn’t translate into finding women more attractive, whereas for female raters, there was a clear route from men’s storytelling ability to status to desirability as a long-term partner or spouse.

To examine other explanations for the lure of the story-teller beyond the effect of status, the researchers ran another experiment where participants actually read a story, supposedly recounted by the potential partner. Some stories were fluid with lively vocabulary, and, as hoped, participants rated them as better and more involving than others that told the same facts in a hesitating and digressive manner.

But surprisingly, attraction didn’t depend on being swept up in the story – that is, would-be partners who’d produced a more engrossing story were not rated as more attractive than the bores. I should note, however, that a short oral anecdote transcribed onto paper is not the strongest way to entangle someone in the magic of story, and the researchers acknowledged that other unmeasured qualities of the story, such as personal identification, or sheer enjoyment, may well affect attraction.

Donahue and Green advance an evolutionary theory for their findings: females, with a high biological investment in producing young, have evolved to seek mates with resources, and storytelling aptitude reflects advantages that prehistorically meant the difference between life and death. But there are other explanatory lenses: for example, that men are socialised to be suspicious of women who take space and focus, considering that active status a threat that masks any liking they might have for storytelling traits, whereas women are socialised to appreciate first impressions of male competence. I suspect there is a rich, specific picture of when and why storytellers appeal, a picture that will depend on looking across cultures and at the specific effects their stories arouse in us. For now, this evidence suggests that young western males who can spin a good yarn are seen, at first blush, as a better catch.


Donahue, J., & Green, M. (2016). A good story: Men’s storytelling ability affects their attractiveness and perceived status. Personal Relationships. DOI: 10.1111/pere.12120

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Our free weekly email will keep you up-to-date with all the psychology research we digest: Sign up!

The psychology of who we find creepy and why

Maybe they’re sitting too close, or just smiling weirdly. Whatever it is, you know it’s creeping you out! Finding certain people creepy is a common experience, yet until now psychologists hadn’t investigated this emotion.

Francis McAndrew and Sara Koehnke, the authors of a new exploratory paper in New Ideas in Psychology, say that creepiness is what we feel when we think someone might be a threat, but we’re not sure – the ambiguity leaves us “frozen in place, wallowing in unease”.

The pair conducted an online survey of 1,341 people (312 were men; average age 29, mostly based in the US), including asking them to rate the likelihood of a creepy person exhibiting 44 different patterns of behaviour (e.g. avoiding eye contact), and to rate the creepiness of different occupations and hobbies.

Several behaviours and aspects of appearance were consistently rated as characteristic of creepy people, including: standing too close; greasy hair; peculiar smile; bulging eyes; having a mental illness; long fingers; unkempt hair; pale skin; bags under eyes; odd/dirty clothes; licking lips frequently; laughing at odd times; steering conversation toward one topic (especially sex); making it impossible to leave without seeming rude; displaying unwanted sexual interest; asking to take a picture of you; being very thin; and displaying too much/little emotion. Men and women alike overwhelmingly said it was more likely that a typical creepy person would be male.

“While they may not be overtly threatening, individuals who display unusual patterns of nonverbal behaviour, odd emotional characteristics or highly distinctive physical characteristics are outside of the norm, and by definition unpredictable. This may activate our ‘creepiness detector’,” the researchers said.

The four creepiest professions, in order, were clown, taxidermist, sex shop owner and funeral director (least creepy was meteorologist). The creepiest hobbies were those that involved collecting (especially body parts like fingernails, or insects) or watching or photographing other people.

Consistent with the researchers’ theory that creepiness stems from ambiguity, participants said the typical creepy person makes them feel uncomfortable because they cannot predict how he or she will behave.

You probably think this research isn’t about you, but note that the researchers found most participants believed creepy people usually don’t realise that they’re creepy.

McAndrew, F., & Koehnke, S. (2016). On the nature of creepiness. New Ideas in Psychology.
Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


This woman went hitchhiking in a hijab, for science

According to evolutionary psychology, just as animals and birds sing and dance and build houses to communicate their sexual interest to others, we humans do things like wear red, tell jokes, drive fancy cars and, well yes, we sing and dance too. A consistent finding in this area is that people’s attractiveness to others depends on whether their appearance communicates an interest in short or long-term sexual commitment, and moreover, whether this matches what a potential suitor is looking for. For example, there’s evidence that heterosexual women interested in casual sex are more likely to wear clothing that they think will attract men (not the most surprising research finding), and that this kind of clothing increases their attractiveness to men as a partner for casual sex, but not as a partner for marriage. The vast majority of this research has so far been conducted in the West, but a new field study out of Iran bucks the trend.

Farid Pazhoohi and Robert Burriss asked a 25-year-old woman to stand on the same busy, well-lit street in Shiraz, Iran on two consecutive Monday nights until 1,000 cars had passed. The first week she wore a relatively liberal outfit – a black hijab and tight black clothing that revealed her body shape. The second week she wore a black chador, which conceals the entire head and body (except the face) beneath a black cloak. The idea was to see how many drivers would stop to offer the woman a lift. When the woman wore the chador, only 39 drivers stopped for her, compared with 214 drivers who stopped when she wore the more liberal costume (all drivers who stopped were male). This more than five-fold increase in interest is similar to, but much larger than, the effect seen in French research in which male drivers were more likely to stop for a woman who was smiling, had large breasts, or wore red or makeup.

The researchers said: “Our results extend the findings of previous studies in Europe and North America on male helping behavior and female attractiveness to Iran, a nation where courtship behavior and dress are constrained by stricter social mores and laws than apply in the West.”


Pazhoohi, F., & Burriss, R. (2016). Hijab and “Hitchhiking”: A Field Study. Evolutionary Psychological Science, 2(1), 32-37. DOI: 10.1007/s40806-015-0033-5

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Your brain treats LEGO people as if they’re alive

Participants were especially good at spotting scene changes that involved LEGO people. Image from LaPointe et al.

Our brains are wired such that we pay extra attention to anything that seems to be alive. This makes sense from an evolutionary point of view – after all, other living things might be about to eat us, or maybe we could eat them.

Consistent with this evolutionary perspective, prior research has shown that at a very basic level, we pay more attention to images of animals and people than we do to cars and trucks, even though in modern life, it is cars and trucks that are more of an everyday threat than animals. But now a study in the Canadian Journal of Experimental Psychology has shown that this bias for processing living things extends to LEGO people, despite the fact that they are inanimate and were obviously never encountered by our distant ancestors.

Mitchell LaPointe and his colleagues tested dozens of undergrad students across several studies. The basic challenge was similar throughout. On each experimental trial, the participants looked at a pair of black and white, static images of LEGO scenes that alternated rapidly on a computer screen (each scene appeared for a quarter of a second before it flicked to the other scene in the pair and back again), and they had to indicate as fast as possible when they’d spotted the difference between the two scenes, and which side of the screen the change was on. After the participant responded, the next pair of images appeared and began flicking back and forth until a response was made.

The two LEGO scenes within each image pair were identical but for one small difference, which was either the addition of an extra LEGO person or some other feature, such as a tree or a small tower of LEGO blocks of similar size to a LEGO person. The main finding is that participants were significantly quicker – by two or more seconds, on average – at spotting scene changes that involved a LEGO person as compared with some other LEGO element. They were also more accurate at reporting where the changes had occurred when they involved LEGO people.

Variations in the methodology showed that the attentional bias for LEGO people was not due to their having faces (the advantage remained even when these were blurred). Even rotating the scenes 180 degrees, or blurring the entire scene, failed to fully eliminate the participants’ superior performance for spotting changes involving LEGO people.

This finding for LEGO people was very similar to that shown previously in terms of people spotting scene changes more quickly and accurately when they involve humans and animals as opposed to motorised vehicles (in fact, the LEGO person advantage was in some ways more robust – the attentional bias for humans and animals over vehicles disappeared when scenes were rotated 180 degrees).

The researchers who conducted the new research said, “it is clear that our participants treated LEGO people differently than LEGO nonpeople. The explanation that we favour for this difference in performance is that the animate category was generalised to the LEGO people, perhaps because the LEGO people contain some feature overlap with animate objects.” In other words, your brain thinks the little LEGO characters are alive!

What’s not clear from this research is if experience with LEGO figures is required for the attentional bias for LEGO people to be observed (no detail is given in the study on whether or how much the participants had played with LEGO as children, or adults). We also don’t know if these results say something special about LEGO people or if a similar effect would be found for other toy figures.


LaPointe, M., Cullen, R., Baltaretu, B., Campos, M., Michalski, N., Sri Satgunarajah, S., Cadieux, M., Pachai, M., & Shore, D. (2016). An Attentional Bias for LEGO® People Using a Change Detection Task: Are LEGO® People Animate? Canadian Journal of Experimental Psychology/Revue canadienne de psychologie expérimentale. DOI: 10.1037/cep0000077

further reading
LEGO figures are getting angrier
When psychologists become builders

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


By age four, children are already willing to make sacrifices in the name of group loyalty

Psychologists interested in the way group loyalty develops through childhood have largely focused on young children’s preference for other kids who demonstrate loyalty. For example, one study found that four- and five-year-olds rated other children as nicer and more trustworthy if they pledged continued allegiance to their losing team, compared with when they said they wanted to switch to the winning side. Other research has found that children aged five to eight years say they will be loyal to their group, for example by continuing to support a losing sports team, but before now such loyal proclamations have not been put to the test.

Published in The Journal of Experimental Child Psychology, a new study is the first to test young children’s willingness to make sacrifices in the name of group loyalty. Antonia Misch and her colleagues recruited 48 five-year-olds and 48 four-year-olds, with an equal number of girls and boys in each age group. The researchers tested each child individually and began by introducing them to four hand puppets.

Puppets used in the study. Figure from Misch et al

Next, yellow and green scarves pulled from a box were used to allocate two puppets and the child to one team (either the green or yellow team) and the other two puppets to the other coloured team. The child then left the room with the researcher, ostensibly to help look for something, and on returning discovered two puppets, either from their own team or the other team (depending on the in-group or out-group condition) hiding a book. The puppets explained the book was a secret and urged the child not to tell anyone where it was. A short time later, a fifth puppet with no group affiliation (they wore no scarf) appeared and said they knew a secret had been hidden in the room. This puppet attempted to bribe the child with stickers to get them to reveal the secret.

Overall, 61 per cent of the children resisted the temptation to reveal the secret. But the key finding is that more children chose to keep the secret when they were urged to do so by puppets in their own team as opposed to the other team (75 per cent vs. 48 per cent). This in-group loyalty effect was present at both ages, suggesting the motivating effect of group loyalty is already present by age four. However, there was an overall effect of age on the ability or willingness to keep the secret – averaging across the in-group and out-group conditions, 71 per cent of five-year-olds kept the secret compared with 52 per cent of four-year-olds.

The researchers said that their findings “extend previous research on children’s verbal predictions of their own loyalty and children’s attitudes about other people’s loyalty in demonstrating that even in direct social interactions where children are tempted to be disloyal, they can choose to remain loyal.” The findings are all the more striking given that the groups were so arbitrary and had only been formed a few minutes earlier. Misch and her team said that it’s likely children would feel an even keener sense of loyalty and identification with their real life groups.


Misch, A., Over, H., & Carpenter, M. (2016). I won’t tell: Young children show loyalty to their group by keeping group secrets. Journal of Experimental Child Psychology, 142, 96-106. PMID: 26513328

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Can evolutionary psychology explain why we love to hate evil villains?

By guest blogger David Robson

Darth Vader. Hannibal Lecter. Lord Voldemort. In literature and in film, it’s often the villains who steal the show. In John Milton’s Paradise Lost, the beautiful, charming Satan even manages to upstage God. No matter how diabolical their schemes, we seem to have an unnerving fascination with the wicked.

Why are our stories filled with these vile characters, people so evil that we love to hate them? While the question has long been debated by literary scholars, a new paper published in Evolutionary Behavioral Sciences by Jens Kjeldgaard-Christiansen at Aarhus University in Denmark is the first to cast light on these shadowy figures through the prism of evolutionary psychology.

To understand the appeal of evil, Kjeldgaard-Christiansen argues that you first need to examine its opposite – altruism. In our evolutionary past, individuals living in tight-knit groups needed ways of working out who was pulling their weight, and punishing those who didn’t. Today, we continue to make these value judgments using gut reactions rather than deliberate, rational thinking, normally based on a calculation of how much a person is willing to sacrifice for the overall good of the group (the so-called “welfare trade-off ratio”). Someone with a low welfare trade-off ratio, who gives little (or nothing) but takes a lot, immediately rings our brains’ alarm bells, telling us that they are not to be trusted.

Clearly, keeping these people in our groups would have put us all in danger, so they provoke the most potent emotional responses, such as disgust, fear and anger. Our reactions may be so strong that we may even feel justified in killing them to remove the threat from society. It is these people we consider “evil”. Importantly, these intuitions are hunches and can be easily swayed by a number of factors. For example, there is research showing that physical disgust (fear of illness, say) can spill over and influence our moral decisions. Crucially, Kjeldgaard-Christiansen says authors can and do make the most of these kinds of features to build the most spine-chilling villains.

Kjeldgaard-Christiansen predicts a handful of qualities that his evolutionary approach says should characterize the most iconic villains. It goes without saying that they should have a particularly low “welfare trade-off ratio”, for instance – but it also makes evolutionary sense if there is a threat that their immoral behaviour could spread like a disease in society more generally. Supporting this idea, he quotes Father Merrin from The Exorcist, who said: “I think the demon’s target is not the possessed; it is us . . . the observers . . . every person in this house. And I think—I think the point is to make us despair; to reject our own humanity.” This perfectly fits a very real threat in our ancestral past – that one unpunished act could sow the seeds for wider anarchy.

Hannibal. Image via Wikipedia

To heighten the instinctive fear response in readers, Kjeldgaard-Christiansen says the author may also choose to maintain a cloak of mystery around the villain; after all, if we know too much about their motives, we may stop feeling that intuitive, impulsive revulsion and instead see their point of view. The trick with someone like Hannibal Lecter, then, is to make them just psychologically deep enough to be believable, without bursting the disconcerting aura of evil. What’s more, the most evil villains will also be marked as outsiders, since strangers from competing groups were the biggest threat in our past – the reason, perhaps, that so many Hollywood villains (including Hannibal) have a foreign, often English, accent.

Leatherface. Image via Wikipedia

Pointing to the aforementioned research on disgust, Kjeldgaard-Christiansen also believes it is no coincidence that fictional villains are often deformed – like Leatherface in The Texas Chainsaw Massacre – somehow, our revulsion at their appearance primes us to feel repelled not just physically but morally too. “His brutish roars and apish gait warn the viewer that something is very wrong with this iconic recluse. Leatherface’s foul exterior becomes the manifestation of a foul essence,” writes Kjeldgaard-Christiansen. The same could equally be said about Lord Voldemort’s mutated, foetal appearance in Harry Potter, or the scars on Javier Bardem’s Raoul Silva in Skyfall.

Far from being escapist titillation, Kjeldgaard-Christiansen thinks that creating these tales may in fact have an evolutionary purpose. Taking these short trips into the dark sides of our natures, and seeing good triumph over evil, helps us to reaffirm our altruistic tendencies, leading to better cooperation overall.

It’s an intriguing idea – and it would be interesting to see if Kjeldgaard-Christiansen can test his theory. You can imagine that he could show participants an extract from Hannibal, say, and then ask them to play games (such as the Prisoner’s Dilemma) that involve cooperation in the lab, to see if they are more likely to play fairly.

If so, it will add to a growing body of work exploring the ways that fiction can shape our behavior: for example, some fascinating work by Travis Proulx at Tilburg University has shown that absurdist authors such as Franz Kafka or Lewis Carroll, whose stories violate the laws of the real world, can have an unsettling effect, leading us to look for confirmation of our existing beliefs. We also know that urban legends that break taboos (such as incest) appear to be more widely shared than those that are simply disgusting or frightening – which some researchers believe may reflect the evolutionary role of story-telling as a means of teaching social norms.

Censors sometimes worry that depictions of evil will lead us down the path of the devil. In fact, if Kjeldgaard-Christiansen is right, the opposite may be true: by making us peer into the darkness, these villains may just make us better people.

Kjeldgaard-Christiansen, J. (2015). Evil Origins: A Darwinian Genealogy of the Popcultural Villain. Evolutionary Behavioral Sciences. DOI: 10.1037/ebs0000057

further reading
Spook Me, Please: What Psychology Tells Us About the Appeal of Halloween
Can relationships with fictional characters aid our self development?
How Do Horror Video Games Work, and Why Do People Play Them?

Post written by David Robson (@d_a_robson) for the BPS Research Digest. David is BBC Future’s feature writer.


The adaptive mind: Children raised in difficult circumstances show enhanced mental flexibility in adulthood

According to a stream of psychological research, a tumultuous upbringing sets you up for a raw deal later in life. Being raised in households that lack wealth or stability is associated with outcomes that include altered decision-making abilities, memory and general cognitive function. These changes are usually considered impairments, but does a bad childhood really make you less capable, or just different?

The research on decision-making, for instance, reveals “sub-optimal” decisions made by people raised in stressful environments – opting for a small reward now rather than holding out for a big one later – that actually make sense given the person’s history: if nothing is guaranteed in your world, it’s a smart decision to grab what you can now.

One way to look at this is that stressful conditions don’t lessen you so much as condition you. Chiraag Mittal, a grad student at the Carlson School of Management, and his colleagues suspected this might apply to more than decision making. They recently published an article in the Journal of Personality and Social Psychology where they look at cognitive function, specifically executive functioning, to see if the story isn’t only one of deficits.

Executive function is what allows us to process and manage complex behaviour, including paying attention and making decisions. In fact, it contains so many facets that Mittal’s team chose to focus on two. Inhibition is the ability to stay on task in the face of distractions, measured here by accuracy on a simple judgment task (which way does the arrow briefly flashed on-screen point?) while being distracted by flashes elsewhere on-screen. This ability is associated with delaying gratification, which is less useful in real-world unpredictable contexts where the “big reward next year” may never come.

Meanwhile, the second facet they looked at, shifting, is the ability to turn from one goal to another as effortlessly as possible – here measured by the efficiency of switching from categorising on-screen targets one way then another (e.g. by colour and then by shape) on given trials, depending on the current rule. Shifting ability is an important skill for anyone living in unpredictable circumstances, and Mittal’s team predicted that adults with that background would do better at this task, and worse at the inhibition task, than those from stable backgrounds.

The data bore out these predictions – when it came to mental flexibility, people with a history of childhood adversity actually outperformed their more fortunate peers. There was a wrinkle – a small-sample replication threw up anomalies, so the researchers ran a more robust third study with 181 student participants. This confirmed the general pattern: participants who said they’d had an unpredictable early life (changes in residence, movement of other cohabitants in and out of home, and changes in parents’ employment status) performed worse at inhibition, but better at shifting. However, this effect only reared its head when the tests were preceded by a task stoking a sense of uncertainty – reading an alarming newspaper account of “Tough Times Ahead”. This fits with past research showing that effects tied to a stressful upbringing often seem only to be elicited in conditions of current unpredictability (a rule that is also true in animal research).

In a follow-up with a smaller sample, the researchers made use of an ongoing collection of data from a group of people born into poverty between 1975 and 1976. Using recorded details on their upbringing at multiple time points between birth and ten years old, coders could produce more reliable ratings of the participants’ childhood experience of unpredictable circumstances. Due to time constraints, only shifting ability could be looked at, but again the earlier finding was replicated: when primed with uncertainty, people who had been raised in greater turmoil performed better. A meta-analysis combining the results from all four of the researchers’ studies strongly confirmed this effect.

The mental process of inhibition allows people to pursue goals and underlies the willpower to stick with things, characteristics that encourage personal success. But shifting ability is also associated with a higher-order ability that’s important in life: creativity. People from disadvantaged, unstable backgrounds undoubtedly face challenges, but this research suggests, if not a bright side, a more nuanced one. People aren’t passively victimised by their circumstances – they adapt to them, sometimes in ways that make it easier to thrive in challenging conditions.


Mittal, C., Griskevicius, V., Simpson, J., Sung, S., & Young, E. (2015). Cognitive adaptations to stressful environments: When childhood adversity enhances adult executive function. Journal of Personality and Social Psychology, 109(4), 604-621. DOI: 10.1037/pspi0000028

further reading
Poverty shapes how children think about themselves
Why is poverty associated with mental health problems for some people, but not others?
Testing the American Dream – can the right mix of personality and IQ compensate for poverty?
When depressed mothers give birth to thriving babies

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.


New genetic evidence suggests face recognition is a very special human skill

Example stimuli from Shakeshaft and Plomin, 2015.

A new twin study of the genetic influences on face recognition ability, published today in PNAS, supports the idea that face recognition is a special skill that has evolved quite separately from other aspects of human cognition. In short, face recognition seems to be influenced by genes that are mostly different from the genes that influence general intelligence and other forms of visual expertise.

The background to this is that, for some time, psychologists studying the genetics of mental abilities have noticed a clear pattern: people’s abilities in one domain, such as reading, typically correlate with their abilities in other domains, such as numeracy. This seems to be because a person’s domain-specific abilities are strongly associated with their overall general intelligence and the same genes that underlie this basic mental fitness are also exerting an influence on various specific skills.

Nicholas Shakeshaft and Robert Plomin were interested to see if this same pattern would apply to people’s face recognition abilities. Would they too correlate with general intelligence and share the same or similar genetic influences?

The researchers recruited 2,149 participants, including 375 pairs of identical twins, who share the same genes, and 549 pairs of non-identical twins, who share roughly half the same genes, just like typical siblings (overall the sample was 58 per cent female with an average age of 19.5 years). The participants completed a test of their face processing skills, including memorising unfamiliar faces, and also tests of their ability to memorise cars, and their general intelligence, in terms of their vocabulary size and their ability to solve abstract problems.

Comparing the similarities in performance on these different tests between identical and non-identical twin pairs allowed the researchers to estimate how much the different skills on test were influenced by the same or different genes.

All the abilities – face recognition, car recognition and general mental ability – showed evidence of strong heritability (being influenced by genetic inheritance), with 61 per cent, 56 per cent, and 48 per cent of performance variability in the current sample being explained by genes, respectively.
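The logic of deriving heritability from twin comparisons can be sketched with the classic Falconer approximation, in which heritability is roughly twice the gap between identical-twin and non-identical-twin correlations. (The study itself used formal biometric model fitting, and the correlations below are hypothetical numbers chosen only to illustrate the arithmetic.)

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Rough heritability estimate from twin correlations.

    Identical (MZ) twins share ~100 per cent of their genes and
    non-identical (DZ) twins ~50 per cent, so doubling the gap in
    their trait correlations approximates the share of performance
    variance explained by genes: h2 = 2 * (r_MZ - r_DZ).
    """
    return 2 * (r_mz - r_dz)

# Hypothetical twin correlations picked so the estimate lands near
# the 61 per cent heritability reported for face recognition:
print(round(falconer_h2(0.70, 0.395), 2))  # 0.61
```

The same comparison logic, extended across pairs of tests rather than a single test, is what lets the researchers ask whether the genes influencing face recognition overlap with those influencing car recognition or general ability.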

Crucially, performance on face recognition was only moderately correlated with car recognition ability (r = .29 where 1 would be a perfect correlation) and modestly correlated with general mental ability (r = .15), and only 10 per cent of the genetic influence on face recognition ability was the same as the genetic influence on general mental ability (and likewise, only 10 per cent of the genetic influence on face memory was shared with the genes affecting memory for cars).

Essentially, this means that most of the genetic influences on face recognition ability are distinct from the genetic influences on general mental ability or on car recognition ability. Shakeshaft and Plomin said this “striking finding” supports the notion that there is something special about human facial recognition ability. These results add to others that have suggested face recognition is a special mental ability – for instance, some have argued that faces alone trigger brain activity in the so-called “fusiform face area” (although this claim has been challenged); and unlike our ability to recognise other objects or patterns, our ability to recognise faces is particularly impaired when faces are inverted, consistent with the idea that we use a distinctive “holistic” processing style for faces.

The story is complicated somewhat by the researchers' unexpected finding that recognition ability for cars was also linked with distinct genetic influences that mostly did not overlap with the genetic influences on general mental ability. Perhaps, the researchers surmised, the tests of general mental ability used here (a vocab test and the well-used Raven's Progressive Matrices) did not adequately tap the full range of what we might consider general mental abilities. Whatever the reason, it remains the case that this new research suggests face recognition ability is shaped by a set of genetic influences that are largely distinct from those implicated in a similar form of visual recognition (for cars) and in vocab ability and abstract reasoning. Based on this, the researchers concluded they'd shown for the first time that "the genetic influences on face recognition are almost entirely unique."


Shakeshaft, N. G., & Plomin, R. (2015). Genetic specificity of face recognition. PNAS.

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Our free fortnightly email will keep you up-to-date with all the psychology research we digest: Sign up!

Looking for the brain basis of chimp personality

Some chimps are more outgoing than others. Some like trying out new foods and games while their friends stick to the tried and tested. In short, chimps have different personalities, just like people do. What's more, psychologists investigating chimp personality have found that their traits tend to coalesce into five main factors, again much like human personality. Three of these factors are actually named the same as their human equivalents: Extraversion, Openness and Agreeableness. The other two are Dominance (a bit like the opposite of the human trait of Neuroticism) and Reactivity/Undependability (opposite to the human trait Conscientiousness).

Now a team of psychologists and primatologists has scanned the brains of 107 chimpanzees to try to find the neural correlates of personality differences in our evolutionary cousin. The neurobiological basis of human personality is a thriving area of research, but this study published in NeuroImage is the first to look for the brain basis of chimp personality.

The chimp participants were residents at The University of Texas MD Anderson Cancer Center, housed in groups of 5 to 14. There were 50 males with an average age of 22 and 57 females with an average age of 20. Robert Latzman and his colleagues relied on colony staff to rate the chimps’ personalities using a 41-item personality questionnaire which tapped the chimp equivalent of the five main personality traits. The chimps then had their brains scanned by MRI, for which they were sedated.

Given the evidence in humans linking many aspects of personality to features of the frontal cortex, the researchers decided to focus their investigation on that part of the chimp brain. After controlling for age and sex (older chimps were less reactive and less extravert; males tended to be more extravert and dominant), they found that the more grey matter volume a chimp had in his or her frontal cortex, the more dominant, open and extravert the chimp tended to be. The researchers said this potentially reflects the broad role of the frontal cortex in "the control of emotions in the service of goal-directed behaviour".

Zooming in on a particular sub-structure in the frontal cortex, the anterior cingulate cortex (ACC; an area associated in humans with motivation and expectations, among other things), higher Extraversion and Openness were associated with more grey matter in this structure. The researchers also looked at asymmetries in grey matter volume between the chimps' two brain hemispheres. Here they found that more grey matter volume in the right hemisphere was associated with higher Extraversion and Dominance, which contradicts human research that has linked approach behaviours with the left hemisphere. However, when the researchers looked at asymmetries in specific structures, including the ACC and the medial prefrontal cortex, they found that more grey matter in the right hemisphere was associated with more reactivity, while a left-hemisphere bias was associated with more dominance, which is more consistent with human evidence.

The study makes a good start at exploring the neurobiology of chimp personality but it does have some problems, including the cross-sectional design (were the brain differences a cause or consequence of personality differences?), the exclusive focus on the frontal cortex, and the way the researchers translated each chimp's brain structure onto a common template, thus losing some of the individuality between chimps. Nonetheless, Latzman and his colleagues said their findings added further evidence to the idea that human personality has an evolutionary and biological basis, and confirmed "the importance of neuroscientific approaches to the study of basic dispositions (i.e. personality) … suggest[ing] that many of these associations are comparable in chimpanzees."

Latzman, R., Hecht, L., Freeman, H., Schapiro, S., & Hopkins, W. (2015). Neuroanatomical correlates of personality in chimpanzees (Pan troglodytes): Associations between personality and frontal cortex. NeuroImage, 123, 63-71. DOI: 10.1016/j.neuroimage.2015.08.041

further reading
Chimps and toddlers lend a helping hand
Baboons like to hang out with other baboons who are similar

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Images of ultra-thin models need your attention to make you feel bad

By guest blogger Tom Stafford

We all know that fashion models have unrealistic bodies. Even if they aren’t photoshopped, most of us could never be that thin, at least not without making ourselves ill. Previous research has suggested that viewing pictures of unrealistically thin female models makes young women feel bad – leaving them dissatisfied with their own bodies, more sad, angry and insecure.

A crucial question is whether the effect of these thin-ideal images is automatic. Does the comparison to the models, which is thought to be the key driver in their negative effects, happen without our intention, attention or both?  Knowing the answer will tell us just how much power these images have, and also how best we might protect ourselves from them.

There are at least two plausible reasons psychologists have for suspecting such comparisons might be automatic. One is evolutionary – we’re a social species, so perhaps we instinctively compare ourselves against the people around us, to figure out where we are in the social hierarchy. Our ancestors didn’t have photographs, but it’s possible that modern media is hijacking a process that was once useful in our biological history. The second reason is practice: things we do again and again become automatic. Maybe, especially for some people, comparing themselves with media images has become a habit.

To test the idea that comparing ourselves to thin-ideal images is an automatic process, Stephen Want and colleagues at Ryerson University, Canada, invited 116 female Canadian undergraduate students, with an average age of 19, and an average body mass index of 21 (the healthy range is between about 19 and 26), to take part in what they were told was an experiment testing how short-term memory is affected by mood, personality and exposure to images.

Under this cover story, participants were given a difficult memory task (remembering a string of different digits, such as 78639946) or an easy memory task (remembering an easy string, such as 11111111). They were told they would have to keep these digits in mind while they looked at images of several fashion models or cats. All participants were, in fact, shown pictures of fashion models, 12 of them for 10 seconds each. The purpose of this deception was to prevent the participants guessing that the researchers were interested in the effect of the images – this reduces the likelihood that the participants performed in a certain way to meet the researchers’ expectations.

But the memory task wasn’t just part of the cover story, it was central to the study design. One feature of automatic processes is their efficiency: automatic processes can occur when you are mentally occupied whereas non-automatic processes require your attention. Following this definition, the idea was that the difficult memory task group would only be able to make automatic social comparisons, whereas the easy memory task group would be able to make both automatic and effortful social comparisons. If images have negative effects automatically, they should be seen in both groups – they might even be stronger in the harder memory task group, if the distraction of the task stopped participants from having reasonable thoughts about why they shouldn’t compare themselves to the models.

The results were clear. Participants who were preoccupied by the difficult memory task were unaffected by the images – afterwards their mood and satisfaction with their appearance were indistinguishable from their feelings at the beginning of the experiment. But the easy memory task group showed the classic effect of thin-ideal images – they felt worse in terms of their mood, and they felt worse about their appearance specifically.

A second experiment, recruiting 177 participants, replicated the first with different images, and also showed that it wasn’t merely having the mental capacity to think about appearance that produced the effect. A group that was given the easy memory task but shown pictures of coloured rectangles rather than fashion models didn’t suffer any negative effects on their mood or appearance satisfaction.

In addition, participants in this second experiment rated the importance of media images as a source of information about appearance, and the pressure they felt to emulate celebrities and other media figures. The researchers’ logic was that if the comparison process is automatic through practice, it is participants who scored highly on this questionnaire who would be most likely to show the harmful comparison effect from looking at models.

As you would expect, in the easy memory condition, it was those participants who rated themselves most highly on this questionnaire who were most affected by the images. But in the difficult memory condition – the one in which only efficient, automatic, processes could be generating comparisons to the thin-ideal images – the media-engaged participants’ mood and satisfaction with their appearance was unaffected, just like their peers who felt less pressure to emulate celebs.

Taken together, the two experiments are a strike against the idea that we automatically compare ourselves to thin-ideal media images, even those of us most likely to feel like we ought to – young women who rate themselves as preoccupied with the media and their appearance.

Perhaps this is grounds for optimism. It might mean we can starve these images of their negative power by not paying them attention. Even so, the research shows again that when we do focus on these images they make us feel bad. Thin-ideal images are so prevalent in our society that even a temporary effect could produce a consistent load of misery for individuals who attend to them. So the deeper question is how society would need to change so that such images are less prevalent, or so that paying attention to them is no longer celebrated as a priority.


Want, S., Botres, A., Vahedi, Z., & Middleton, J. (2015). On the Cognitive (In)Efficiency of Social Comparisons with Media Images. Sex Roles. DOI: 10.1007/s11199-015-0538-1

further reading
Video protects girls from the negative effects of looking at ultra-thin models
By age three, girls already show a preference for thin people
How do women and girls feel when they see sexualised or sporty images of female athletes?

Post written by Tom Stafford, a psychologist from the University of Sheffield who is a regular contributor to the Mind Hacks blog. He is on twitter as @tomstafford and his latest book is 'For argument's sake: evidence that reason can change minds'.
