Category: Genetics

It’s now possible, in theory, to predict life success from a genetic test at birth

By guest blogger Stuart Ritchie

For decades, we’ve known from twin studies that psychological traits like intelligence and personality are influenced by genes. That’s why identical twins (who share all their genes) are not just more physically similar to each other than non-identical twins (who share half their genes), but also more similar in terms of their psychological traits. But what twin studies can’t tell us is which particular genes are involved. Frustratingly, this has always left an ‘in’ for the incorrigible critics of twin studies: they’ve been able to say “you’re telling me these traits are genetic, but you can’t tell me any of the specific genes!” But not any more.

Genetic research can promote peace or conflict, depending on how it’s used

It’s becoming easier than ever to research the genetic roots of different ethnic groups and these findings can be framed differently to either emphasise that groups are similar or different. For example, a BBC headline from 2000 stated “Jews and Arabs are ‘genetic brothers’” while a 2013 Medical Daily headline claimed “Genes of most Ashkenazi Jews trace back to indigenous Europe, not Middle East”. As political leaders have started citing this kind of evidence to promote their particular agenda, be that to unite or divide peoples, a new study in Personality and Social Psychology Bulletin has investigated whether genetic information could be a tool for promoting peace or a weapon to stir conflict.

Sasha Kimel and her colleagues began by asking 123 Jewish and 57 Arab participants in the US to read either the BBC “genetic siblings” article from 2000 or an adapted “genetic strangers” version which reversed the findings to suggest that Arabs and Jews are genetically very dissimilar. The participants had no idea that they’d been recruited based on their ethnicity, and to further disguise the aims of the research they were told that they would be tested on their memory of the article after completing a series of distracting psychological tests. In reality, some of these tests were used to reveal any effects of the articles on the participants’ attitudes and this included a measure of their views of a typical Arab- or Jewish-American and a test of their implicit (subconscious) attitudes towards Arabs and Jews.

As the researchers expected, Jews and Arabs rated each other more positively after reading about their genetic similarities compared with reading about their differences, although there were no effects on implicit attitudes.

A second study was similar but involved Jewish participants only, and this time the researchers showed that reading about genetic similarities between Jews and Arabs led the participants to display less aggression towards an Arab opponent called Mohammed in a reaction time contest. That is, on winning trials, the Jewish participants had the chance to blast their Arab opponent with white noise, and those participants who’d read about genetic similarities chose weaker noise blasts than those who’d read about genetic differences.

A third study with more Jewish participants was also similar but added a third baseline neutral condition in which participants read an article that had nothing to do with genetics or ethnic groups. This time the main outcome measure was support for Israeli peacekeeping. These results suggested that the “genetic strangers” article wasn’t having much influence on participants compared with the neutral condition, but that the “genetic siblings” article was boosting support for peacekeeping via its effect of improving attitudes towards Arabs.

Based on these initial results the researchers said that they “encourage interventions that create greater awareness of the considerable amount of genetic overlap that exists between all of the world’s ethnic and racial groups”.

But would these benefits translate to Israel, a nation that lives with ongoing interethnic conflict? The fourth and arguably most important study tested the effects of the same three news articles (“genetic siblings”, “genetic strangers” and a neutral story) translated into Hebrew and adapted so they appeared to have been published in the Israeli newspaper Ynet. The researchers recruited nearly 200 Jewish Israelis on commuter trains in North and South Israel and had them read one of these three stories before completing tests of their attitudes towards Palestinians and their support for different policies. The worrying finding this time was that the “genetic siblings” article appeared to have no benefit, but that the “genetic strangers” article reduced support for peaceful policies via increasing antipathy towards Palestinians.

Based on their last study, the researchers warned that “…learning about how you are genetically different from an enemy group may have a particularly menacing effect in the contexts of war”. They added: “Based on our findings, we suggest that crisis-monitoring organisations (e.g. International Crisis Group, Genocide Watch) go on heightened alert when conflict-rhetoric begins emphasising genetic differences.”


Kimel, S., Huesmann, R., Kunst, J., & Halperin, E. (2016). Living in a Genetic World: How Learning About Interethnic Genetic Similarities and Differences Affects Peace and Conflict. Personality and Social Psychology Bulletin, 42(5), 688-700. DOI: 10.1177/0146167216642196

–further reading–
Could lessons in genetic variation help reduce racial prejudice?

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Our free weekly email will keep you up-to-date with all the psychology research we digest: Sign up!

Twin study raises doubts about the relevance of "grit" to children’s school performance

Grit is in vogue. US psychologist Angela Duckworth’s TED talk on grit is one of the most watched ever recorded. And her forthcoming book on the subject, subtitled “the power of passion and perseverance”, is anticipated to be a bestseller. On both sides of the pond, our governments have made the training of grit in schools a priority.

To psychologists, “grit” describes how much perseverance someone shows towards their long-term goals, and how much consistent passion they have for them. It’s seen as a “sub-trait” that’s very strongly related to, and largely subsumed by, conscientiousness, one of the well-established “Big Five” personality traits that make up who we are.

The reason for all the interest in grit, simply, is that there’s some evidence that people who have more grit do better in life. Moreover, it’s thought that grit is something you can develop, and probably more easily than you can increase your intelligence or other attributes.

But to a team of psychologists based in London and led by behavioural genetics expert Robert Plomin, the hype around grit is getting a little out of hand. There just isn’t that much convincing evidence yet that it tells you much about a person beyond the Big Five personality traits, nor that it can be increased through training or education.

Supporting their view, the researchers have published an analysis in the Journal of Personality and Social Psychology of the personalities, including grit, and exam performance at age 16 of thousands of pairs of twins. Some of the twins were identical meaning they share the same genes, while others were non-identical meaning they share roughly half their genes just like non-twin siblings do. By comparing similarities in personality and exam performance between these two types of twin, the researchers were able to disentangle the relative influence of genes and the environment on these measures.

The main finding is that the participants’ overall personality scores were related to about 6 per cent of the variation seen in their exam performance. Grit specifically was related to just 0.5 per cent of the differences seen in exam performance. Given the small size of this relationship, the researchers said “we believe that these results should warrant concern with the educational policy directives in the United States and the United Kingdom.”
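To put those percentages in perspective: variance explained is the square of the correlation coefficient, so the figures above translate into quite small correlations. A minimal sketch of that conversion (the arithmetic here is just an illustration, not part of the study's analysis):

```python
import math

# Variance explained (R^2) is the square of the correlation coefficient,
# so small-sounding percentages translate into even smaller correlations.

def variance_to_correlation(r_squared: float) -> float:
    """Convert a proportion of variance explained into the equivalent correlation."""
    return math.sqrt(r_squared)

# 6 per cent of exam variance (overall personality scores) -> r of about 0.24
print(round(variance_to_correlation(0.06), 2))
# 0.5 per cent of exam variance (grit alone) -> r of about 0.07
print(round(variance_to_correlation(0.005), 2))
```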

Also relevant to the hype around grit, the researchers found that how much grit the participants had was to a large extent inherited: about a third of the differences in grit scores were explained by genetic influences. Meanwhile, none of the differences in grit were explained by environmental factors that twin pairs shared, such as the way they were raised by their parents and the type of schooling they had. That leaves the remaining variance in grit either influenced by so-called “non-shared environmental factors” – those experiences in life that are unique to a person and not even shared by the twin they live with – or unexplained. This is a disappointing result for grit enthusiasts because it suggests that the experiences in life that shape how much grit someone has are not found in the school or the home (at least not for the current sample). Bear in mind, though, that this doesn’t discount the possibility that a new, effective home- or school-based intervention could be developed.

The researchers concluded that once you know a child’s main personality scores, knowing their amount of grit doesn’t seem to tell you much more about how well they’ll do at school. This study doesn’t rule out the idea that increasing children’s grit, if possible, could be beneficial, but the researchers warned that “more research is warranted into intervention and training programs before concluding that such training increases educational achievement and life outcomes.”


Rimfeld, K., Kovas, Y., Dale, P., & Plomin, R. (2016). True Grit and Genetics: Predicting Academic Achievement From Personality. Journal of Personality and Social Psychology. DOI: 10.1037/pspp0000089

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Psychologists study twins to learn more about the roots of procrastination

With so many digital distractions a mere mouse click away, procrastination is easier than ever. You want, nay need, to work on an important project, yet find yourself browsing Twitter, making coffee, checking email – basically anything other than doing what you should be doing. Daniel Gustavson and his colleagues – the authors of a new twin study of procrastination published in the Journal of Experimental Psychology: General – sum it up as “the irrational delay of an intended course of action”.

Much has been written about why we fall prey to this habit in the moment (the all-important job is perceived as too challenging, the other tasks and distractions seem easier, and so on), but Gustavson and his colleagues wanted to learn more about why some of us are generally more prone to procrastination than others. Do we inherit a predisposition for procrastination in our genes, and what other mental abilities are related to the procrastination habit?

The researchers recruited 386 pairs of same-sex twins, 206 of whom were identical twins, meaning they have the same genes, and 179 were non-identical, meaning they share on average half their genes. After missing data were removed, the final sample included 401 women and 350 men (average age 23). The twins completed a questionnaire about their proclivity for procrastination (this involved rating their agreement with statements like “I am continually saying ‘I’ll do it tomorrow'”), and they answered questions about their proneness to “goal failures” (tested through questions like “Do you find you forget what you came to the shops to buy?”).

The twins also completed several measures of their “executive function”, including their powers of inhibition (e.g. one task involved resisting the reflex to glance at a square that appeared on-screen, and looking instead in the opposite direction), their ability to shift mind-sets (e.g. categorising shapes on a coloured background by their shape one minute, then by their colour, depending on changing task instructions), and their ability to juggle information in memory over short periods of time.

By comparing similarities in executive function performance, procrastination proneness and goal failures between identical and non-identical twins, the researchers were able to deduce how much of an influence genes have on these traits and abilities, and how much overlap there is in the genetic influence on the different measures. In simple terms, a higher correlation on a particular measure among identical twins compared with non-identical twins would indicate a greater role for genes.
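That comparison logic can be made concrete with Falconer's classic formula, which estimates heritability as twice the difference between the identical-twin and non-identical-twin correlations. A minimal sketch, with twin correlations invented for illustration rather than taken from the study (they're chosen so the output mirrors the 28 per cent genetic influence on procrastination reported below):

```python
# Falconer's formula: a rough trait decomposition from twin correlations.
# r_mz is the correlation between identical (monozygotic) twins,
# r_dz the correlation between non-identical (dizygotic) twins.

def falconer_estimates(r_mz: float, r_dz: float) -> dict:
    h2 = 2 * (r_mz - r_dz)  # additive genetic influence (heritability)
    c2 = r_mz - h2          # shared environment (e.g. home, school)
    e2 = 1 - r_mz           # non-shared environment (plus measurement error)
    return {"heritability": h2, "shared_env": c2, "nonshared_env": e2}

# Hypothetical correlations for a procrastination questionnaire:
print(falconer_estimates(r_mz=0.34, r_dz=0.20))
# heritability = 0.28, shared_env = 0.06, nonshared_env = 0.66
```

Real twin analyses use full structural-equation models rather than this simple subtraction, but the intuition – the bigger the identical-twin advantage in similarity, the bigger the genetic influence – is the same.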

Here are some of the key findings. The tendency to procrastinate was found to be partly inherited – 28 per cent of variability in this trait was explained by genetic influences (though note, this includes gene-environment interactions, such as a procrastinator choosing a job – like being a blog editor – that makes procrastination easier). Moreover, 17 per cent of the procrastination variability that was explained by genes overlapped with the genetic influences on goal failures – that is, many of the same genes influencing procrastination appear to play a role in the ability to manage goals. Also, environmental influences common to both procrastination and goal management explained a further 28 per cent of variation in procrastination.

The tendency to procrastinate also correlated with overall executive function ability – that is, people who said they procrastinated more tended to achieve an overall poorer score on the executive function tests. And again there was genetic overlap: many of the genetic influences on executive function were found to be the same as those shared by both procrastination and goal management.

There was one caveat to the association between procrastination proneness and executive function: procrastinators actually tended to perform better at shifting mind-sets, presumably because having a butterfly mind gives you a certain mental flexibility even though it makes it difficult to focus.

The findings help to pick apart the root causes of procrastination. At a genetic and behavioural level, they show that a tendency to procrastinate tends to go hand in hand with a poorer ability to manage goals and, for the most part, a poorer ability to control one’s own mind, in terms of inhibition and juggling information.

Gustavson and his team warned that identifying the actual genes involved in procrastination, executive function and goal management remains a long way off, and that many hundreds or thousands of gene variants are likely involved. They also cautioned that their study can’t tell us about the causal relationships, if any, between the studied traits – it’s tempting to assume that poor executive function or goal management causes procrastination, for example, but it’s theoretically possible the influence could run the other way, both ways, and/or that other factors not studied here are more relevant, such as personality or intelligence. Nonetheless, the researchers did offer some brief practical advice on the back of their findings:

“Training subjects on how to set good goals may improve their ability to manage these goals and avoid procrastination … Moreover, helping subjects retrieve their important long-term goals and use those goals to avoid getting side-tracked by short-term temptations (e.g. developing implementation intentions) might also be effective at reducing procrastination.”


Gustavson, D., Miyake, A., Hewitt, J., & Friedman, N. (2015). Understanding the Cognitive and Genetic Underpinnings of Procrastination: Evidence for Shared Genetic Influences With Goal Management and Executive Function Abilities. Journal of Experimental Psychology: General. DOI: 10.1037/xge0000110

–further reading–
The cure for procrastination? Forgive yourself!
Psychologists investigate a major, ignored reason for our lack of sleep – bedtime procrastination
Forgive yourself for relaxing in front of the TV and the couch time might actually do you some good

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


New genetic evidence suggests face recognition is a very special human skill

Example stimuli from Shakeshaft and Plomin, 2015.

A new twin study of the genetic influences on face recognition ability, published today in PNAS, supports the idea that face recognition is a special skill that’s evolved quite separately from other aspects of human cognition. In short, face recognition seems to be influenced by genes that are mostly different from the genes that influence general intelligence and other forms of visual expertise.

The background to this is that, for some time, psychologists studying the genetics of mental abilities have noticed a clear pattern: people’s abilities in one domain, such as reading, typically correlate with their abilities in other domains, such as numeracy. This seems to be because a person’s domain-specific abilities are strongly associated with their overall general intelligence and the same genes that underlie this basic mental fitness are also exerting an influence on various specific skills.

Nicholas Shakeshaft and Robert Plomin were interested to see if this same pattern would apply to people’s face recognition abilities. Would they too correlate with general intelligence and share the same or similar genetic influences?

The researchers recruited 2,149 participants, including 375 pairs of identical twins, who share the same genes, and 549 pairs of non-identical twins, who share roughly half the same genes, just like typical siblings (overall the sample was 58 per cent female with an average age of 19.5 years). The participants completed a test of their face processing skills, including memorising unfamiliar faces, and also tests of their ability to memorise cars, and of their general intelligence, in terms of their vocabulary size and their ability to solve abstract problems.

Comparing the similarities in performance on these different tests between identical and non-identical twin pairs allowed the researchers to estimate how much the different skills on test were influenced by the same or different genes.

All the abilities – face recognition, car recognition and general mental ability – showed evidence of strong heritability (being influenced by genetic inheritance), with 61 per cent, 56 per cent, and 48 per cent of performance variability in the current sample being explained by genes, respectively.

Crucially, performance on face recognition was only moderately correlated with car recognition ability (r = .29 where 1 would be a perfect correlation) and modestly correlated with general mental ability (r = .15), and only 10 per cent of the genetic influence on face recognition ability was the same as the genetic influence on general mental ability (and likewise, only 10 per cent of the genetic influence on face memory was shared with the genes affecting memory for cars).

Essentially, this means that most of the genetic influences on face recognition ability are distinct from the genetic influences on general mental ability or on car recognition ability. Shakeshaft and Plomin said this “striking finding” supports the notion that there is something special about human facial recognition ability. These results add to others that have suggested face recognition is a special mental ability – for instance, some have argued that faces alone trigger brain activity in the so-called “fusiform face area” (although this claim has been challenged); and unlike our ability to recognise other objects or patterns, our ability to recognise faces is particularly impaired when faces are inverted, consistent with the idea that we use a distinctive “holistic” processing style for faces.

The story is complicated somewhat by the researchers’ unexpected finding that recognition ability for cars was also linked with distinct genetic influences that mostly did not overlap with the genetic influences on general mental ability. Perhaps, the researchers surmised, the tests of general mental ability used here (a vocab test and the widely used Raven’s Progressive Matrices) did not adequately tap the full range of what we might consider general mental abilities. Whatever the reason, it remains the case that this new research suggests that face recognition ability is influenced by a set of genetic influences that are largely distinct from those implicated in a similar form of visual recognition (for cars) and in vocab ability and abstract reasoning. Based on this, the researchers concluded they’d shown for the first time that “the genetic influences on face recognition are almost entirely unique.”


Shakeshaft, N. G., & Plomin, R. (2015). Genetic specificity of face recognition. PNAS.

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Why do more intelligent people live longer?

By guest blogger Stuart Ritchie

It’s always gratifying, as a psychologist, to feel like you’re studying something important. So you can imagine the excitement when it was discovered that intelligence predicts life expectancy. This finding is now supported by a large literature including systematic reviews, the most recent of which estimated that a difference of one standard deviation in childhood or youth intelligence (that’s 15 IQ points on a standardised scale) is linked to a 24 per cent lower mortality risk in the subsequent decades of life. That’s a pretty impressive link, but it immediately raises a critical question: why do brighter people live longer?
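As a back-of-envelope illustration of what that headline figure means per IQ point, assuming (as a simplification of this sketch, not a claim of the review) that mortality risk scales log-linearly with IQ:

```python
# One standard deviation (15 IQ points) higher intelligence is linked to a
# 24 per cent lower mortality risk, i.e. a hazard ratio of 0.76 per 15 points.
# If risk scales log-linearly with IQ, the per-point ratio is the fifteenth
# root of the per-SD ratio.

hazard_ratio_per_sd = 1 - 0.24                      # 0.76 per 15 IQ points
hazard_ratio_per_point = hazard_ratio_per_sd ** (1 / 15)

print(round(hazard_ratio_per_point, 3))  # 0.982: roughly 1.8% lower risk per IQ point
```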

A new study (pdf) published in the International Journal of Epidemiology attempts to provide new, biological evidence to answer this question. But first, let’s think through the possibilities. We know that people with higher IQ scores tend to be healthier, possibly because they eat better, exercise more, are better able to understand health advice, are less likely to be injured in accidents and deliberate violence, and also because they tend to have better jobs. Here, the causal arrow is pointing from IQ to longevity – the effects of being smarter cause you to die later. But there are other explanations: what if having a lower IQ is just an indicator of an underlying health condition that’s the real cause of earlier death? Or what if the genes for having a healthier body are also the genes for having a healthier brain, and the causal pathway is from this third variable (i.e. genetics) to both IQ and longevity?

The authors of the new study, Rosalind Arden and colleagues, tested this last hypothesis, known as “genetic pleiotropy” (the idea that the same genes influence multiple different traits). They took three twin datasets, selecting in total 1,312 twin pairs where one or both of the twins had died. Then they correlated the twins’ IQ scores with the lengths of their lives (or their life expectancies, for those still living).

As they expected, the researchers found an overall lifespan-IQ correlation, albeit a small one (r = 0.12, where 1.00 would be a perfect match). Importantly, by comparing the correlations in identical twins (who share all their genes) versus fraternal twins (who share approximately half), they were also able to estimate the “genetic correlation” – the overlap in the two traits that’s caused by genetic differences. They found that, overall, 95 per cent of the correlation in IQ and longevity was due to genetics.

So, is this a final answer to the debate over the IQ-mortality connection? Does this show that, perhaps depressingly, the link isn’t due to changeable lifestyle factors, but actually some kind of genetic “system integrity” that underlies brightness and longer lives?


Not so fast. The important part is in the phrase “due to genetics”. In a 2013 Nature Reviews Genetics article, geneticist Nadia Solovieff and colleagues outlined all the potential causal mechanisms that might make two traits genetically correlated. They drew a critical distinction between “biological” and “mediated” pleiotropy. The former is the “obvious” inference, which is that the same genes cause both intelligence and longevity. But the latter possibility is that the variables only appear to be genetically correlated, because genes cause one factor, which then goes on to cause the other. That is, if genes cause intelligence, and intelligence (via lifestyle choices etc.) causes a longer lifespan, we’d still see the same genetic correlation, even if those genes have no direct effect on lifespan itself. If true, this would still be pleiotropy of a sort: the genes linked to intelligence are having an indirect effect on lifespan. But as the authors acknowledge in their paper, this “pleiotropy-lite” interpretation of the new findings would mean we don’t yet have knockdown evidence for the genetic “system integrity” idea.

So how do we tease apart the two possible explanations for the genetic correlation? In the paper, the authors suggest we study non-human animals (for which the literature on cognitive ability is growing fast) where we can more readily control the “lifestyle” factors, thereby isolating any potential direct effects of the same genes on both intelligence and longevity. Really, though, we might have to wait until we have a long list of genes that are reliably linked to human intelligence. If we knew a good number of those, we could test whether they also influence health and lifespan – if they did, this would be evidence for true “biological” pleiotropy. We’d know then that the link between IQ and lifespan is down to some people simply winning the genetic lottery, rather than to lifestyle factors that any of us could change.

Conflict of interest: Stuart Ritchie is a postdoc in the lab of Ian Deary, one of the co-authors of the paper discussed here.


Arden, R., Luciano, M., Deary, I., Reynolds, C., Pedersen, N., Plassman, B., McGue, M., Christensen, K., & Visscher, P. (2015). The association between intelligence and lifespan is mostly genetic. International Journal of Epidemiology. DOI: 10.1093/ije/dyv112

–further reading–
How do you prove that reading boosts IQ?

Post written by Stuart J. Ritchie, a Research Fellow in the Centre for Cognitive Ageing and Cognitive Epidemiology at the University of Edinburgh. His new book, Intelligence: All That Matters, is available now. Follow him on Twitter: @StuartJRitchie


Why fathers might want to thank their handsome sons

Women rated men’s faces as more attractive when they were shown alongside a good-looking son

If you’re the father to a good-looking boy, you might want to give him your thanks – his handsome looks apparently mean women will tend to find you more attractive. That’s according to a new study by Pavol Prokop at Trnava University in Slovakia, who says the result is consistent with the established idea from evolutionary psychology that women instinctively pick up on cues to the quality of a man’s genes.

Just as past research has shown that women, on average, find taller men with symmetrical faces more attractive because such features are indicators of good genes, the new finding suggests a man’s offspring also influence women’s judgments about his attractiveness. If a man can sire a handsome boy, the instinctual logic goes, then he must be in possession of valuable genes.

In the first experiment, Prokop presented dozens of young women with several triads of photographs showing an attractive or unattractive young man alongside pictures of two boys, one attractive, the other less attractive. For each triad, the women’s task was to say which boy the man was father to. The finding here was that the women were more likely to assume that attractive men were fathers to attractive boys (and unattractive men the fathers of less attractive boys). This simple test laid the groundwork for the remainder of the study, confirming that women generally assume that attractive fathers have attractive sons.

Next, nearly three hundred more young women rated the attractiveness of a series of attractive and unattractive men’s faces, each of which was presented alongside a boy (also attractive or unattractive), who was supposedly the man’s son. In truth, but unbeknown to the participants, none of the pictured men and boys were actually related. A further detail was that each man was described either as the biological father or step-father to the boy shown alongside him.

When a man’s face was presented alongside what participants believed to be his handsome son, he (the putative father) tended to receive higher attractiveness ratings from the participants than if he was depicted with an unattractive son. There was some evidence that this effect was greater for unattractive men, and the effect was more apparent when men were described as biological fathers than as stepfathers. A weakness in the methodology (there were no sons of neutral attractiveness) means we can’t know how much attractive sons were making their fathers appear more handsome to the women, compared with how much unattractive sons were having the opposite effect.

If handsome men are more likely to sire handsome children, and those handsome children exaggerate their fathers’ attractiveness still further, a self-perpetuating cycle could be set in motion that might help explain a previous finding: attractive men tend to have more children (within the same marriage) than less attractive men. Of course there’s also the possibility that the attractiveness boost gained by having a handsome son could leave a man more open to advances from his partner’s female rivals (known as mate-poaching in evolutionary psychology), a possibility that awaits further research.

The main finding of this research – that fathers are rated more attractive when their sons are good-looking – is open to some counter-interpretations. For example, perhaps there was a simple priming effect at play and seeing any attractive image alongside a man’s face would lead that man’s face to receive higher attractiveness ratings.

Prokop tested that possibility in a further experiment in which men’s faces were presented alongside attractive or unattractive non-human pictures, such as nature scenes and buildings (e.g. a beautiful beach versus a dirty beach). This time, women’s judgments about the attractiveness of handsome men were unaffected by whether a beautiful or ugly scene or object appeared alongside them, suggesting the effect of a handsome son on a father’s attractiveness is unique.

However, unattractive men did benefit from higher attractiveness ratings when their faces were shown alongside a beautiful scene or object. This is good news for men who don’t have film-star looks – after all, while the influence of genetic inheritance means they are less likely to have the chance to bask in the reflected beauty of a handsome son, this result says they can easily turn to other means of boosting their attractiveness instead. For example, Prokop said they could try “wearing fashionable clothes.”


Prokop, P. (2015). The Putative Son’s Attractiveness Alters the Perceived Attractiveness of the Putative Father. Archives of Sexual Behavior, 44(6), 1713-1721. DOI: 10.1007/s10508-015-0496-2

–further reading–
You hunky smile magnet
The downside of being good-looking AND wealthy
Shiny, swanky car boosts men’s appeal to women, but not women’s appeal to men
Men feel more physically attractive after becoming a father
Freud was right: we are attracted to our relatives

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Optimism and pessimism are separate systems influenced by different genes

“... the optimist sees the rose and not its thorns; the pessimist stares at the thorns, oblivious to the rose,” Kahlil Gibran.

Optimists enjoy better health, more success, more happiness and longer lives than pessimists. No surprise, then, that psychologists are taking an increasing interest in our outlook on life. An unresolved issue is whether optimism and pessimism are two ends of the same spectrum, or whether they are separate traits. If they are separate, then in principle some people could be both highly optimistic and highly pessimistic – to borrow the poet Gibran’s analogy, they would be keenly aware of both the rose and its thorns.

Timothy Bates at the University of Edinburgh has turned to behavioural genetics to help settle this question. He’s analysed data on optimism and pessimism gathered from hundreds of pairs of identical and non-identical twins. These were participants from a US survey and their average age was 54. To reveal their optimism and pessimism, the twins rated their agreement with statements such as “In uncertain times, I usually expect the best” and “I rarely count on good things happening to me.” They also completed a measure of the “Big Five” personality traits: extraversion, neuroticism and so on.

The reasoning behind twin studies like this is that if optimism and pessimism are highly heritable (i.e. influenced by inherited genetic factors), then these traits should correlate more highly between pairs of identical twins, who share all their genes, than between non-identical twins, who share approximately half their genes. And if optimism was found to be more heritable than pessimism, or vice versa, this would indicate different genetic influences on optimism and pessimism.

Twin studies can also disentangle the relative influence of shared and unique environmental factors – the aspects of a twin’s upbringing that they share with their sibling, such as parenting style, and those that are unique to them, such as the friends they keep.
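The twin logic described above can be sketched with the classical “ACE” decomposition (Falconer’s formulas) – a textbook simplification of the modelling actually used in the paper, with invented correlations purely for illustration:

```python
# Illustrative sketch of the "ACE" logic behind twin studies, using
# Falconer's formulas. This is a textbook simplification, not the
# structural equation modelling in Bates' paper; the twin correlations
# below are invented for demonstration only.

def ace_estimates(r_mz, r_dz):
    """Decompose trait variance from MZ (identical) and DZ (non-identical)
    twin correlations.

    A (additive genetics)      = 2 * (r_mz - r_dz)
    C (shared environment)     = 2 * r_dz - r_mz
    E (non-shared environment) = 1 - r_mz
    """
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

# Hypothetical correlations: identical twins 0.5, non-identical twins 0.3
a2, c2, e2 = ace_estimates(0.5, 0.3)
print(round(a2, 2), round(c2, 2), round(e2, 2))  # 0.4 0.1 0.5
```

With these made-up numbers, roughly 40 per cent of trait variance would be attributed to genes, 10 per cent to the shared environment and 50 per cent to the non-shared environment. If identical twins correlate much more strongly than non-identical twins, the genetic component grows; that is the intuition behind the heritability estimates reported here.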

Bates’ analysis indicates that optimism and pessimism are subject to shared genetic influences (with each other, and with other personality traits), but also to independent genetic influences, thus supporting the notion that optimism and pessimism are distinct traits, not simply two sides of the same coin.

“Optimism and pessimism are at least partially biologically distinct, resulting in two distinct psychological tendencies,” Bates said. He added that this dovetails with neuroscience evidence that’s indicated there are separate neural systems underlying optimism and pessimism.

The new findings also suggested there is a “substantial” influence of upbringing on optimism and pessimism (i.e. raising one while lowering the other, or vice versa). This raises the intriguing possibility that optimism might, to some extent, be a malleable trait that can be encouraged through a child’s upbringing.


Bates, T. (2015). The glass is half full half empty: A population-representative twin study testing if optimism and pessimism are distinct systems. The Journal of Positive Psychology, 1-10. DOI: 10.1080/17439760.2015.1015155

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Psychologists and psychiatrists feel less empathy for patients when their problems are explained biologically

The idea that mental illness is related to brain abnormalities or other biological factors is popular among some patients; they say it demystifies their experiences and lends legitimacy to their symptoms. However, studies show that biological explanations can increase mental health stigma, encouraging the public perception that people with mental illness are essentially different, and that their problems are permanent. Now Matthew Lebowitz and Woo-young Ahn have published new evidence that suggests biological explanations of mental illness reduce the empathy that mental health professionals feel towards patients.

Over two hundred psychologists, psychiatrists and social workers were presented with vignettes of patients with conditions such as social phobia, depression or schizophrenia. Crucially, some of these vignettes were accompanied by purely biological explanations focused on factors like genes and brain chemistry, while other vignettes were accompanied by psychosocial explanations, such as a history of bullying or bereavement. Next, the mental health professionals reported their feelings by scoring how far a range of adjectives – such as “sympathetic”, “troubled” and “warm” – fitted their current state.

Vignettes accompanied by biological explanations provoked lower feelings of empathy from the clinicians, and this was true regardless of their specific profession. Both biological and psychosocial explanations triggered similar levels of distress, so the reduced empathy associated with biological explanations was not simply due to psychosocial explanations being more upsetting. The mental health professionals also rated the biological explanations as less clinically useful; biological explanations further prompted them to have less faith in psychotherapy and more confidence in drug treatments.

Similar results were found in a follow-up study in which clinicians and social workers were presented with vignettes and explanations that reflected a combination of psychosocial and biological factors, but with one approach more dominant than the other. The idea was that this would better reflect real life. In this case, explanations dominated by biological factors prompted lower empathy from clinicians.

Lebowitz and Ahn suggest biological explanations provoke reduced empathy because they have a dehumanising effect (implying patients are “systems of interacting mechanisms”) and give the impression that problems are permanent. With biological approaches to mental illness gaining prominence in psychology and psychiatry these are potentially worrying results. A silver lining is that both medically trained and non-medical clinicians and social workers in the study saw biological explanations as less clinically useful than psychosocial explanations.

A weakness of the research is the lack of a baseline no-explanation control condition – this means we can’t know for sure if psychosocial explanations increased empathy or if biological explanations reduced it. Also, as the researchers admitted, the vignettes and explanations were greatly simplified. Nonetheless, the findings may still give reason for concern. Lebowitz and Ahn suggest reductions in empathy may be avoided if clinicians understand that “even when biology plays an important etiological role, it is constantly interacting with other factors, and biological ‘abnormalities’ do not create strict distinctions between members of society with and without mental disorders.”


Lebowitz, M., & Ahn, W. (2014). Effects of biological explanations for mental disorders on clinicians’ empathy. Proceedings of the National Academy of Sciences, 111(50), 17786-17790. DOI: 10.1073/pnas.1414058111

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

How do you prove that reading boosts IQ?

By guest blogger Stuart Ritchie.

A recent study on whether reading boosts intelligence attracted global media attention: “Reading at a young age makes you smarter,” announced the Daily Mail. “Early reading boosts health and intelligence,” said The Australian.

In the race for eye-catching headlines, this mainstream media coverage arguably missed the more fascinating story of the hunt for cause and effect. Here lead author Dr Stuart Ritchie explains the science:

“Causality, it turns out, is really difficult to prove. Correlational studies, while interesting, don’t give us information about causation one way or another. The randomised controlled trial is the ‘gold standard’ method of telling whether a manipulation has an effect on an outcome. But what if a randomised experiment isn’t possible, for practical or ethical reasons? Thankfully, there is an entire toolkit of study designs that go beyond correlation, and can be used to take steps up the ladder closer to causation.

Say you wanted to find interventions that cause intelligence to increase. Since childhood intelligence test scores are so powerfully predictive of later educational success, as well as health and wealth, it’s of great importance to find out how they might be improved. All sorts of nutritional supplements and training programmes have been tried, but all have failed (so far) to reliably show benefits for IQ. However, one factor that has been convincingly shown to cause improvements in intelligence test scores is education. It wouldn’t exactly be ethical to remove some children from school at random and see how they do in comparison to their educated peers. But in a step up the aforementioned causal ladder, researchers in 2012 used a ‘natural experiment’ in the Norwegian education system (where compulsory years of education were increased in some areas but not others) to show that each year’s worth of extra education added 3.6 IQ points.

What is it about education that’s driving these effects? Could it be that a very basic process like learning to read is causing the improvements in IQ? Keith Stanovich and colleagues showed, in a number of studies in the 1990s, that earlier levels of reading interest (though not ability) were predictive of later levels of verbal intelligence, even after controlling for children’s initial verbal intelligence. In a 1998 review, they concluded that “reading will make [children] smarter”.

On the ladder of causation, a control for pre-existing ability in a non-experimental design is important, but problems remain. For instance, since we know that common genes contribute to reading and intelligence, any study that fails to measure or control for genetic influences can’t rule out the possibility that the early reading advantage and the later intelligence benefit are due simply to a shared genetic basis that is, say, expressed at different times in different areas of the brain. If only there were a way of cloning children – comparing one “baseline” version of each child against a second version with improved reading ability, and then seeing if the better reading translated to higher intelligence later in development…

This sounds like a far-fetched fantasy experiment. But in a recent study, my colleagues and I did just that, though we left it to nature to do the cloning. Tim Bates, Robert Plomin, and I analysed data from 1,890 pairs of identical twins who were part of the Twins Early Development Study (TEDS). The twins had their reading ability and intelligence tested on multiple measures (averaged into a composite) at ages 7, 9, 10, 12, and 16. For each twin pair at each age, we calculated the difference between one twin and the other on both variables. Since each pair was near-100 per cent identical genetically, and was brought up in the same family, these differences must have been caused purely by the ‘non-shared environment’ (that is, environmental influences experienced by one twin but not the other).
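The core of the twin-differences design can be sketched in a few lines. The pair scores below are invented, and the real study used composite TEDS measures and formal longitudinal modelling rather than a simple correlation, but the logic is the same: within each identical pair, subtract one twin’s score from the other’s, then ask whether early reading differences track later IQ differences:

```python
# Minimal sketch of the twin-differences logic. All scores are invented;
# the actual study analysed 1,890 TEDS pairs with multivariate models.

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each tuple: (twin1_reading_age7, twin2_reading_age7,
#              twin1_iq_age16,     twin2_iq_age16) -- hypothetical scores
pairs = [
    (12, 10, 105, 100),
    (14, 15, 110, 113),
    (9, 12, 95, 102),
    (11, 11, 101, 100),
    (13, 10, 108, 99),
]

# Within-pair differences: because identical twins share their genes and
# family, these differences reflect only the non-shared environment.
reading_diffs = [t1 - t2 for t1, t2, _, _ in pairs]
iq_diffs = [i1 - i2 for _, _, i1, i2 in pairs]

print(round(pearson_r(reading_diffs, iq_diffs), 2))
```

A positive correlation between the difference scores – as in these deliberately rigged example data – is what the design predicts if the twin who reads better early on goes on to score higher on intelligence tests later.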

We found that twins who had an advantage over their co-twin on reading at earlier points in their development had higher intelligence test scores later on. Because this analysis controls for initial IQ differences, as well as genetics and socioeconomic circumstances, it is considerably more compelling than previous results that used less well-controlled designs. It’s important to note that we found associations between earlier reading ability and later nonverbal intelligence, as well as later verbal intelligence. So, beyond the not-particularly-surprising finding that being better at reading might help with a child’s vocabulary, we made the pretty-surprising finding that it might also help with a child’s problem solving and reasoning ability. Why?

We now enter the realm of speculation. It might be that reading allows children to practise the skills of assimilating information and abstract thought that are useful when completing IQ tests. The process of training in reading may also help teach children to concentrate on tasks—like IQ tests—that they’re asked to complete. Our research doesn’t shed light on these mechanisms, but we hope future studies will.

One should not give our study a criticism-free ride just because it tells a cheery, ‘good news’ story. A step up toward causation is not causation. Could there have been alternative explanations for our findings? Certainly. It is possible that, for instance, teachers spot a child with a reading advantage and give them additional attention, raising their intelligence ‘without’, as we say in the paper, ‘reading doing the causal “work”‘. It may also have been that our controls were inadequate – as I said above, identical twins are nearly genetically identical, but a small number of unique genetic mutations might occur within each pair. The largest lacuna in our study, though, was the cause of the initial within-pair reading differences. Whether these were caused by teaching, peers, pure luck, or some other process, we couldn’t tell, and it’s of great interest to find out.

We hope that our study encourages researchers in three ways. First, in the eternal quest for intelligence-boosters, instead of looking to flashy new brain-training games or the like, they might wish to examine, and maximise, the potentially IQ-improving effects of ‘everyday’ education. Second, they could attempt to answer the questions raised by our study. Why do identical twins differ in reading, and are the reasons under a teacher’s control? What are the specific mechanisms that might lead from literacy to intelligence? Third, and more generally, we hope it will inspire them to consider new methods, including the twin-differences design, that edge further up the causal ladder, away from the basic correlational study. The data are, of course, far harder to collect, but the stronger inferences found there are well worth the climb.”


Ritchie, S., Bates, T., & Plomin, R. (2014). Does Learning to Read Improve Intelligence? A Longitudinal Multivariate Analysis in Identical Twins From Age 7 to 16. Child Development. DOI: 10.1111/cdev.12272

Post written by Stuart J. Ritchie, a Research Fellow in the Centre for Cognitive Ageing and Cognitive Epidemiology at the University of Edinburgh. Follow him on Twitter: @StuartJRitchie