
Is creativity something you inherit from your parents?

By Alex Fradera

Jeb Bush’s failure to secure a Presidential triple-play is memorable perhaps because it’s an exception to a familiar routine: the family dynasty. The routine is especially common in the arts, where a writer’s family tree is apt to contain a couple of actors, a director, and maybe a flower arranger to boot. This might simply reflect upbringing – or the powers of nepotism – but creative success also owes something to temperament and talent, some of which may have their origins in our genetic makeup. The journal Behavior Genetics has recently published a heritability study that explores how deeply a creative vocation sits in our DNA.


Many of the same genes that influence our personality also affect our mental health

By Christian Jarrett

We know from twin and family studies that our personality is to a large degree – probably around 40 per cent – inherited. Geneticists are busy trying to find the specific gene variants involved, but because each one on its own exerts only a modest influence, this is challenging research requiring huge samples. A new study in Nature Genetics has made a significant contribution, using genome-wide association analysis to look for genetic variants that correlate with personality. The researchers, led by Min-Tzu Lo at the University of California, San Diego, have identified variations at six genetic loci that correlate with different personality trait scores, five of them previously unknown. In a separate analysis, the researchers also showed that many of the genetic variants involved in personality overlap with those involved in the risk of developing mental health disorders.
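For readers unfamiliar with the method, the core of a genome-wide analysis like this is surprisingly simple: at each measured variant, regress the trait score on how many copies of a given allele each person carries, and keep only associations that survive a very strict significance threshold. Here is a minimal, purely illustrative Python sketch of that logic – simulated data and a deliberately exaggerated effect size, nothing from the actual study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_people, n_snps = 5000, 1000        # toy sizes; real GWAS samples are far larger

# Simulated genotypes: 0, 1 or 2 copies of the minor allele at each locus
genotypes = rng.binomial(2, 0.3, size=(n_people, n_snps))
trait = rng.normal(size=n_people)            # e.g. a standardised neuroticism score
trait = trait + 0.3 * genotypes[:, 0]        # plant one (exaggerated) true association

threshold = 5e-8                             # conventional genome-wide significance level
hits = []
for snp in range(n_snps):
    result = stats.linregress(genotypes[:, snp], trait)   # trait regressed on allele count
    if result.pvalue < threshold:
        hits.append((snp, result.slope, result.pvalue))

print(f"{len(hits)} of {n_snps} variants passed the genome-wide threshold")
```

The strict threshold is why real effects, which are far weaker than the one planted here, only show up reliably in samples of hundreds of thousands of people.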


Broca and Wernicke are dead – it’s time to rewrite the neurobiology of language

By Christian Jarrett

Flick through any neuropsychology textbook and you’ll hear about the nineteenth-century pioneers Paul Broca and Carl Wernicke, who showed that language production and comprehension are subserved by two distinct brain regions, which came to be known as Broca’s and Wernicke’s areas, respectively. You’ll learn too about another neurology pioneer, Norman Geschwind, who described how these two regions are joined by a key connective tract – the arcuate fasciculus.

This is the “Classic Model” of the neurological basis of language function – a revolution in our understanding at the time, and hugely influential to this day. But according to a compelling new paper in Brain and Language, the Classic Model is obsolete and no longer fit for purpose. What’s more, its legacy and the continued use of its terminology are hampering progress in the field, both in research and in medical practice.

New clues about the way memory works in infancy

By Alex Fradera

Can we form memories when we are very young? Humans and non-humans alike show an “infantile amnesic period” – we have no memory of anything that happens during this time (usually up to age three or four in humans) – which might suggest we can’t form very early memories. But of course it might be that we can form memories in these early years, and that they are simply forgotten later. The idea that at least something is retained from infancy is consistent with the fact that disorders present in adult life can be associated with very early life events.

Now Nature Neuroscience has published a paper confirming that in rats some kinds of memories are created during the amnesic period, but that these operate differently and are produced by different brain chemistry from adult memories. What’s more, such events may have a role in kickstarting the maturation of the memory system.

It’s now possible, in theory, to predict life success from a genetic test at birth

By guest blogger Stuart Ritchie

For decades, we’ve known from twin studies that psychological traits like intelligence and personality are influenced by genes. That’s why identical twins (who share all their genes) are not just more physically similar to each other than non-identical twins (who share half their genes), but also more similar in terms of their psychological traits. But what twin studies can’t tell us is which particular genes are involved. Frustratingly, this has always left an ‘in’ for the incorrigible critics of twin studies: they’ve been able to say “you’re telling me these traits are genetic, but you can’t tell me any of the specific genes!” But not any more.
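The “genetic test” in question is a polygenic score: add up, across many variants, how many trait-associated alleles a person carries, weighting each by its effect size from an earlier genome-wide study. A minimal sketch with invented weights (not the scores or data the study used), just to show the arithmetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n_people, n_snps = 100, 500

# Allele counts (0/1/2) for each person at each scored variant
genotypes = rng.binomial(2, 0.4, size=(n_people, n_snps))

# Per-allele effect sizes taken from an earlier GWAS (here: made-up numbers)
gwas_weights = rng.normal(0.0, 0.01, size=n_snps)

# A polygenic score is the weighted sum of allele counts across variants...
scores = genotypes @ gwas_weights

# ...usually standardised so people can be ranked within the sample
scores = (scores - scores.mean()) / scores.std()
print(scores[:5])
```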

Smartphone study reveals the world’s sleeping habits

Middle-aged men get the least sleep, the research found

Researchers in the USA have used a smartphone app to see how people’s sleep habits vary around the world. More specifically, they’ve investigated how much the timing of sunrise and sunset affects people’s sleep times, and whether social and cultural factors are more important. “Quantifying these social effects is the next frontier in sleep research,” they write in their paper in Science Advances.

The study involved the ENTRAIN smartphone app, which helps people recover from jet lag by recommending ideal levels of light exposure based on a user’s typical sleep routine. Users have the option to make their information available for research. Olivia Walch and her colleagues at the University of Michigan began collecting data from the app in 2014, and the new analysis is based on information sent in by 8,070 users around the world during the first year.

Overall the data showed that a later sunrise goes hand in hand with later waking-up times, and that a later sunset is associated with people going to bed later, just as predicted based on how light affects the suprachiasmatic nucleus – the bundle of neurons behind the eyes that controls our sleep cycle, also known as the circadian rhythm. But crucially, the link between sunset and bedtime was weaker than biological explanations would predict.

Put differently, the time we get up is strongly influenced by the timing of sunrise, but the time we go to bed is not as strongly influenced by sunset, suggesting that social and cultural factors are also involved. Consistent with this account, most of the cross-cultural differences in sleep – for example, the Dutch reported the most sleep and Singaporeans the least – were explained by later bedtimes in the countries getting less sleep.
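One way to picture that asymmetry is to compare regression slopes: across locations, regress average wake time on local sunrise time, and average bedtime on local sunset time, then see which slope is closer to one. A toy sketch with invented country-level numbers (not the ENTRAIN data), chosen to mimic the pattern described above:

```python
import numpy as np
from scipy import stats

# Invented country-level averages in decimal hours (not the ENTRAIN data)
sunrise = np.array([5.9, 6.3, 6.8, 7.1, 7.5])
wake    = np.array([6.7, 7.1, 7.6, 7.9, 8.3])       # tracks sunrise closely
sunset  = np.array([18.0, 18.6, 19.3, 19.9, 20.6])
bed     = np.array([23.0, 23.1, 23.3, 23.3, 23.5])  # drifts only slightly with sunset

wake_fit = stats.linregress(sunrise, wake)
bed_fit = stats.linregress(sunset, bed)

# A slope near 1 means sleep timing shifts roughly hour-for-hour with the sun;
# a much flatter slope suggests social factors are overriding the light signal.
print(f"wake time vs sunrise slope: {wake_fit.slope:.2f}")
print(f"bedtime vs sunset slope:    {bed_fit.slope:.2f}")
```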

The difference between the countries with the most and least sleep wasn’t huge: just under 7.5 hours for Singapore and just over 8.1 hours for the Netherlands. But the researchers emphasised that even a 30-minute difference is meaningful, especially when you consider that sleep debt can have a cumulative effect over time.

Users of the app from the UK averaged about 8 hours’ sleep (a healthy amount), with an average wake time just after 7am and an average bedtime just before 11.15pm.

The researchers were also able to use the smartphone data to compare sleep habits by age, gender, and time spent exposed to natural light. Age was the most important factor, with older people tending to go to sleep earlier. There was also much less variability in the sleep times of older users, which could be because of biological mechanisms that narrow the window of opportunity during which it’s easy for older people to fall asleep.

This age-related finding could have everyday relevance – “being careful about how much light affects your circadian clock could be more and more important to sleep as you get older,” the researchers said. If your body’s only willing to sleep between fairly limited hours, you’re best off listening to it and switching off that TV.

Meanwhile, women were found to get more sleep than men – 30 minutes more, on average – thanks both to going to bed earlier and waking up later. The gender difference was greatest in mid-life, meaning that middle-aged men are the demographic group getting the least sleep, on average.

App users who spent more time exposed to outdoor, natural light tended to report going to sleep earlier and sleeping more, which is what you’d expect given the effect of daylight on the brain’s circadian clock.

The researchers concluded that their results “point to the suppression of circadian signaling at bedtime as an important target for clinical sleep intervention; and suggest that age-related differences in the window during which sleep can occur are evidenced on a global scale”. Aside from these specific insights into sleep, the group said their findings show the power of modern smartphone technologies as a research tool. “This is a cool triumph of citizen science,” said co-author Daniel Forger in a press release.

A global quantification of “normal” sleep schedules using smartphone data

_________________________________
   
Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Literally "knowing by heart": Signals from heart to brain prompt feelings of familiarity

The idea that the body affects the mind is not new. The eminent American psychologist William James famously proposed that it is actually the physical sensation of fear that causes us to feel afraid. In more recent years, researchers have extended this principle, exploring the possibility that physical sensations play an important role in moral decisions and other processes usually seen as more purely cognitive or cerebral, such as memory.

It’s already known that physical markers of arousal such as dilated pupils correlate with feelings of familiarity, but this could be because of the mental effort of remembering, rather than because physical arousal triggers the feeling of familiarity. Now a pioneering study in Journal of Experimental Psychology: General gets around this problem by testing people’s memory for faces presented at specific phases of the heart beat. The results provide compelling evidence that our feelings and judgments about familiarity are influenced by signals arising from the heart.

Chris Fiacconi at Western University and his colleagues began by asking 37 undergrads to look at 68 fearful faces, presented for 1.5 seconds each. Next, the participants looked at 136 more faces – half had appeared previously and half were new – and their task was to say whether they had seen them before. Crucially, the participants were wired up to a heart monitor, and during the memory test some of the faces were presented at the precise moment that the heart had just pumped a burst of blood into the arteries – the so-called systole phase – while the other faces were presented while the heart was relaxing, known as the diastole phase.

This is important because the systole phase increases blood pressure, which is detected by baroreceptors in the walls of the major arteries; in turn, the baroreceptors signal this change in pressure to various regions of the brain, including the brain stem but also higher areas involved in cognition. The striking finding from this first study was that participants were significantly more likely to say that a face was familiar if it was presented during the systole phase. This was true for faces that were old and also for faces that were actually new. A follow-up study using neutral faces produced the same results.
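In practical terms, cardiac-gated presentation means monitoring the ECG, detecting each R-peak (the sharp spike marking the start of a beat), and launching the stimulus either soon after the peak, during systole, or later in the cycle, during diastole. Here is a bare-bones scheduling sketch; the delay values are assumptions for illustration, not the timings Fiacconi’s team actually used:

```python
import random

# Illustrative onset delays relative to each detected ECG R-peak, in milliseconds.
# These are assumptions for the sketch; the exact windows used in the study may differ.
SYSTOLE_DELAY_MS = 300    # shortly after ejection, when baroreceptor feedback is strong
DIASTOLE_DELAY_MS = 600   # later in the cycle, while the heart refills

def schedule_trials(r_peak_times_ms, faces):
    """Lock each face's onset to a cardiac phase relative to a detected R-peak."""
    rng = random.Random(42)
    trials = []
    for r_peak, face in zip(r_peak_times_ms, faces):
        phase = rng.choice(["systole", "diastole"])
        delay = SYSTOLE_DELAY_MS if phase == "systole" else DIASTOLE_DELAY_MS
        trials.append({"face": face, "phase": phase, "onset_ms": r_peak + delay})
    return trials

# Toy run: R-peaks roughly every 800 ms (about 75 beats per minute)
r_peaks = [i * 800 for i in range(6)]
faces = [f"face_{i:02d}" for i in range(6)]
for trial in schedule_trials(r_peaks, faces):
    print(trial)
```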

A final study made things a little more elaborate. During the memory test for the faces, whenever participants thought they’d seen a face before, they were asked to clarify whether they actually recollected seeing it, or if it just felt familiar but they did not actually remember seeing it. Intriguingly, presenting a face during the heart’s systole phase increased participants’ tendency to report a sense of familiarity, but did not affect their claims to actually recollect having seen it.

These are dramatic results because they suggest that when making memory judgments we don’t rely only on actual memory traces in the brain; we also interpret our physiological sensations. By presenting faces at a specific moment in the heartbeat cycle, the current research effectively hacks into this system to trick participants into thinking they’ve seen new faces before. As the researchers state, in such cases “participants may interpret the transient increase in arousal that results from baroreceptor mediated feedback as owing to the familiarity of the stimulus probe.”

Where next for this line of research? Fiacconi and his colleagues said that an important future goal is “to determine whether other epistemic feelings, such as feelings of knowing, tip-of-the-tongue-states, and deja-vu experiences are also shaped by this type of visceral feedback.”

_________________________________

Fiacconi, C., Peter, E., Owais, S., & Köhler, S. (2016). Knowing by heart: Visceral feedback shapes recognition memory judgments. Journal of Experimental Psychology: General, 145(5), 559-572. DOI: 10.1037/xge0000164

–further reading–
People make more moral decisions when they think their heart is racing
Neuroscience lessons in body awareness from the man with two hearts
Does your heart rate hold the secret to getting in “the zone”?

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


This one physiological measure has a surprisingly strong link with men’s and women’s propensity for violence

By guest blogger Richard Stephens

I have a professional interest in the naughty. In my recent book Black Sheep: The Hidden Benefits of Being Bad, I explored in a light-hearted fashion the psychology around the upsides of various antisocial behaviours – swearing, drinking, affairs and untidiness, to name a few. However, this post is about physical violence, a much more serious form of bad behaviour for which I see no upside at all.

Thankfully there is some fascinating psychology into the factors that may lead people to violence, and that may yet help society to curb such negative behaviour. While it’s true that many of the factors associated with violence are situational – things like poverty, unemployment and educational attainment – there are also personal characteristics that are strongly associated with the chances that a person will behave violently. One of these is a person’s heart rate.

A recent study published in The International Journal of Epidemiology is just the latest to find a link between lower resting heart rate and more violent behaviour. Joseph Murray at the University of Cambridge and his colleagues measured resting heart rate in over 3,000 male and female children growing up in Pelotas, a relatively poor city in a relatively rich southern state of Brazil.

This was a longitudinal study in the style of the British “Seven Up!” documentary series, with children born in 1993 periodically called back for interviews and testing as they grew up. The researchers were specifically interested in their participants’ resting heart rate, which is the heart’s beats per minute after 10 minutes of sitting quietly. The researchers measured this three times – when the children were aged 11, 15 and 18.

A novel aspect of this study compared with earlier research (including UK studies from the 1950s and 1970s and more recent US and Swedish studies) is the sheer frequency of extreme violence in Pelotas: in 2011 the city had a murder rate of 18.9 per 100,000 population, almost 20 times higher than in England and Wales or Sweden. The new study also included women, whereas the earlier research focused only on men.

The researchers identified criminal behaviour through a combination of asking the young people at the age of 18 if they had committed any crimes during the past year, and by checking with legal agencies to see if they had a criminal record. Crimes were flagged as violent if they involved assault, robbery, weapons, murder, kidnapping, non-consensual sex, serious personal threats and other rare violent acts.

For males there were clear links between resting heart rate at ages 11, 15 and 18 and participation in violent crime. Males with a lower resting heart rate, averaging around 59-65 beats per minute, were between one-and-a-half and two times as likely to have committed violent crimes as males with a higher resting heart rate averaging around 90-92 beats per minute. Women with a lower resting heart rate were twice as likely to have committed violent crimes as women with a higher resting heart rate.
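For readers who like to see the arithmetic, a phrase like “twice as likely” boils down to a ratio of two proportions: the rate of violent offending in the low heart-rate group divided by the rate in the high heart-rate group. A toy calculation with invented counts (not the study’s data):

```python
# Invented counts, purely to illustrate how a statement like "twice as likely"
# is computed; these are not Murray and colleagues' data.
low_hr_violent, low_hr_total = 90, 1000      # lowest resting heart rate group
high_hr_violent, high_hr_total = 45, 1000    # highest resting heart rate group

risk_low = low_hr_violent / low_hr_total
risk_high = high_hr_violent / high_hr_total
risk_ratio = risk_low / risk_high

print(f"violence rate, low-heart-rate group:  {risk_low:.1%}")
print(f"violence rate, high-heart-rate group: {risk_high:.1%}")
print(f"relative risk: {risk_ratio:.1f}x")    # 2.0x corresponds to "twice as likely"
```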

I should add that the researchers did some additional work checking whether several situational contexts known to be associated with violent behaviour – things like unplanned pregnancy, the mother’s years in education and the family income – had any bearing on the results. But the findings stood even after taking these situational factors into account.

Why might resting heart rate be linked with violence? One theory is that a low resting heart rate reflects an unpleasantly low level of arousal, which drives individuals to seek stimulation, and this may manifest as antisocial behaviour. A similar explanation was put forward by Hans Eysenck in the 1960s to explain the extravert personality trait.

Another theory is that low resting heart rate is a sign of fearlessness. Children lacking fear may be more likely to commit antisocial acts because they are unconcerned about the possible adverse consequences such as admonishment by a parent or teacher. The current study did not have any means of testing these competing theories but earlier US research found no effect of fearlessness when looking at resting heart rate and aggressive antisocial behaviour. On balance then, fearlessness seems less likely to be the underlying cause.

As the authors of the new research point out, it is surprising that a personal, physical characteristic like resting heart rate can have such a clear cut link with violent behaviour for both men and women, above and beyond societal influences like poverty, inequality, gangs, drug trafficking and corrupt justice systems.

I asked study author Joseph Murray what the direct impacts of these findings might be – could we use what we know about resting heart rate to prevent violent outbreaks before they happen? Professor Murray said “While these were fascinating findings, I do not think that there are any direct implications for practice”, adding, “the level of current understanding about the mechanisms involved does not permit more than speculation.”

Still, this study provides a clear illustration that if we want to understand societal problems like crime and antisocial behaviour, we should look closely at the psychological and biological factors that are involved, as well as the social and societal contexts in which these behaviours are played out.

_________________________________

Murray, J., Hallal, P., Mielke, G., Raine, A., Wehrmeister, F., Anselmi, L., & Barros, F. (2016). Low resting heart rate is associated with violence in late adolescence: a prospective birth cohort study in Brazil. International Journal of Epidemiology. DOI: 10.1093/ije/dyv340

Post written by Richard Stephens for the BPS Research Digest. You can read more of Richard’s work in his critically acclaimed popular science book, Black Sheep: The Hidden Benefits of Being Bad.


Twin study raises doubts about the relevance of "grit" to children’s school performance

Grit is in vogue. US psychologist Angela Duckworth’s TED talk on grit is one of the most popular ever recorded, and her forthcoming book on the subject, subtitled “the power of passion and perseverance”, is anticipated to be a bestseller. On both sides of the pond, our governments have made the training of grit in schools a priority.

To psychologists, “grit” describes how much perseverance someone shows towards their long-term goals, and how much consistent passion they have for them. It’s seen as a “sub-trait” that’s very strongly related to, and largely subsumed by, conscientiousness, one of the well-established “Big Five” personality traits that make up who we are.

The reason for all the interest in grit, simply, is that there’s some evidence that people who have more grit do better in life. Moreover, it’s thought that grit is something you can develop, and probably more easily than you can increase your intelligence or other attributes.

But to a team of psychologists based in London and led by behavioural genetics expert Robert Plomin, the hype around grit is getting a little out of hand. There just isn’t that much convincing evidence yet that it tells you much about a person beyond the Big Five personality traits, nor that it can be increased through training or education.

Supporting their view, the researchers have published an analysis in the Journal of Personality and Social Psychology of the personalities, including grit, and exam performance at age 16 of thousands of pairs of twins. Some of the twins were identical, meaning they share the same genes, while others were non-identical, meaning they share roughly half their genes, just as non-twin siblings do. By comparing similarities in personality and exam performance between these two types of twin, the researchers were able to disentangle the relative influence of genes and the environment on these measures.

The main finding is that the participants’ overall personality scores were related to about 6 per cent of the variation seen in their exam performance. Grit specifically was related to just 0.5 per cent of the differences seen in exam performance. Given the small size of this relationship, the researchers said “we believe that these results should warrant concern with the educational policy directives in the United States and the United Kingdom.”

Also relevant to the hype around grit, the researchers found that how much grit the participants had was to a large extent inherited: about a third of the difference in grit scores was explained by genetic influences. None of the difference in grit was explained by environmental factors that twin pairs shared, such as the way they were raised by their parents and the type of schooling they had. (The remaining variance was either influenced by so-called “non-shared environmental factors” – those experiences in life that are unique to a person and not shared even by the twin they live with – or left unexplained.) This is a disappointing result for grit enthusiasts because it suggests that the experiences that shape how much grit someone has are not found in the school or the home (at least not for the current sample). Bear in mind, though, that this doesn’t rule out the possibility that an effective new home- or school-based intervention could be developed.
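The logic behind these estimates is captured by Falconer’s classic formulas: compare how similar identical twin pairs are with how similar non-identical pairs are, and split the trait’s variance into genetic (A), shared-environment (C) and non-shared/residual (E) portions. A toy version in Python, using illustrative correlations chosen to mirror the pattern described above rather than the paper’s actual numbers:

```python
# Illustrative twin-pair correlations for a grit-like trait, chosen to mirror
# the pattern described above; they are not the study's actual figures.
r_mz = 0.35   # identical (monozygotic) twins share essentially all their genes
r_dz = 0.17   # non-identical (dizygotic) twins share about half, on average

# Falconer's formulas (assuming additive genetic effects and equal environments)
a2 = 2 * (r_mz - r_dz)   # A: genetic influence (heritability)
c2 = 2 * r_dz - r_mz     # C: shared environment (e.g. home and schooling experienced together)
e2 = 1 - r_mz            # E: non-shared environment plus measurement error

print(f"A (genetic):            {a2:.2f}")   # about a third
print(f"C (shared environment): {c2:.2f}")   # essentially zero
print(f"E (non-shared/error):   {e2:.2f}")   # the remainder
```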

The researchers concluded that once you know a child’s main personality scores, knowing their amount of grit doesn’t seem to tell you much more about how well they’ll do at school. This study doesn’t rule out the idea that increasing children’s grit, if possible, could be beneficial, but the researchers warned that “more research is warranted into intervention and training programs before concluding that such training increases educational achievement and life outcomes.”

_________________________________ ResearchBlogging.org

Rimfeld, K., Kovas, Y., Dale, P., & Plomin, R. (2016). True Grit and Genetics: Predicting Academic Achievement From Personality. Journal of Personality and Social Psychology. DOI: 10.1037/pspp0000089

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


New review prompts a re-think on what low sugar levels do to our thinking

Glucose. Fuel for our cells, vital for life. But how fundamental is it to how we think?

According to dual-systems theory (best known from Nobel laureate Daniel Kahneman’s work), low blood glucose favours quick-and-dirty System 1 thinking over deliberative, effortful System 2 thinking. Similarly, Roy Baumeister’s ego depletion theory sees glucose as a resource that gets used up whenever we resist a temptation.

But the authors of a new meta-analysis published in Psychological Bulletin find these claims hard to swallow. Their review suggests that glucose levels may change our decisions about food, but little else.

Jacob Orquin at Aarhus University and Robert Kurzban at the University of Pennsylvania searched the decision-making literature, finding 36 articles that directly investigated glucose by measuring blood concentration, providing participants with sugar solution, or via interventions such as wafting food smells, which triggers some amount of glucose production.

The authors pored over the articles and tabulated every effect, its direction as well as its size. They found the effects were highly variable, often operating in different directions from study to study. But when the data were organised according to a key factor, a consistent pattern began to emerge. That factor? Food.
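The mechanics of that kind of moderator analysis are worth spelling out: pool the effect sizes within each subgroup (food-related versus not), weighting more precise studies more heavily, and compare the two pooled estimates. A stripped-down, fixed-effect sketch with invented numbers – the real meta-analysis is more sophisticated, and these values are not Orquin and Kurzban’s:

```python
import numpy as np

# Invented study-level effects of low glucose (positive = stronger response),
# with their standard errors; these are not values from the meta-analysis.
food_effects    = np.array([0.42, 0.30, 0.55, 0.28])
food_se         = np.array([0.15, 0.12, 0.20, 0.10])
nonfood_effects = np.array([-0.10, 0.05, -0.20, 0.02])
nonfood_se      = np.array([0.14, 0.11, 0.18, 0.09])

def pooled_effect(effects, se):
    """Inverse-variance weighted (fixed-effect) pooled estimate."""
    weights = 1.0 / se**2
    return np.sum(weights * effects) / np.sum(weights)

print(f"pooled effect, food-related outcomes: {pooled_effect(food_effects, food_se):+.2f}")
print(f"pooled effect, non-food outcomes:     {pooled_effect(nonfood_effects, nonfood_se):+.2f}")
```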

In payment tasks – involving hypothetical purchases (“how much would you pay for …”) and actual purchases while shopping – low blood glucose did increase people’s willingness to overspend … on food. But it actually made them less willing to spend money on non-food products. When it came to persistence on tasks (such as time spent trying to complete a puzzle), low glucose decreased willingness to work for non-food rewards, but led to more tenacious work towards food-related goals. And when people were given the choice between receiving a small amount now or a large one later, low glucose led to a large bias towards immediate gratification when food was the payoff, compared to a much smaller bias for non-food.

This pattern of results doesn’t fit the notion of glucose as willpower fuel. It suggests instead that low glucose is a signal that, to ensure future wellbeing, food should be prioritised – by paying more for it, working harder for it, and grabbing a little now rather than taking the promise of more in the future. This signaling account also explains the recent discovery that you don’t need to consume glucose to produce some of its cognitive effects: simply tasting it (by swishing it around the mouth) is enough. No fuel has been received, but presumably the signaling system is temporarily fooled by the taste receptors.

Kahneman can sleep easy – the findings from this meta-analysis aren’t a blow to his dual process theory as a whole, merely to the specific claim that glucose has a role in switching between thinking fast and slow. The meta-analysis is a more substantial problem for the claims of ego depletion, which are intimately tied to the idea that willpower is a finite resource that depends on glucose.

Based on the prior glucose research and theory, some publications have recommended strategies like eating chocolate before tense marital discussions or stashing emergency Jelly Bellys in the office desk drawer. But according to this meta-analysis, these strategies will yield little benefit; the main implication of being low on glucose is a greater preoccupation with finding something to eat. There’s a lot of strong psychological science out there to help with building everyday habits and making better decisions, so if you’re looking for a dose of something, we recommend you check those out instead.

_________________________________

Orquin, J., & Kurzban, R. (2015). A Meta-Analysis of Blood Glucose Effects on Human Decision Making. Psychological Bulletin. DOI: 10.1037/bul0000035

–further reading–
Labs worldwide report converging evidence that undermines the low-sugar theory of depleted willpower
New research challenges the idea that willpower is a “limited resource”

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
