Wednesday, 6 May 2015

Story envy: When we borrow other people's personal anecdotes

In the study, men admitted "borrowing" other people's stories more than women
Admit it: have you ever told a cracking story to your friends, but failed to include the crucial (and perhaps boring) caveat that the amusing events actually happened to someone else? A new survey of hundreds of US undergrads finds that borrowing personal memories in this way is commonplace.

Alan Brown and his colleagues found that nearly half of the 447 undergrads they sampled admitted to having told someone else's personal anecdote in its entirety as their own, and most of them said they'd done it more than once. This figure rose to nearly 60 per cent if you include the borrowing of story details rather than a complete tale. What's more, over half the sample also said they'd had the experience of someone else stealing their stories.

The most common reason the students gave for borrowing another person's story was that they wanted it to be a part of their identity and their past. Other reasons, from most to least commonly cited, included: to give the story more impact (explaining that it was someone else's story was seen as a distracting detail), for convenience, and for status enhancement.

Why is story borrowing so common? One possibility is that after doing it once, we forget the original source. With re-tellings, the original story becomes tailored to our own identity and we begin to believe it's our own. Backing this up, around 30 per cent of the sample admitted to telling a story and only later realising they had borrowed it from another person. Also, more than half the sample said they'd had arguments with other people about ownership of a story, suggesting confusion of story origins is a familiar occurrence.

A further detail: men more often admitted to borrowing other people's stories, or details from them, and said they were involved in more disputes over story ownership. Men also cited status enhancement as a motive more often than women.

The researchers said their research had wider implications for the study of memory and that "by understanding the ways in which we tinker with our autobiographical records, we may gain insights into how we inadvertently alter our life stories."


Brown, A., Croft Caderao, K., Fields, L., & Marsh, E. (2015). Borrowing personal memories. Applied Cognitive Psychology. DOI: 10.1002/acp.3130

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Tuesday, 5 May 2015

Mindful eating makes smaller portions more satisfying

Have you ever been to an exclusive restaurant that serves tiny portions and found that, in spite of the paltry servings, you felt satisfied afterwards and the food seemed unusually tasty? If so, you might have engaged in what psychologists call "savouring" behaviours. Charles Areni and Iain Black have studied savouring under laboratory conditions, and they've found that when we're given smaller portions than normal, we eat differently – more slowly, more mindfully, and we feel more satiated as a result.

The researchers recruited dozens of undergrads for a supposed chocolate-tasting study. Half the participants were shown a tray of six delicious chocolates, to create the expectation that they would be tasting all six. In fact, after they'd tasted the first two, they were told that was the end of the experiment. The other half of the students were shown the tray of six chocolates, but told in advance that they would be tasting just two of them.

The students who knew they were only going to get to taste two chocolates ate more slowly than the students who thought they were going to taste all six; they also paid more attention to the flavour and texture of the chocolates, and they felt more satiated afterwards. They also enjoyed the chocolates just as much.

It's not just that the students who thought they were going to taste all six were in more of a rush. A follow-up study put procedures in place to control for this possibility: all students, whether expecting two or six chocolates, were told the experimenter would be coming in and out of the lab for the next few minutes regardless of how far they'd got with their eating, and there were also some easy questionnaires to fill out. In other words, the time commitment of the study (30 minutes) would be the same regardless of how quickly they ate.

A final experiment was similar, but this time the participants were filmed so the researchers could count their number of chews. There was also an extra condition: some students were only shown two chocolates in the first place (rather than seeing six and being told they'd only get to taste two). Students who knew they were only going to be tasting two chocolates ate more slowly and took more chews (11.5 more on average), compared with students who thought they were going to get to taste six chocolates.

"Consumers compensate for small portions by attending more to the sensory properties of the food, altering their eating behaviour, and slowing their rate of eating," the researchers said, "which has the effect of increasing satiation, hence lessening their desire for more afterwards."

Dietary restraint is usually seen as a battle between our current and future selves – the former wants to eat until full, the latter wants to avoid becoming overweight. However, Areni and Black said food savouring is a case of our current and former selves working in cooperation – by eating slowly, paying attention to tastes and textures, our current selves get just as much satisfaction from smaller portions, and our future selves are left sated and with a healthier figure. They added that it would be interesting for future research to see whether receiving larger portions than expected (just think of those "all you can eat buffets") has the reverse effect, encouraging faster, mindless eating.


Areni, C., & Black, I. (2015). Consumers' responses to small portions: Signaling increases savoring and satiation. Psychology & Marketing, 32(5), 532-543. DOI: 10.1002/mar.20798

--further reading--
The 'power of one' - why larger portions cause us to eat more
A psychological problem with snacking in front of the telly

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Saturday, 2 May 2015

Link feast

Our pick of this week's 10 best psychology and neuroscience links:

The Ultimate Psychology Reading List
For more than seven years, The Psychologist has been asking eminent psychologists to recommend books and journal articles.

The Vice Guide To Mental Health
The Canada-based magazine launches a new vertical devoted to mental health.

The People Who Are Lost in Time
At BBC Future, I reported on some of the strange causes and consequences of amnesia.

The Psychologist Annual Conference Special Edition
An exclusive digital edition produced to mark the British Psychological Society's flagship event, held in Liverpool 5-7 May. It comprises archive material from speakers at this year's conference.

Cutting The Body to Cure the Mind
The Lancet looks back on the rise and fall of treatments like leucotomy and ovariotomy.

Science Shows Humblebragging Doesn’t Even Work
Better to just brag than "humblebrag", says new study.

Is Happiness Worth Measuring?
The Guardian hosts a debate on whether the government should be worried about the state of the nation’s wellbeing or should concentrate instead on what makes people feel unhappy.

Madness and Meaning
Depictions of mental illness through history, from the Paris Review

Living With Being Dead
Erika Hayasaki at Medium explores the Cotard delusion.

First Results From Psychology’s Largest Reproducibility Test
Nature reports on an attempt to reproduce the findings from 100 psychology studies.

Post compiled by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Friday, 1 May 2015

Children use time words like "seconds" and "hours" long before they know what they mean

For adults, let alone children, time is a tricky concept to comprehend. In our culture, we carve it up into somewhat arbitrary chunks and attribute words to those durations: 60 seconds in a minute, and 60 of those in an hour and so on. We also have a sense of what these durations feel like. Children start using these time-related words at around the age of two or three years, even though they won't master clocks until eight or nine. This raises the question – what do young children really understand about duration words?

Katharine Tillman and David Barner began by asking dozens of three- to six-year-olds to compare several pairs of durations (e.g. Farmer Brown jumped for a minute. Captain Blue jumped for an hour. Who jumped more?). As well as minutes and hours, other durations used were seconds, days, weeks, months and years. This test showed that by age four, the children were tending to get more of these questions right than would be expected if they were just guessing. With increasing age, the children got better at the task. In other words, from age four and up, children have a sense of the rank order of different duration terms.

What young children don't have, according to the findings from further experiments, is a sense of the actual lengths of time that these terms refer to. When the comparison test was repeated, but with different amounts of each duration, the children were flummoxed. Take, for example, the question "Farmer Brown jumped for three minutes. Captain Blue jumped for two hours. Who jumped more?" As adults, we aren't thrown by the minutes outnumbering the hours by three to two, because we know that an hour feels much longer, and is by definition 60 times longer. However, even five-year-olds, who know full well that an hour is longer than a minute, were thrown by these kinds of comparisons. This suggests they don't yet have a very good understanding of the formal definitions of duration words, nor of what the different durations feel like.
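The adult reasoning described above amounts to converting both phrases to a common unit before comparing the bare numbers. Here is a minimal sketch of that arithmetic (the dictionary and helper function are illustrative, not anything from the study itself):

```python
# Length of each duration word, in seconds.
DURATION_SECONDS = {
    "second": 1,
    "minute": 60,
    "hour": 60 * 60,
    "day": 24 * 60 * 60,
    "week": 7 * 24 * 60 * 60,
}

def who_jumped_more(amount_a, unit_a, amount_b, unit_b):
    """Compare two durations the way an adult does: by total elapsed
    time, not by the bare numbers in front of the unit words."""
    total_a = amount_a * DURATION_SECONDS[unit_a]
    total_b = amount_b * DURATION_SECONDS[unit_b]
    return "A" if total_a > total_b else "B"

# Farmer Brown (A) jumped for three minutes; Captain Blue (B) for two hours.
# The bare numbers favour Farmer Brown (3 > 2), but the units settle it:
# 3 * 60 = 180 seconds versus 2 * 3600 = 7200 seconds.
print(who_jumped_more(3, "minute", 2, "hour"))  # prints "B"
```

The children in the study appear to grasp the ordering of the keys in such a table well before they grasp the values, which is exactly why comparisons across different amounts defeated them.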

In another experiment, five- to seven-year-old children were asked to place different duration words along a horizontal line after the far-left end had been described to them as the location for "something very short, like blinking" and the far-right end as "something very long, like the time from waking up in the morning to going to bed at night". Again, before age six or seven, the children really struggled with this: even when they got the order correct, they tended to space the words out inappropriately, compared with how an adult would do it. Six- and seven-year-olds who knew the formal definitions of the duration words tended to perform better.

These findings mirror what's been found for the way children use words for other concepts like numbers and colours. Before they map the words onto actual perceptual experiences, they understand that words in a given domain are related, and (in the case of numbers and time), they have a sense of the relative magnitude of the concepts. But it's only after using such words for some years, and learning their formal definitions, that they fully connect the experience of the concept (such as the length of an hour, or the physical magnitude of a number) with its corresponding word.

"Our results indicate that proficiency in estimating the absolute time encoded by duration words emerges relatively late," the researchers said, "and may even rely on formal instruction in [primary] school."


Tillman, K., & Barner, D. (2015). Learning the language of time: Children's acquisition of duration words. Cognitive Psychology, 78, 57-77. DOI: 10.1016/j.cogpsych.2015.03.001

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Thursday, 30 April 2015

Why the message – that we're all prone to stereotyping others – is so dangerous

Telling people they are biased in their treatment of others – that they are racist or ageist, for example – can make them defensive and result in backlash. For this reason, change-makers nowadays often spread a different message: that stereotyping others isn't a personal sin, but near-universal and something we must all aim to resist. However, a new paper from researchers Michelle Duguid and Melissa Thomas-Hunt argues that this "Everyone Stereotypes" message, far from reducing bias, may actually encourage it.

In initial experiments, participants were simply asked to rate a particular group, such as women, on a series of stereotypical characteristics, which for women were: warm, family-oriented and (less) career-focused. Beforehand, half of the participants were told that "the vast majority of people have stereotypical preconceptions." Compared to those given no messages, these participants produced more stereotypical ratings, whether about women, older people or the obese.

Another experiment used a richer measure of stereotyping – the number of clichés used by participants in their written account of an older person's typical day. This time, those participants warned before writing that "Everyone Stereotypes" were more biased in their writings than those given no message; in contrast, those told that stereotyping was very rare were the least clichéd of all. Another experiment even showed that hearing the "Everyone Stereotypes" message led men to negotiate more aggressively with women, resulting in poorer outcomes for the women.

The reason the “Everyone Stereotypes” message goes wrong can be found in a cornerstone of social psychology: we are more inclined to do something if others in our group are doing it. This means unspoken biases firm up when we believe them to be ubiquitous, and we may even react to counter-examples with greater hostility: as a man, a strong woman leader isn’t just a challenge to my assumptions, but questions the judgment of my entire gender.

Duguid and Thomas-Hunt also suspect their finding may generalise to perverse effects for other types of influencing, such as the use of statistics to emphasise injustice. For instance, after hearing that very few women inhabit CEO roles, business leaders might be galvanised to change – or, they might conclude that since their peers haven’t chosen to tackle the glass ceiling, perhaps there are good reasons to go slow themselves.

If this is true, what can be done? Portraying stereotyping as rare (or fiddling statistics) is simply misleading, while not discussing it at all is defeatist. A further experiment suggests a possible solution. In line with the other studies, men given the “Everyone Stereotypes” message were less likely to hire a hypothetical female job candidate who was assertive in arguing for higher compensation. But other men told that everyone tries to overcome their stereotypes were fairer than those who received no information at all. The participants were adjusting their behaviour to fit the group norms, but this time in a virtuous direction.

This approach essentially adopts the method found in many personal and organisational change philosophies, such as Appreciative Inquiry, of framing messages around a desired outcome rather than an unwanted situation. By uniting around what is good in us, we’re more likely to get welcome results.


Duguid, M., & Thomas-Hunt, M. (2015). Condoning stereotyping? How awareness of stereotyping prevalence impacts expression of stereotypes. Journal of Applied Psychology, 100(2), 343-359. DOI: 10.1037/a0037908

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Wednesday, 29 April 2015

People are overly optimistic about the benefits of optimism

"It is our attitude at the beginning of a difficult task which, more than anything else, will affect its successful outcome." The sentiment articulated here by psychology pioneer William James is currently in vogue, if its preponderance in self-help books, motivational posters, and memes is anything to go by. But are we pinning too much on positive thinking?

A research team led by Elizabeth Tenney asked participants to guess how much a given task would be affected by optimism, then compared this to how people actually fared when they were feeling more or less optimistic. So in one instance, "task completers" attempted a maths task, having been given false feedback that told them, based on their training performance, they were likely to do well or poorly, thus influencing their optimism. "Predictor" participants then guessed how the completers would perform, knowing that these people didn't differ in calibre, only in the artificial feedback they'd received. Predictor participants expected the optimistic completers to do significantly better than those feeling pessimistic, but in reality they didn't.

Another experiment used a "Where’s Waldo?" task where task completers could study each complex image for as long as they wanted as they sought to pick out the figure hidden within. We might expect optimism to deliver results through sheer tenacity, and indeed the optimistic task completers did persist for about 20 per cent longer on the task. But this translated into a scant 5 per cent (statistically non-significant) improvement, not the hefty 33 per cent improvement expected by the predictors. Once again, people were shown to expect optimism to produce results in situations where the reality was otherwise.

A final experiment demonstrated that even when attention isn't drawn artificially to people's optimism, we still overrate its importance. Here, nine participants were each asked to estimate how 99 task completers had fared on a task, guided by character profiles of the completers, which included, among a host of other information, their level of optimism. Each profile characteristic gave participants more or less insight into the completers' true performance: for instance, enjoyment of the test was a good, but not perfect, indicator that the person had performed well on it. Participants were quite accurate in how much weight they gave to these cues – except for optimism, which they treated as a much more powerful factor than it truly was. This result suggests it wasn't the way the earlier experiments were framed that led predictors to make too much of optimism; they were happy to do that all on their own.

This work doesn't suggest that optimism is ineffective as a broad strategy for approaching life, or for helping us fulfil our larger objectives. But it does suggest that we put more on the shoulders of optimism than it can bear. If you do badly at a test, rather than fretting that the cause was your negative mental attitude, it might be better to simply focus on your knowledge and approach.


Tenney, E., Logg, J., & Moore, D. (2015). (Too) optimistic about optimism: The belief that optimism improves performance. Journal of Personality and Social Psychology, 108(3), 377-399. DOI: 10.1037/pspa0000018

--further reading--
Optimism and pessimism are separate systems influenced by different genes

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Tuesday, 28 April 2015

Can feeling lonely make you hungry?

Loneliness is bad for you. Some experts have even likened it to a kind of disease. What's unclear is how being lonely leads to these adverse effects on our health. A new study looks at one possibility – that loneliness makes people feel hungrier than normal, thus increasing their food intake and putting them at risk of obesity with all its associated health problems.

Lisa Jaremka and her colleagues asked 42 women (average age 53) to fast for 12 hours before visiting the psych lab. On arrival in the morning, the women were asked to eat an entire 930-calorie meal consisting of eggs, turkey sausage, biscuits and gravy. Before they ate the meal, and several times during the seven hours afterwards, the women rated their hunger. Their feelings of extreme loneliness had been recorded five months earlier as part of a different study. Their ghrelin levels (ghrelin is a hormone that's associated with hunger and promotes eating) were recorded by blood test before the meal and two and seven hours later.

Among the women with a healthy weight (based on their BMI), and only among them, those who reported feeling lonelier exhibited higher ghrelin levels at the end of the day they visited the lab, and said they felt hungrier. This result is consistent with another recent study by the same researchers, which found that women who'd experienced more interpersonal stress had higher ghrelin levels and lower leptin (an appetite-suppressing hormone).

Why should loneliness be associated with feeling hungrier? Jaremka and her team speculate that it could be an evolutionary hangover, adaptive in the sense that hunger encourages eating, which encourages greater social bonding. "Eating was a highly social activity throughout human evolution, and today meals are often eaten with other people," they explain. Other research has shown that eating comfort food prompts thoughts of relationships. "Consequently," the researchers said, "people may feel hungrier when they feel socially disconnected because they have either implicitly or explicitly learned that eating helps them feel socially connected and/or provides them with an opportunity for social connection." It's unclear why the same process wasn't triggered in the overweight women.

The study has some methodological issues, among them: the lack of a current loneliness measure; no male participants; and half the small sample of women were cancer survivors recruited from an earlier study. This may limit the generalisability of the findings, but note the main results from this study were the same for all participants regardless of their medical history.


Jaremka, L., Fagundes, C., Peng, J., Belury, M., Andridge, R., Malarkey, W., & Kiecolt-Glaser, J. (2015). Loneliness predicts postprandial ghrelin and hunger in women. Hormones and Behavior, 70, 57-63. DOI: 10.1016/j.yhbeh.2015.01.011

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.