Fussy eating – also referred to as “selective eating” in scholarly research – is incredibly common among children, with upper estimates placing the prevalence at 50 per cent. Despite this, many parents understandably fret when their kids avoid a lot of foods, won’t try new things and/or will only eat certain meals. They worry about whether their child is getting enough vitamins and whether the fussiness is a precursor to more serious eating problems later on.
A new, small study in Eating Behaviors is the first to document how fussy eating develops in the same individuals over time into early adulthood, and it may provide a crumb (sorry) of comfort for anxious parents. It’s true that 60 per cent of fussy-eating children in the study were also fussy eaters at age 23, but fussy-eating young adults were no more likely to report signs of an eating disorder than their non-fussy peers.
Crisps, coke, and chocolate bars. What might be a special treat for some of us is now a multi-billion pound industry and a staple of many people’s diets. Advertising campaigns from snack food companies, often starring sports stars, send the message that we can offset any adverse effects of consuming their products simply by getting more physical exercise. But you can’t really “run off” a burger – recent studies show that a lack of exercise is not to blame for rising obesity rates; bad diets are the real driver.
Interventions to help reduce junk food consumption are especially important for children and adolescents – prevention is better than cure in this context because obesity is so difficult to treat. Unfortunately, while health education in the classroom has shown some success among young children, adolescents have been notoriously hard to reach.
But now a large-scale study published in PNAS has tried an innovative approach to change teenagers’ attitudes towards healthy eating, and the results are promising. The researchers, led by Christopher Bryan at the University of Chicago and David Yeager at the University of Texas at Austin, argued that previous interventions have probably been unsuccessful because of a major flaw: they focused on a future, healthier you and assumed that this would be enough motivation for adolescents. In contrast, the new intervention cleverly exploits teenagers’ instinct for rebelliousness and autonomy, and the value they place on social justice.
These days it’s hard to avoid the message that thin is best. From advertising billboards to the Oscar red carpet, we are inundated with images of successful ultra-thin women.
Past research has already shown that this ideal is filtering through to our children, even preschoolers. But before now, there has been little study of just how early pro-thin bias (and prejudice against fat people) appears, and how it develops with age.
Jennifer Harriger tested 102 girls from the South Western US, aged between three and five. She first asked the girls to consider 12 adjectives (six positive and six negative, including nice, smart, mean and stupid) and to allocate each one to whichever of three female figures they felt it suited best. The precise wording was “Point to the girl that you think is/has …”. Crucially, one of the female figures was very thin, one was very fat, and the other average, with no other differences between them. Three-year-olds, four-year-olds and five-year-olds all tended to allocate more negative adjectives to the fat figure and more positive adjectives to the thin figure.
Another test involved the children looking at nine figures (three fat, three average and three thin), choosing their first three preferences for playmates, and finally choosing their best friend from the selection. Children at all ages tended to choose a thin figure for their first choice, a thin or average figure for their second choice, with no bias in their third choice. Best friend choices tended to be thin.
Age differences were few, but there was some evidence that three-year-olds were showing more of a bias for thinness, as opposed to a bias against fat people, with fat prejudice increasing with age. For example, only the youngest girls allocated more negative adjectives to average and fat figures than to the thin figures, consistent with their believing “thin is good” rather than “fat is bad”.
The research needs to be replicated in other countries, with boys, and with even younger children. Harriger also noted that it would be interesting to look at the influence of children’s own weight and the beliefs of their parents, siblings and peers. For now, she said her findings illustrated “an increasing preference for thinness and intolerance for fatness in preschool age girls …” and that the promotion of size acceptance “must begin even earlier than we once believed.”
_________________________________ Harriger, J. (2014). Age differences in body size stereotyping in a sample of preschool girls. Eating Disorders, 23(2), 177-190. DOI: 10.1080/10640266.2014.964610
Health experts say we aren’t eating enough fruit. Perhaps psychology can help. Try this. Picture yourself eating a portion of fruit tomorrow – an apple, say, or a couple of plums. Take your time. Focus on the colours, the consistency, the flavour. Visualise where you are at the time, and what you are doing.
Do you think this simple imagery task will have increased the likelihood you will eat fruit tomorrow? A new study led by Catherine Adams attempted to find out. Over two hundred volunteers were split into three groups. One performed the fruit imagery task, another group did the same thing but for a biscuit bar of their choice (examples they were given included flapjacks, Kellogg’s Elevenses and Jaffa Cake bars), and a final group did not perform an imagery task.
Straight after, the participants answered questions about their food preferences and future consumption intentions, and they were offered a reward from a basket of fruits and biscuit bars. Two days later they were also asked by email whether they had eaten any fruit or a biscuit bar the day before (35 per cent of them answered this).
Once the researchers controlled for background factors (such as the possibility there were more fruit lovers in one condition or the other), they found that the fruit imagery task made no difference to participants’ intentions to eat fruit, no difference to their choice of fruit as a reward, nor their consumption of fruit the next day, as compared with the control group who didn’t perform the imagery. For the biscuit bar group, the imagery task increased their intentions to eat biscuit bars in the future, but didn’t actually alter their consumption (as compared against the no-imagery control group).
“These effects suggest different effects for different visualised behaviours,” the researchers said. “Further investigation is needed before recommending visualisation for increasing fruit consumption.”
As the researchers acknowledged, there are some issues with the study that mean caution is needed in interpreting the results. For instance, just one brief imagery session may well be inadequate. Also, other research suggests imagery works best when combined with other strategies, such as “if-then” implementation plans (e.g. “If I am hungry, then I will snack on some fruit”). The response rate to the follow-up email was also disappointing, and bear in mind that participants may have felt the food they chose immediately after the imagery was a form of reward, and therefore this behaviour may not reflect their usual eating choices. These issues show how difficult health behaviour research can be.
_________________________________ Adams, C., Rennie, L., Uskul, A.K., & Appleton, K.M. (2013). Visualising future behaviour: Effects for snacking on biscuit bars, but no effects for snacking on fruit. Journal of Health Psychology. PMID: 24217063
Imagine you are the driver and your chocolate cravings are unruly passengers
If someone gave you a bag of 14 chocolates to carry around for five days, would you be able to resist eating them and any other chocolate? That was the challenge faced by 135 undergrads in a new study that compared the effectiveness of two different “mindfulness” resistance techniques.
Kim Jenkins and Katy Tapper taught 45 of their participants “cognitive defusion”, the essence being that “you are not your thoughts”. The students were told to imagine that they were the driver of a “mindbus” and that any difficult thoughts about chocolate were awkward passengers. The students chose a specific method for dealing with these difficult thoughts/passengers and practised it for five minutes – either describing them, letting them know who is in charge, making them speak in a different accent, or singing their words.
Another group of students were taught an acceptance technique known as “urge surfing”. They were instructed to ride the wave of their chocolate cravings, rather than to sink them or give in to them. A final group of students acted as controls and were taught a relaxation technique.
As well as trying to resist the bag of chocolates, the students in all conditions were asked to avoid eating any other chocolate as far as possible, and to keep a diary of any chocolate they did eat over the five days.
The key finding is that the mindbus group ate fewer chocolates from their bag as compared with students in the control group. By contrast, the urge surfing group ate just as many of their chocolates as the controls. Diary records showed the differences between groups in their other chocolate consumption were not statistically significant, although there was a trend for the mindbus group to eat less (13g vs. 52g in the urge surfing group and 44g in the control condition). Another way of describing the results is to say that 27 per cent of the mindbus group ate some chocolate over the five-day period, compared with 45 per cent of the urge surfers and 45 per cent of controls.
A habits questionnaire suggested the mindbus technique was more effective because it reduced the students’ mindless, automatic consumption of chocolate more than the other interventions. Jenkins and Tapper said their results show the mindbus “cognitive defusion” technique is a “promising brief intervention strategy” for boosting self-control over an extended time period.
The serious chocaholics among you may not be so convinced. Although the students were recruited on the basis that they wanted to reduce their chocolate consumption, they appeared to show saintly levels of abstinence. On average, even the control group participants ate just 0.69 chocolates from their bag over the five-day period (compared with an average of 0.02 chocolates in the mindbus condition and 0.27 in the urge surfing condition). The controls’ other chocolate consumption amounted to the equivalent of little more than four individual chocolates over five days. You’ve got to wonder – how serious were these participants about chocolate, and just how tasty were the chocolates in that bag*?
Another thing – the researchers included a measure of “behavioural rebound”. After the students returned to the lab on day five, they were presented with a bowl of chocolates and invited to eat as many as they liked. The groups didn’t differ in the amount of chocolates they consumed, which the researchers interpreted as a good sign – after all, the mindbus group hadn’t compensated for their restricted intake during the week. But hang on, they also showed no evidence of greater resistance to the chocolate. Sounds to me like the passengers had taken over the bus.
_________________________________ Jenkins, K., & Tapper, K. (2013). Resisting chocolate temptation using a brief mindfulness strategy. British Journal of Health Psychology. DOI: 10.1111/bjhp.12050
*Co-author Katy Tapper got in touch on Twitter to tell us: “The chocolates were very tempting Cadbury’s Celebrations!”
People who self-identify as dieters are an unhappy bunch on the whole. They usually score high on measures of depression and anxiety and low on self-esteem. A new study provides a clue as to why. Jessie de Witt Huberts and her colleagues tested three groups of female students and found the “restrained eaters” (they reported dieting more often and being conscious of their food intake) ate just as much as other students. They also experienced a lot more guilt, especially in relation to eating. In essence, these are people who seem to constantly set themselves up for failure, while also robbing themselves of the pleasures of eating. “Despite their good intentions,” the researchers said, “restraint eaters seem to gain nothing and lose twice.”
The research took place across three studies, all following a similar procedure. Dozens of female undergrads were invited to a lab to take part in what they thought was a food-tasting session for a supermarket chain. They were left alone for ten minutes to sample high and low calorie food, like chips and apple slices. Then they were asked questions about their emotions, including their guilt, and about their attitudes towards food, including how much they diet and how often they worry about what they’re eating.
Checking the food afterwards, the researchers found that the restrained eaters – those who dieted often and who fretted about their consumption – had eaten just as much as the other participants, including just as much high-calorie food. But crucially, they felt more guilty afterwards, especially in relation to their recent indulgence.
This study doesn’t prove that being a restrained eater causes increased guilt. It’s possible there’s one or more other factors that cause a person to watch what they eat and to experience more guilt. One could also argue that the set-up was a little unfair on the restrained eaters – they’d been asked to taste the food, after all; perhaps they do exert more control over their intake in everyday life.
Nonetheless, the results are certainly intriguing, and help explain why restrained eaters tend to experience psychological problems and why they tend to develop problematic eating habits. In effect, it appears these people are locked into a vicious circle. Guilt after over-eating likely encourages them to renew their promises to eat less. And when they fail again to reduce their eating, yet more guilt ensues, this time more intense than before. Given that “45 per cent of young girls currently report dieting”, the researchers said it’s imperative that we learn more about why so-called restrained eaters experience such negative outcomes.
_________________________________ de Witt Huberts, J., Evers, C., & de Ridder, D. (2012). Double trouble: Restrained eaters do not eat less and feel worse. Psychology and Health, 1-15. DOI: 10.1080/08870446.2012.751106
A breakthrough 2007 study by Roy Baumeister and colleagues showed that ego-depleted participants had low blood glucose levels, but that those who subsequently consumed a glucose drink were able to sustain their self-control on a second task. In the intervening years the finding has been replicated and the glucose-willpower link has come to be stated as fact.
“No glucose, no willpower,” wrote Baumeister and his journalist co-author John Tierney in their best-selling popular psychology book Willpower: Rediscovering Our Greatest Strength (Allen Lane, 2012). The claim was also endorsed in a guide to willpower published by the American Psychological Association earlier this year. “Maintaining steady blood-glucose levels, such as by eating regular healthy meals and snacks, may help prevent the effects of willpower depletion,” the report claims.
But now two studies have come along at once (following another published earlier in the year) that together cast doubt on the idea that depleted willpower is caused by a lack of glucose availability in the brain. In the first, Matthew Sanders and his colleagues in the US report what they call the “Gargle effect”. They had dozens of students look through a stats book and cross out just the Es, a tiresome task designed to tax their self-control levels. Next, they completed the famous Stroop task – naming the ink colour of words while ignoring their meaning. Crucially, half the participants completed the Stroop challenge while gargling sugary lemonade, the others while gargling lemonade sweetened artificially with Splenda. The participants who gargled, but did not swallow, the sugary (i.e. glucose-containing) lemonade performed much better on the Stroop task.
The participants in the glucose condition didn’t consume the glucose and even if they had, there was no time for it to be metabolised. So this effect can’t be about restoring low glucose levels. Rather, Sanders’ team think glucose binds to receptors in the mouth, which has the effect of activating brain regions involved in reward and self-control – the anterior cingulate cortex and striatum.
The other study that’s just come out was conducted by Martin Hagger and Nikos Chatzisarantis, based in Australia and the UK. Their approach was similar to that of Sanders’ team, except that participants gargled and spat out a glucose or artificially sweetened solution before performing a second taxing task, rather than during it. Also, this research involved a series of five experiments featuring many different tests of people’s self-control, including resisting delicious cookies, reading boring text in an expressive style, attempting unsolvable puzzles, and squeezing hand-grips. But the take-home finding was the same – participants who gargled, but did not swallow, a glucose drink performed better on a subsequent test of their willpower; participants who gargled an artificially sweetened drink did not. So again, willpower was restored without topping up glucose levels. Moreover, the benefit of gargling glucose was displayed only by participants who’d had their self-control taxed in an initial task. It made no difference to participants who were already in an untaxed state.
Hagger and Chatzisarantis agree with the interpretation of the Sanders group, but they make a distinction. Glucose binding to receptors in the mouth could either stimulate activity in brain regions, such as the anterior cingulate, that tend to show fatigue after a taxing task, or it could trigger reward-related activity that prompts participants to interpret a task as more rewarding, thus boosting their motivation. The explanations are complementary and need not be mutually exclusive.
The key point is that the new results suggest depleted willpower is about motivation and the allocation of glucose resources, not about a lack of glucose. These findings don’t prove that consuming glucose has no benefit for restoring willpower, but they suggest strongly that it’s not the principal mechanism. It’s notable that the new findings complement previous research in the sports science literature showing that gargling (without ingesting) glucose can boost cycling performance.
“While our findings are consistent with the predictions of the resource-depletion account, they also contribute to an increasing literature that glucose may not be a candidate physiological analog for self-control resources,” write Hagger and Chatzisarantis. “Instead ego-depletion may be due to problems of self-control resource allocation rather than availability.” An important next step is to conduct brain-imaging and related studies to observe the physiological effects of gargling glucose on the brain, and on motivational beliefs. There are also tantalising applications from the new research – for example, could the gargle effect (perhaps in the form of glucose-infused chewing gum) be used as a willpower aid for dieters and people trying to give up smoking?
_________________________________ Hagger, M., & Chatzisarantis, N. (2012). The sweet taste of success: The presence of glucose in the oral cavity moderates the depletion of self-control resources. Personality and Social Psychology Bulletin. DOI: 10.1177/0146167212459912
Sanders, M., Shirk, S., Burgin, C., & Martin, L. (2012). The gargle effect: Rinsing the mouth with glucose enhances self-control. Psychological Science. DOI: 10.1177/0956797612450034
According to an influential and controversial theory, autism is the manifestation of an “Extreme Male Brain”. The reasoning goes something like this – the condition is far more prevalent in males than females; people with autism think in a distinctive style that’s more commonly observed in men than women (that is, high in systematising and low in empathising); and greater testosterone exposure in the womb appears to go hand in hand with an infant exhibiting more autism-like traits in later childhood.
Simon Baron-Cohen, the psychologist who first proposed the theory, always conjectured that there may also be such a thing as an “Extreme Female Brain”. Now in a new paper, a pair of researchers in the USA have made the case that the Extreme Female Brain exists, it’s highly empathic, and it comes with its own problematic consequences, in terms of a fear of negative evaluation by others, and related to that, a greater risk of eating disorders (which are known to be far more prevalent in women than men).
Supporting their claims, Jennifer Bremser and Gordon Gallup Jr surveyed hundreds of male and female undergrads and found that men and women with more dysfunctional attitudes towards eating, and more fears of being negatively evaluated by others, also tended to score more highly on self-reported measures of empathising. A fear of being negatively evaluated was also associated with lower scores in systematising.
In other words, people with a thinking style more often observed in women, and opposite to that seen in people with autism (high in empathising, low in systematising), tended to be at greater risk for eating disorders and social anxiety.
The results got a bit messier with objective measures. Among female participants, dysfunctional attitudes towards eating were associated with higher scores on an objective measure of empathising, one that involved interpreting emotions from pictures of people’s eyes. But for males, dysfunctional attitudes to eating actually predicted lower scores on the test.
The researchers surmised that perhaps these men were over-interpreting the pictures – “hyper-mentalising” – and seeing emotions that weren’t there, which would be consistent with their central thesis about the Extreme Female Brain. Supporting this, further studies found that dysfunctional attitudes towards eating and fear of negative evaluation by others also tended to go hand in hand with higher self-reported scores on schizotypy, including exaggerated suspiciousness, magical thinking and paranoia – arguably all signs of “hyper-mentalising”, and the opposite of what’s seen in autism.
What about objective measures of systematising? Dysfunctional attitudes toward eating and fear of negative evaluation weren’t associated with understanding the laws of physics, but they were associated with poorer mental rotation performance scores.
“Evidence from all four studies converge to show that a combination of disordered eating and negative evaluation anxiety are associated with a cognitive style that Baron-Cohen predicted for the Extreme Female Brain,” the researchers concluded.
One last thing – Bremser and Gallup said their ideas suggest a novel explanation for why vegetarianism is particularly prevalent among people with eating disorders. Previously it’s been assumed that vegetarianism is popular in this group as a means of calorie restriction. However, if eating disorders are partly the manifestation of an Extreme Female Brain, one that’s associated with exaggerated empathy, then vegetarianism may be a natural consequence of having enhanced empathy for animals.
_________________________________ Bremser, J.A., & Gallup, G.G., Jr (2012). From one extreme to the other: Negative evaluation anxiety and disordered eating as candidates for the extreme female brain. Evolutionary Psychology, 10(3), 457-486. PMID: 22947672
Some morals – such as it being wrong to hurt others – children learn because they see the distress a particular behaviour causes others, or the harm it can bring upon themselves. But other immoral behaviours don’t necessarily have obvious victims. These relate to so-called purity-based morals, such as taboo sexual relations, sacrilegious acts or inappropriate eating behaviours. How do kids learn that these things are wrong, especially if they’ve never actually encountered them?
A new study shows that children are primed to recognise the immorality of certain behaviours by feelings of disgust and beliefs about unnaturalness, especially when these factors are combined. Joshua Rottman and Deborah Kelemen at Boston University manipulated these factors to provoke 7-year-olds into judging novel behaviours by alien characters as immoral.
“This is the first experimental investigation of a clear-cut case of moral acquisition,” Rottman and Kelemen said, “one involving morally naive subjects … and entirely novel and superficially amoral situations.”
Sixty-four 7-year-olds were introduced to the faraway planet “Glinhondo” and its alien occupants. The children were then split into four groups and shown pictures of 12 different scenarios, each accompanied by a short spoken description. The scenarios involved several aliens engaging in behaviours directed at their own bodies (e.g. covering their heads with sticks) or at the environment (e.g. sprinkling blue water into a big puddle). After seeing each scenario, the kids had to say whether the depicted behaviour was “wrong” or if it was “OK”.
Children in the “disgust” condition viewed the pictures in a room sprayed with the stinky but harmless joke-shop product “Liquid ASS”, and the description of the scenarios also highlighted that the alien behaviours were disgusting. Children in the “unnatural” condition viewed the scenarios in a fresh room, but they saw pictures in which only a minority of aliens performed the behaviours, and the description highlighted that what they were doing was “unnatural”. Kids in a third group experienced a combination of the disgust and unnaturalness manipulations – the room stank and it was a minority of aliens performing the behaviour, which was described as unnatural. Finally, some of the kids formed a control group – the room was fresh, all the aliens performed the behaviours and the description merely said that what they were doing was boring.
Children in the combined disgust and unnaturalness condition judged 65 per cent of alien behaviours as “wrong”, compared with just 19 per cent of behaviours judged that way by the control group. “This demonstrates that moral acquisition can occur rapidly and in the absence of direct experience with moralised behaviour,” the researchers said. “This also speaks against the idea that the primary mechanism guiding moral acquisition is children’s active reasoning about harmful or unjust consequences.”
The children in the disgust-only or the unnatural-only conditions also judged more alien behaviours as wrong, compared with kids in the control condition, but in both cases they tended to answer “wrong” about half the time, so there’s a possibility they were just alternating their answers at random.
The findings show how visceral feelings of disgust combine with intellectual thoughts about what’s “natural” to invoke in children a sense of moral wrongness. Another finding was that environmentally directed actions were more often judged as wrong than self-directed actions. “Ultimately, the degree of plasticity inherent within a young child’s moral repertoire is a crucial area of future exploration, and one that is currently under explored,” Rottman and Kelemen concluded. “The implications of such research will be substantial, promising to answer fundamental questions about the horizons of our moral nature.” _________________________________
Rottman, J., & Kelemen, D. (2012). Aliens behaving badly: Children’s acquisition of novel purity-based morals. Cognition, 124(3), 356-360. PMID: 22743053
Maybe you’ve tried giving them names – Sally Sprout or Brian the Broccoli. Or perhaps you’ve made noises of gastronomic delight, “hmm, yummy!” Yet still your young child refuses to eat their greens. Maybe it’s because of that slight, but all too visible, sneer on your face. After all, you’re not wild about veggies either. Well, it’s time for you to become a better actor. A new study suggests that young children are particularly sensitive to the emotional expressions of other eaters, and that these emotions are likely to affect their eating habits.
Laetitia Barthomeuf and her team presented 43 five-year-olds, 38 eight-year-olds and 42 adults with photographs of two women eating various foods. As they ate, the women either looked happy, looked disgusted or had a neutral expression. There were six different foods – three that the participants had earlier said they liked (chocolate, bread and cream cake) and three that they said they disliked (kidney, black pudding, cooked sausage with vegetables). Twenty-seven additional participants had been excluded earlier because their preferences didn’t fit this pattern.
As they looked at each photo, the child and adult participants were asked to say how much, on a scale of 1 to 10, they desired to eat the food that the woman in the photo was eating. The take-home finding: the children, especially the five-year-olds, were influenced much more by the women’s facial expressions than were the adults.
If the woman in the photo had a look of disgust, this reduced the children’s, and to a lesser extent, the adults’, desire to eat foods that they liked. In contrast, if the woman had a look of pleasure on her face, this increased the children’s, and to a lesser extent, the adults’, desire to eat foods they didn’t like (for five-year-olds only, it also increased their desire to eat foods they liked). Even a neutral facial expression in the eating women made a difference – increasing and decreasing the participants’ desire for liked and disliked foods, respectively, especially in the children.
The researchers speculated that the influence of the women’s facial expressions occurred because seeing their expressions led to simulations of those same emotions in the minds of the participants. They further suggested that this process is accentuated in younger children because of the immaturity of their prefrontal cortex.
The study has some obvious weaknesses, acknowledged by the researchers – they didn’t measure actual eating behaviour, and the stimuli were photos, as opposed to a real-life dining situation. Nonetheless, they predicted the effects of other people’s emotional expressions might be even larger in a more realistic situation and that the results therefore have important implications for the encouragement of children’s healthy eating habits. “Adults may unconsciously influence children’s food preferences via their facial expressions of pleasure or disgust,” they said.
_________________________________ Barthomeuf, L., Droit-Volet, S., & Rousset, S. (2012). How emotions expressed by adults’ faces affect the desire to eat liked and disliked foods in children compared to adults. British Journal of Developmental Psychology, 30(2), 253-266. DOI: 10.1111/j.2044-835X.2011.02033.x