Our autobiographical memory is fundamental to the development of our sense of self. However, according to past research, it may be compromised in autism, together with other skills that are also vital for self-understanding, such as introspection and the ability to attribute mental states to others (known as mentalising).
For example, experiments involving autistic children have highlighted retrieval difficulties, “impoverished narratives”, and a greater need for prompting, while also suggesting that semantic recall (facts from the past) may be impaired in younger individuals.
Now a UK research team, led by Sally Robinson from London’s St. Thomas’ Hospital, has published the first attempt to assess the nature of – and relationships between – autobiographical memory, mentalising and introspection in autism. Reporting their findings in the journal Autism, the group hope their results will shed more light on the way that autistic children and teens develop a sense of self.
Have you heard the riddle about the doctor? A father and his son are involved in a car accident and taken to different hospitals, the boy to a children’s hospital and the father to the general hospital. When the boy arrives at hospital, the doctor on call is shocked, saying “I can’t treat this boy, he’s my son!” The question is: who’s the doctor? The answer, as with many riddles, is obvious once you know it: the doctor is the boy’s mother. Years ago when I first heard this riddle, I was stumped, even though the only doctor I had contact with in my own life happened to be a woman. The very fact that this question works as a riddle is testament to the strength of negative stereotypes surrounding women’s scientific abilities.
Women who take degrees in Science, Technology, Engineering and Mathematics (STEM) subjects do just as well as their male colleagues, even though they are far outnumbered by them: in the UK, only 14 per cent of engineering and technology students, and 17 per cent of computer science students are women. The picture is similar in the USA, where Catherine Riegle-Crumb and Karisma Morton carried out a study, published recently in Frontiers in Psychology, to investigate why the numbers are so low.
Fifty years ago, in Connecticut, a series of infamous experiments was taking place. The volunteers believed they were involved in an investigation into learning and memory, and that they would be administering shocks to a test subject whenever he answered questions incorrectly. But despite pretences, the scientist behind the research, Stanley Milgram, wasn’t actually interested in learning. The real topic of study? Obedience.
Milgram recorded how far his participants were willing to go when told to deliver larger and larger shocks. In one version of the study, 26 out of 40 participants continued to the highest shock level – two steps beyond the button labelled “Danger: severe shock”.
But this was 50 years ago – surely the same wouldn’t happen if the experiment were conducted today? That’s what a group of researchers from SWPS University of Social Sciences and Humanities in Poland aimed to find out, in a “partial replication” of Milgram published recently in Social Psychological and Personality Science.
Emotions can be fleeting and superficial: imagine, for example, the split-second of anger you experience after missing the bus. But other “peak emotional states” are more powerful, and they are accompanied by intense physical reactions, such as crying or “the chills”. Often these physical manifestations accompany extreme fear or sadness, but they can also occur when we admire a magnificent sunset or enjoy a beautiful piece of music.
Now a study published in Scientific Reports by Kazuma Mori and Makoto Iwanaga has taken a closer look at the contrasting psychology and physiology underlying the chills and tears many of us experience when we’re profoundly moved by a song.
Most of us tend to think we’re better than average: more competent, honest, talented and compassionate. The latest example of this kind of optimistic self-perception is the “invisibility cloak illusion”. In research published recently in the Journal of Personality and Social Psychology, Erica Boothby and her colleagues show how we have a tendency to believe that we are incredibly socially observant ourselves, while those around us are less so. These assumptions combine to create the illusion that we observe others more than they observe us.
At least one in four readers of this post will die of cancer. It is a simple statistic that should lead rationally minded people to treat the possibility as very likely. And this is what many do: they try to adopt a lifestyle that minimises the risk to some degree. But how do we know what minimises and what increases this risk? By listening to experts, of course, the best of whom are scientists who research these things. However, whenever there is disquiet brought about by uncertainty, self-titled experts come out of the woodwork. Discussion of factors increasing the risk of cancer is today not only the domain of medical doctors and psycho-oncologists, but is also engaged in by some alternative medicine proponents, pseudopsychologists, and fringe psychotherapists, whose opinions are disseminated by journalists, some more thorough than others (see myth #26 in 50 Great Myths of Popular Psychology for more background).
Among these opinions is the common claim that negative thinking, pessimism, and stress create the conditions for the cells in our body to run amok, and for cancer to develop. Similar declarations accompany therapeutic propositions for changing our way of thinking into a more positive one that will protect us from cancer, or even cure us of the disease. Should you, therefore, begin to fear the possibility of cancer if you are not prone to optimism, or – even worse – have bouts of depression?
Academically successful children are more likely to drink alcohol and smoke cannabis in their teenage years than their less academic peers. That’s according to a study of over 6,000 young people in England published recently in BMJ Open by researchers at UCL. While the results may sound surprising, they shouldn’t be. The finding is in fact consistent with earlier research showing a relationship between higher childhood IQ and the use in adolescence of a wide range of illegal drugs.
Adolescents take more risks than adults: they are more likely to binge drink, have casual sex, commit crimes and have serious car accidents. In fact, adolescence is a paradox because it is a time of peak physical fitness, but also the time when people are most likely to be injured or killed in an accident. For this reason, it’s critical to understand what drives teenagers to take more risks. To date, many explanations of teenage risk taking have focused on the positive side of these behaviours: the rewarding “kick” that comes from taking a risk that ends well. Some studies have shown that teenagers experience more of this rewarding feeling, and this contributes to the increased risk taking seen at this age.
Fewer studies have considered how teenagers respond when risks turn out badly. This is important because all our previous experiences, both good and bad, affect our subsequent behaviour. If we make a risky decision like gambling money, and it pays off, it’s more likely we’ll decide to gamble again in the near future. Equally, if we take a gamble and it turns out badly, we’ll probably be a bit more reserved next time. But it turns out that some teenagers don’t respond like this: according to a new study in NeuroImage, some of them do not adjust their behaviour so readily when things go wrong, and this may be linked to a distinct pattern of activation in their brains.
While autism is usually diagnosed in childhood, some people remain “off the radar” for a long time and only receive a diagnosis much later. One possible reason is that they have learned socially appropriate behaviours, effectively camouflaging their social difficulties, including maintaining eye contact during conversations, memorising jokes or imitating facial expressions.
This pattern of behaviour could have serious consequences for the lives of some people with autism. It is easy to imagine that camouflaging demands significant cognitive effort, leading to mental exhaustion over time, and in extreme cases perhaps also contributing to anxiety and depression.
If there are gender differences in camouflaging, this could also help explain the well-known male preponderance in autism spectrum disorders. At least part of the gender imbalance may, in fact, stem from an under-diagnosis of autism in girls because they are better at “masking” symptoms.
Until now, camouflaging in autism had not been studied in a systematic and standardised manner: a recent open-access study in the journal Autism, by Meng-Chuan Lai and his colleagues, is the first to offer an operationalisation of camouflaging, which they define as the discrepancy between internal and external states in social-interpersonal contexts. For instance, if an autistic person maintains eye contact during a conversation because they have learnt that this is socially appropriate, even though this clashes with how they really want to behave, this would be an example of camouflaging.
Coming up with the perfect recipe for crisps or the ideal marketing strategy for a soft drink used to depend on explicit measures. In focus groups and surveys, consumers were asked which product tasted best or which commercial was most appealing. But these measures are imperfect: consumers may choose to hide their true opinions or they might not be fully aware of their own preferences. Food and drinks companies need more objective measures. Currently their best hope is functional magnetic resonance imaging (fMRI).
The idea is that somewhere in the brain, a “buy button” is hidden away: a region (or combination of regions) that influences your purchase decisions. The promise of neuromarketing is that one day, we will be able to find this region, record its activity while you watch an ad or sample a product, and then predict how well that product will sell. So far, success has been limited. But in a recent study in NeuroImage, Simone Kühn from the University Clinic Hamburg-Eppendorf and her colleagues claim to have found “multiple ‘buy buttons’ in the brain”.