The last time I tried to learn a foreign language, I was living in an Italian suburb of Sydney. My hour a week at a local Italian class was inevitably followed by a bowl of pasta and a few glasses of wine. As an approach to language-learning goes, it was certainly more pleasurable than my German lessons at school. Despite the wine, it was also surprisingly effective. In fact, getting better at a new language doesn’t have to mean hard hours on lists of vocab and the rules of grammar. It turns out that what you don’t focus on matters, too. And a glass of wine may even help …
Stress has complicated effects on our memories. Whereas some studies have found that we are better at remembering events that occurred during stressful situations, such as while watching disturbing videos, others have shown that stress impairs memory. Now a study published in Brain and Cognition suggests that stress doesn’t influence the strength of our emotional memories at all. Instead, the researchers claim, it is the fidelity of those memories – how distinct and precise they are – that changes when we go through stressful experiences.
The question is an old favourite – if you could travel back in time, what advice would you give to your younger self? Yet despite the popularity of this thought experiment, no one has, until now, actually studied what people would tell themselves.
Reporting their findings in The Journal of Social Psychology, Robin Kowalski and Annie McCord at Clemson University have done just that in two surveys of hundreds of participants on Amazon’s Mechanical Turk website. Their findings show that people’s advice to their younger selves is overwhelmingly focused on prior relationships, educational opportunities and personal worth, echoing similar results derived from research into people’s most common regrets in life. Moreover, participants who said they had followed the advice they would give to their younger selves were more likely to say that they had become the kind of person that their younger self would admire. “…[W]e should consult ourselves for advice we would offer to our younger selves,” the researchers said. “The data indicate that there is much to be learned that can facilitate wellbeing and bring us more in line with the person that we would like to be should we follow that advice.”
Write down the unfinished statement “I am …” twenty times. Now think to yourself “Who am I?” and complete as many of the “I am …” statements as you can in the next five minutes or less.
This is the Twenty Statements Test and it’s designed to assess how we see ourselves – our “self-concept”. For their new paper in the journal Memory, a team at the University of Reading, led by Emily Hards, gave this test to 822 teenagers (aged 13-18) from three schools in England, with the additional instruction “not to think too much about the responses and not to worry about the order/importance of the statements”.
While it’s widely recognised that adolescence is a crucial period for the establishment of our sense of self, little is actually known about how teenagers generally see themselves. Indeed, this is the first time that teenagers’ own self-generated descriptions of themselves (what the researchers call their “self-images”) have been gathered in a systematic way.
How do we acquire our native language? Are the basics of language and grammar innate, as nativists argue? Or, as empiricists propose, is language something we must learn entirely from scratch?
This debate has a long history. To get at an answer, it’s worth setting the theories aside and instead looking at just how much information must be learned in order to speak a language with adult proficiency, argue Francis Mollica at the University of Rochester, US, and Steven Piantadosi at the University of California, Berkeley. If the amount is vast, for instance, this could indicate that it’s impracticable for it all to be learned without sophisticated innate language mechanisms. In their new paper, published in Royal Society Open Science, Mollica and Piantadosi present results suggesting that some language-specific knowledge could be innate – but probably not the kind of syntactic knowledge (the grammatical rules underlying correct word order) that nativists have tended to argue in favour of. Indeed, their work suggests that the long-running focus on whether syntax is learned or innate has been misplaced.
Do students take notes in an optimal fashion, in line with what psychology research identifies as best practice? It’s an important question given that modern surveys suggest that most students’ preferred approach to exam preparation is to memorise their notes. To find out, a team led by Kayla Morehead at Kent State University has quizzed hundreds of university students about their note-taking methods and preferences, and they’ve reported their findings in the journal Memory.
For decades, linguists have debated the extent to which language influences the way we think. While the more extreme theories that language determines what we can and can’t think about have fallen out of favour, there is still considerable evidence that the languages we speak shape the way we see the world in more subtle ways.
For instance, people are better at perceiving the difference between light and dark blue if they have dedicated words for those colours (like in Russian) than if they don’t (like in English). But it turns out it’s not just the words that we use: the way in which a language is structured – its syntax – is also important. In a recent study in Scientific Reports, Federica Amici and colleagues show that the word order of a language predicts how good its speakers are at remembering the first or last parts of a list.
Anyone who has stood in the supermarket aisle trying to remember their shopping list might have wished for a larger brain. But when it comes to memory, bigger isn’t always better. A study published in Neuropsychologia has found that young children whose cerebral cortex is thinner in certain areas also tend to have better working memory.
In the past when scholars have reflected on the psychological impact of dementia they have frequently referred to the loss of the “self” in dramatic and devastating terms, using language such as the “unbecoming of the self” or the “disintegration” of the self. In a new review released as a preprint at PsyArXiv, an international team of psychologists led by Muireann Irish at the University of Sydney challenge this bleak picture which they attribute to the common, but mistaken, assumption “that without memory, there can be no self” (as encapsulated by the line from Hume: “Memory alone… ‘tis to be considered… as the source of personal identity”).
In their review, Irish and her colleagues, including doctoral candidate and lead author Cherie Strikwerda-Brown, present a more optimistic perspective based on their analysis of the research literature on autobiographical memory loss in people with Alzheimer’s Disease, people with Semantic Dementia, and others with Frontotemporal Dementia. “Overall,” they write, “… the self is not entirely lost in dementia, with distinct elements of preservation emerging contingent on life epoch and dementia syndrome”.
A picture is worth a thousand words…. When it comes to conveying a concept, this sentiment can certainly be true. But it may also be the case for memory. At least that’s the message from Myra Fernandes and colleagues at the University of Waterloo, Canada – writing in Current Directions in Psychological Science, they argue that their research programme shows that drawing has a “surprisingly powerful influence” on memory, and as a mnemonic technique, it could be particularly useful for older adults – and even people with dementia.