Friday, 22 May 2015

You can now test whether someone is a "Maven"

Malcolm Gladwell’s influential book The Tipping Point popularised the notion that ideas, products and movements owe their popular success to opinion leaders: people who are highly connected via weak ties to others, persuasive in character, and experts, or "Mavens", in the field in question. The Maven is the friend you go to when you want to buy a new laptop but don’t know where to start, or consult when you’ve been feeling sluggish and wondering if your diet has something to do with it.

Identifying Mavens is a holy grail for people interested in influence, leading researchers Franklin Boster and Michael Kotowski to develop a "Maven scale". They’ve now published a paper presenting validation studies which suggest that people can accurately self-identify as Mavens, and that the scale works across different fields of expertise.

The first study used a political version of the scale, asking questions such as “When I know something about political issues, I feel it is important to share that information with others” (see footnote* for more examples).

One hundred and thirty-one students completed the scale, together with a measure of their political activities such as voting, volunteering and donating, and a test of political knowledge. High scorers on the Mavens scale were more politically active and more knowledgeable about politics: they walked the walk, as well as talking the talk.

However, another key aspect of mavenhood is that others see Mavens as knowledgeable and seek out their advice. Is this borne out? A second study, using a health-expertise version of the scale, surveyed the professional staff of a high school. In addition, each participant had to evaluate the other 33 participants on two items: a Yes or No to “this person comes to me for information on health and healthy lifestyle issues”, and a rating of the degree to which “This person is a good source of information on health and healthy lifestyle issues.”

If self-identified health Mavens are what they claim, they should have more petitioners and those petitioners should have faith in them. Again, the data confirmed this: health Mavens provide trusted advice to their network.
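The paper doesn't reproduce its analysis code, but the logic of this sociometric check is easy to sketch. In the sketch below (all names and data are hypothetical, and the function is an illustration rather than the authors' method), `comes_to_me[j][i]` holds respondent j's Yes/No answer about person i, and `source_rating[j][i]` holds j's trust rating of i as an information source:

```python
def maven_indicators(comes_to_me, source_rating):
    """comes_to_me[j][i]: respondent j answered Yes to "person i comes to
    me for information".  source_rating[j][i]: respondent j's rating of
    person i as a good information source.
    Returns, for each person p: the number of colleagues p reports as
    petitioners (p's row sum) and the mean source rating p receives from
    the others (p's column mean, self-rating excluded)."""
    n = len(comes_to_me)
    out = []
    for p in range(n):
        petitioners = sum(1 for i in range(n) if i != p and comes_to_me[p][i])
        received = [source_rating[j][p] for j in range(n) if j != p]
        out.append((petitioners, sum(received) / len(received)))
    return out

# Invented responses for a three-person staffroom (trust rated 1-7):
comes_to_me = [
    [False, True,  True ],   # person 0 says persons 1 and 2 come to them
    [False, False, False],
    [False, True,  False],
]
source_rating = [
    [7, 2, 4],
    [6, 3, 5],
    [7, 2, 4],
]
print(maven_indicators(comes_to_me, source_rating))
# → [(2, 6.5), (0, 2.0), (1, 4.5)]
```

In this toy example, person 0 looks most Maven-like: the most reported petitioners and the highest average trust rating. Correlating scores like these with self-ratings on the Maven scale is, roughly, the comparison the validation study makes.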

A well-developed scale of mavenhood will benefit corporations looking to get their new, superior product in front of the right people to create a runaway success. But identifying and targeting Mavens is equally relevant for institutions looking to get bold new political ideas the attention they deserve, or to disseminate new and important health behaviours amongst the population. In their conclusion, the authors say that people “wishing to promote behavior change…may find these scales effective” so if that describes you, get in touch with them.

_________________________________ ResearchBlogging.org

Boster, F., Carpenter, C., & Kotowski, M. (2015). Validation studies of the maven scale. Social Influence, 10(2), 85-96. DOI: 10.1080/15534510.2014.939224

*The scale is presented in full – in its political variant – in the Appendix of the paper. It includes "connector" items, "persuader" items, and subject-specific items. If you're a Maven, you'd be expected to strongly agree with the following example items, as well as others not shown here:

  • The people I know often know each other because of me (connector item)
  • More often than not, I am able to convince others of my position during an argument (persuader item)
  • If someone asked me about a political issue that I was unsure of, I would know how to help them find the answer (political maven item)
  • People often seek me out for answers when they have questions about a political issue (political maven item)

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Thursday, 21 May 2015

Women are better than men at remembering to remember

Prospective memory is the term psychologists use for when we have to remember to do something in the future – like stopping for milk on the way home from work. It requires not just remembering what to do, but remembering to remember at the right time.

There's actually some past research suggesting that women, on average, are more prone to forgetting future tasks than men. But crucially, this research relied on self-report. Women admitted more memory failures of this kind than men did, but of course that doesn't mean they really do forget more often.

Now a team led by Liana Palermo has conducted a carefully controlled objective test of prospective memory under laboratory conditions. Fifty men and fifty women (average age 25) were given various tasks to remember to complete, mostly over either two-minute or fifteen-minute time scales, although there was one task after 24 hours.

Averaged across all tasks and conditions, there were no gender differences in performance. But focusing on specific types of tasks, differences emerged. Women were better than men at remembering to perform future tasks that were tied to events rather than a specific delay (e.g. perform task x when I give you a card, as opposed to perform task x in two minutes). The women also tended to outperform men on future tasks that were physical in nature (e.g. writing their address on a post card), as opposed to verbal (e.g. remembering to ask a specific question).

It's possible the female advantage for some aspects of prospective memory is merely a side-effect of women's other cognitive advantages. For example, women tend to have better verbal skills than men, and the instructions in this study were delivered verbally. However, the researchers don't think this is likely, because in that case you'd expect women to outperform men on all forms of prospective memory, and especially on future verbal tasks.

This was a small sample and, being lab-based, the study lacked realism, so more research is certainly needed. But the finding does tie in with other research conducted on the internet that also found a female advantage for prospective memory, and with existing evidence that women have an advantage for episodic memory (that is, remembering things that have happened to them in the past). Regarding the prior research that found women admit to more prospective memory failures, this new study raises the possibility that women are simply better at detecting their own forgetfulness.

Assuming this female advantage is replicated in further studies, why should women be better than men at remembering to remember? Here Palermo and her team are left to speculate: they suggest there could be a biological explanation, such as the known sex-linked differences in the hippocampus (a brain area involved in memory). They also propose a possible socio-cultural explanation, which may well resonate with some of our readers:
"...[T]he fact that in addition to work responsibilities, women also have more responsibilities at home. ... As a consequence of this social role, in daily life women might perform tasks involving prospective memory/planning skills more than men, thus enhancing their performance in remembering to remember."
_________________________________

Palermo, L., Cinelli, M., Piccardi, L., Ciurli, P., Incoccia, C., Zompanti, L., & Guariglia, C. (2015). Women outperform men in remembering to remember. The Quarterly Journal of Experimental Psychology, 1-10. DOI: 10.1080/17470218.2015.1023734

--further reading--
Women really are better than men at processing faces
Women have a superior memory for faces
Women's true maths skills unlocked by pretending to be someone else

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Wednesday, 20 May 2015

Poverty shapes how children think about themselves

"The Culture of Poverty", published in 1966, was hugely influential, persuading many policy makers that children from low-income families are destined for lives of "criminality, joblessness, and poverty" because they exist in enclaves characterised by dysfunctional beliefs and practices. Thankfully, this fatalistic view has since been largely refuted, and attention has turned to ways to help poor children, including giving them access to books, good teachers and stable environments.

Now a review from the University of Massachusetts has highlighted a different way that poverty can leave a lasting impression on children: by altering their psychological states in ways that shape their future. This sounds like a bleak picture, but the review argues the situation is one we can combat.

Authors Amy Heberle and Alice Carter point out that adults belonging to a disadvantaged group are vulnerable to a pair of effects: higher levels of stress when their low status is made clear to them, termed status anxiety; and underperformance on a task when reminded that their social group is stereotypically poor at the task, termed stereotype threat. If these phenomena apply to young children, as Heberle and Carter propose, then even if they have a stable, stress-free family life, poor kids are likely to generate their own stress and underperform simply through awareness of their own in-poverty status.

For this to be the case, young children would need to possess social categories and understand stereotypical beliefs. It appears they do. The review explains how, from the age of five, children in Western countries have a handle on the category of "poor people", and are able to describe it coherently. Interestingly, middle income children are likely to use "dirty", "mean", and other stereotypes in their descriptions, whereas poorer children are more likely to describe how poor people feel, suggesting greater empathy and an awareness that they lie within or close to that group.

In terms of stereotypical beliefs, children from first grade (aged six to seven) onwards endorse the belief that poor kids do worse in school. Moreover, there is evidence from a single study that children believe that although poorer children have a similarly broad range of ambitions to other children, fewer than a quarter of those dreams would be achieved, whereas non-poor children would achieve the majority of theirs.

For poor children to be burdened by stereotype threat, they would also need to be conscious that others might assign them to a stereotypical category. Here the evidence is thinner, but we know that poor kids who say they would prefer poor friends give reasons including “they wouldn’t judge you on how you look, you talk, and the way you were.”

Heberle and Carter emphasise that more research is needed to establish exactly when children begin to experience status anxiety and stereotype threat. They urge far more work on the under-5s (children begin forming social categories and stereotypes by the age of two), which would require the use of non-verbal techniques (e.g. preferential looking) in place of questions and conversation. They predict such research will show that ideas of social class, and its stereotypes, fall into place by age three, very early in a child’s sense-making of the world.

If these mechanisms do have an impact, it would explain why researchers have struggled to establish a causal link between inequality and health outcomes at a personal level, even though we know more equal nations have better health. At least in developed nations, it may be that the harm comes not so much from lack of absolute material wealth but from the psychological mechanisms triggered by comparative poverty. These mechanisms might even be a contributing factor in the recent finding that 12-13 year olds from low-income families have thinner cortices in brain regions associated with academic performance.

If Heberle and Carter are right, then growing up poor does throw up psychological obstacles to healthy functioning. But these are issues that teachers and families can challenge by discussing and countering negative beliefs about poverty with their children, and that policy-makers can tackle too. Even innocuous, discretionary costs, such as a museum trip fee, can be too much for a stretched family budget, creating separation between poorer children and their peers. Recognising this, societies can try harder to lessen these burdens.

_________________________________

Heberle, A., & Carter, A. (2015). Cognitive Aspects of Young Children’s Experience of Economic Disadvantage. Psychological Bulletin. DOI: 10.1037/bul0000010

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Tuesday, 19 May 2015

A preliminary taxonomy of the voices inside your head

Psychologists are taking an increasing interest in the way we all speak to ourselves in our heads. Unpleasant, uncontrollable inner voices can be a feature of mental illness, but private self-talk is a mundane part of most healthy people's consciousness.

When we talk to ourselves in our heads in this way, it's common for there to be a kind of dialogue. Consider how "You" might say to yourself that you want to stop working, but then a voice in your head takes a different stance and urges you to continue.

For a new study, Malgorzata Puchalska-Wasyl at the University of Lublin in Poland has attempted to create a preliminary taxonomy of the different kinds of "internal interlocutors" that people experience.

Ninety-eight participants (mostly students; average age 23; half were women) were asked to think of the inner voice or interlocutor they most often have a dialogue with in their own minds. Next they scored from 1 to 4 how much each of 24 emotional words in four categories (self-enhancement; contact and union with others; positive emotion; negative emotion) applied to this inner voice, including: joy, shame, strength, intimacy, anger, and inner calm.

Puchalska-Wasyl crunched the numbers (using a technique called k-means clustering) and found that the participants' descriptions of their inner voices clustered into four distinct categories. Thirty of them fell into the category of "Faithful Friend" and were associated with strength and unity and positive emotion; 22 fitted the category of "Ambivalent Parent" and were associated with strength and love, but also ambivalence or negativity to the participant's irresponsible ideas; 32 matched the "Proud Rival" category, showing pride and self-confidence combined with a lack of closeness to the participant; and finally the remainder fitted the description of "Calm Optimist" – a relaxed interlocutor, characterised by low self-enhancement, little emphasis on contact with others, but in a way that participants perceived positively.
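The paper's exact analysis isn't reproduced here, but k-means itself is straightforward: assign each rating profile to its nearest centroid, move each centroid to the mean of its cluster, and repeat until assignments settle. Below is a minimal, self-contained sketch; the four-dimensional "voice profiles" (self-enhancement, contact, positive emotion, negative emotion, each on the 1-4 scale) are invented for illustration, not the study's data:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means (Lloyd's algorithm): repeatedly assign each point
    to its nearest centroid, then move each centroid to its cluster mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)   # initialise centroids from the data
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # squared Euclidean distance to each centroid
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # recompute each centroid as its cluster mean (keep it if empty)
        new = [tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:            # assignments have stabilised
            break
        centroids = new
    return centroids, clusters

# Invented 1-4 profiles, loosely echoing the four voice types above:
voices = [(3.8, 3.5, 3.9, 1.2), (3.7, 3.6, 3.8, 1.1),   # warm and positive
          (3.5, 3.4, 2.0, 3.0), (3.6, 3.3, 2.1, 2.9),   # strong but ambivalent
          (3.9, 1.5, 2.5, 1.8), (3.8, 1.4, 2.6, 1.9),   # proud, distant
          (1.5, 1.6, 3.2, 1.3), (1.4, 1.5, 3.3, 1.4)]   # calm, low-key
centroids, clusters = kmeans(voices, k=4)
```

Note that k (the number of clusters) is chosen by the analyst, which is one reason the researcher's characterisation of the resulting voice types involves an element of judgement.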

A limitation of this first investigation is that participants only described the single inner voice they spoke with most often. For a follow-up, then, Puchalska-Wasyl asked 114 more participants (again, most were students) to rate the character of four inner interlocutors: the two they experienced most often, and two others that differed in the emotions they showed toward the participant.

After analysing the ratings, Puchalska-Wasyl again found the Faithful Friend, Ambivalent Parent and Proud Rival categories of inner voice, but this time no Calm Optimist. In its place was a different category "Helpless Child", characterised by a low emphasis on self-enhancement, low scores on contact with others, and high negative emotion.

The study mostly relied on Polish students so it's not clear how well the findings will generalise. There might be types of voice that the students preferred not to share, and people in other cultures or different stages of life might describe different voices. Also, there is an element of subjectivity in the way that the researcher chose to characterise the inner voices. Moreover, we need more research on the functions of the different voices. Puchalska-Wasyl says it's currently possible to distinguish "integrative" (solution-seeking) and "confrontational" dialogues, with Faithful Friend and Ambivalent Parent likely fitting the former description and Proud Rival and Helpless Child, the latter.

Despite the shortcomings and need for more research, this is a novel approach to the study of our inner conversations, and Puchalska-Wasyl points out that "given that internal dialogues are a useful instrument in psychotherapy, a universal typology of internal interlocutors and the knowledge of functions fulfilled by these universal types in different contexts can be of practical significance: it could contribute to more effective use of those dialogs in psychological practice."

_________________________________

Puchalska-Wasyl, M. (2014). Self-Talk: Conversation With Oneself? On the Types of Internal Interlocutors. The Journal of Psychology, 149(5), 443-460. DOI: 10.1080/00223980.2014.896772

--further reading--
The science of how we talk to ourselves in our heads

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Monday, 18 May 2015

Free will inside the Nazi death camps

The entrance to the Auschwitz concentration camp in Poland. The sign "Arbeit macht frei" means "Work sets you free".
Free will is a controversial topic in psychology, thanks in part to studies suggesting that the brain activity associated with making decisions comes before the conscious feeling of making a choice. Other research claims that when people are exposed to arguments against free will, this makes them more prone to cheat. While intriguing, such insights are arguably somewhat removed from everyday human experience. A new study goes to the other extreme, providing an emotional and challenging take on what it means to have free will.

Jonathan Davidov and Zvi Eisikovits at the Centre for the Study of Society in Israel conducted in-depth interviews with 20 survivors of Nazi death camps – institutions designed to destroy free will. The researchers' striking interpretation of the testimonies is that without free choice there is quite simply no existence.

The interviewees, 7 men and 13 women, now aged between 80 and 90, were questioned in their own homes about their experiences at the Auschwitz network of concentration camps. After the researchers transcribed and reflected on the accounts they found three recurring themes.

Despite living in a death camp where all freedom and individuality was removed, the interviewees experienced acute moments of choice and will when it came to the experience of "selection" – this was a ghastly recurring process whereby the Nazi captors lined up the prisoners to decide who would be kept alive for manual labour and who would be sent to execution in a gas chamber. "Despite the extreme conditions," the researchers said, "some people found a measure of situational freedom through which they could attempt to direct their lives."

Consider the experience of Israel, aged 15 at the time he was transported to Auschwitz. Taken to a selection after 48 hours without food or drink, standing half naked in the snow, he was asked by his captor to state his age:
"So I answered 'I'm a welder'. And I got this sarcastic face stuck in my mind to this very day, his smile, because he understood exactly what I was doing."
The researchers said: "A huge amount of information is telescoped into this split second. Israel contemplates his journey to the camp, thinks about his situation, and realises this is the time when he must make his final stand or die." The researchers believe that some prisoners, like Israel, made active choices to influence their destiny – exhibiting and experiencing free will – even in the grim context of a death camp. In this case, Israel ignored his captor's question and instead communicated how he could be useful, likely saving his own life in the process.

Such moments of "choice" were followed by long periods of waiting until the next selection, which brings us to the researchers' second theme. The interviewees described this time as like a frozen present. As Ora, aged 15 at the time she was a prisoner, put it:
"As there was no clock and no day or night, we couldn't tell the time. We used to sit, or stand, or wait. Wait, wait, and wait."
The only action available through these empty "borrowed times" was to wait until the next selection. Without any freedom, the interviewees described experiencing nothingness. And here is the final theme – the way the prisoners described their own and other people's response to this non-existence. Some said they managed to cope through a dulling of their emotions. "My heart became as hard as rock," said Israel. But for others the nothingness proved deadly. "Muselmann" is the German term used to describe prisoners for whom the emotional nothingness had become overpowering. As Esther explained:
"Someone who became nothing, was named a Muselmann. He was exhausted with hunger, weakness, and despair. When there is no despair, you eat something, even grass. But when there is no one to live for, what to live for, no one to support you and no one that you need to support yourself, you don’t have anything to live for. When there is no one to look after, there is no answer to the questions: 'What is it all about? Who am I and what am I? To whom do I matter?'"
Taken together, Davidov and Eisikovits said their interviews show that from a phenomenological perspective, "free will and existence represent the same concept: I choose – therefore I am. When free will is denied there is emptiness and nothingness ...".

_________________________________

Davidov, J., & Eisikovits, Z. (2015). Free will in total institutions: The case of choice inside Nazi death camps. Consciousness and Cognition, 34, 87-97. DOI: 10.1016/j.concog.2015.03.018

--further reading--
Imagining World War II
Milgram's personal archive reveals how he created the 'strongest obedience situation'

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Saturday, 16 May 2015

Link feast

Our pick of this week's 10 best psychology and neuroscience links:

Why We (and My Mum) Are So Glad The World is Waking up to What Autism Is
If you're on the autistic spectrum, "never give up hope," says Joash Taylor.

Is Late-Night Snacking Really Your Brain’s Fault?
Recent headlines said you could blame your brain for your midnight binges. I looked at what the brain imaging study really found.

Does Wearing Red Really Make You Dominant, Charismatic and Sexy?
Here's what happened when, inspired by new research, Stuart Heritage weaponised himself with red clothing.

More Harm Than Good (video)
For the 52nd Maudsley debate, the house proposed that the long term use of psychiatric medications is causing more harm than good.

Educational App or Digital Candy? Helping Parents Choose Quality Apps for Kids
From the Association for Psychological Science: An evidence-based guide parents can use to evaluate apps for real educational benefits.

The Science of Craving
Amy Fleming meets Dr Kent Berridge and other scholars researching the difference between desire and pleasure.

It’s Not a ‘Stream’ of Consciousness
"We actually perceive the world in rhythmic pulses rather than as a continuous flow," writes Gregory Hickok.

Body Dysmorphic Disorder, Social Media and PTSD, Preventing Procrastination (audio)
Catherine Loveday joins Claudia Hammond in the studio for the latest episode of BBC Radio 4's All in the Mind.

The Professor Who Thinks Video Games Will Be the Downfall of Men
Philip Zimbardo is worried that excessive gaming or porn watching is crippling masculinity. But (writes Pete Etchells at the Guardian Head Quarters blog) the evidence just doesn’t back up these sorts of claims.

The Centenarian Psychologist
As he approaches his 100th birthday, cognitive psychology pioneer Jerome S. Bruner reflects on the past, present and future of psychology.
_________________________________
 
Post compiled by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Friday, 15 May 2015

Companies are more successful when their employees feel young for their age

If you want a dynamic workforce, seek not the young, but the young at heart. That’s the message of a new study that surveyed over 15,000 employees from 107 companies to determine how subjective age influences workplace performance.

Past research has made the case that employee age matters to workplace performance, with younger workers more likely to make breakthrough contributions – but the evidence is patchy, suggesting there is more to the story. The proposed explanation for the youth advantage is that younger workers' mindset is focused on getting ahead and furthering their skills, networks and status, whereas older people are more concerned with maintaining their positions. Now a research team led by Florian Kunze has posed the question: if mindset is critical, then isn't how old you feel what really matters?

In their survey, employees who felt substantially younger than their chronological age were more successful in meeting the goals they'd promised their managers they would achieve. Companies with more of these "young at heart" employees also tended to perform better overall, in terms of financial performance, efficiency and a longer-tenured workforce. The survey also showed that organisations tended to have more young-at-heart workers when they offered age-inclusive policies and when their employees, on average, felt their work was important and meaningful.

This cross-sectional study can't prove causality, but it's plausible that the optimism and possibilities afforded by meaningful work make us feel more vibrant, and that active policies which challenge stereotypes and extend opportunities to older workers help remove the sense of age being an issue.

The Western workforce is steadily greying, so if chronological age were the be-all and end-all, organisational leaders ought to be concerned. But this research suggests that climates where all workers can feel young, energised by their work and not judged and stereotyped, facilitate the kind of dynamic performance associated with young bucks.

_________________________________

Kunze, F., Raes, A., & Bruch, H. (2015). It Matters How Old You Feel: Antecedents and Performance Consequences of Average Relative Subjective Age in Organizations. Journal of Applied Psychology. DOI: 10.1037/a0038909

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
