Friday, 31 October 2014

Spook Me, Please: What Psychology Tells Us About the Appeal of Halloween

By guest blogger Mathias Clasen

It’s the time of year, at least in our part of the world, when darkness encroaches on us—literally and metaphorically. The symbols and agents of darkness dominate Halloween decorations everywhere, and Halloween is growing in popularity across Europe and in the US. According to the National Retail Federation, US Halloween spending now exceeds $7 billion. In the UK, Halloween is worth about £330 million.

Why is this Americanised version of the ancient pagan festival so successful? Is it merely another instance of the McDonaldisation of culture, the increasing hegemony of American commercial culture, explicable in terms of market mechanisms alone? No. The dread scenarios evoked by the paraphernalia of Halloween are deeply fascinating to a prey species such as Homo sapiens. Ghouls, zombies, demons, giant spiders, and horrors hidden in darkness all engage evolutionarily ancient survival mechanisms—and all figure prominently in the scenography of horror films and in Halloween decorations. We seem to love the good thrill of a safe scare, and Halloween provides plenty of those.

Horror films, horror monsters, and the iconography of Halloween are culturally successful because they are well-adapted to engage evolved danger-management adaptations. We know that existence for our prehistoric ancestors was precarious. The threat of predation has been very real and very serious for hundreds of millions of years. As the anthropologist Lynn Isbell has shown, snakes and mammals have been engaged in a lethal co-evolutionary arms race for a hundred million years or more, and that arms race has profoundly shaped our genome. A hard-wired, adaptive tendency to easily acquire fear of snakes explains the prevalence of snake phobias today, even in snake-less ecologies.

Similarly, the threat posed by poisonous spiders in prehistoric environments has left an eight-legged imprint in human DNA, an imprint that is expressed as a tendency to easily acquire fear of spiders. We are, at the very least, likely to pay close attention if a saucer-sized arthropod scuttles out from under the couch. Spiders engage attention—as recent research has documented, spiders override inattentional blindness, our tendency to overlook even striking stimuli in peripheral awareness when we’re engaged in a cognitively taxing task. Another study found that five-month-old infants pay closer attention to schematic representations of spiders than to representations that consist of the same graphic elements but do not look like spiders. Spiders are inherently attention-demanding and, to most people, gross and a little scary, and that explains why they feature so prominently in Halloween iconography. They simply perform the functions of engaging attention and eliciting a shudder well.

Likewise, the usual suspects in the horror genre’s antagonistic line-up—from supernatural monsters via rotting zombies to homicidal maniacs in masks—all connect squarely with defensive psychological adaptations that arose over evolutionary time in response to dangers in the environment, from the threat posed by hostile conspecifics and lethal pathogens to the bite of hungry carnivores. Although there were no child-eating clowns in prehistoric environments, a character like Pennywise the Dancing Clown has achieved pop-cultural infamy because it effectively targets danger-management mechanisms in human cognitive architecture.

The dangers of prehistoric existence have left deep grooves in human nature. The creatures and situations we typically fear—spiders, snakes, the dark, heights, confined spaces, and so on—are the same creatures and situations that posed real dangers to our evolutionary ancestors, even though they play little role in modern-day mortality statistics in the West. We should be afraid of driving too fast in a car, of smoking cigarettes, of eating saturated fats, and so on. Our Halloween decorations should feature such elements prominently, but they don’t. Why? Because humans evolved to swiftly detect, respond to, and develop phobias of stimuli that posed a threat over thousands of generations. The dangers posed by fatty acids and cigarettes are evolutionarily novel and have left no impression in human DNA. When we thrill to supernatural monsters and giant spiders, we are thrilling to the ghosts of dangers past, ghosts that persist in the human central nervous system despite relaxed selection pressures.

Of course the scary costumes and props of Halloween are symbolic and don’t pose any real threat; they provide safe thrills, our love for which has roots deep in our mammalian heritage. Other mammalian infants also find great pleasure in forms of play that let them gain experience with life-threatening situations in a safe context. Children’s play often revolves around simulating dangerous situations. Witness an infant responding enthusiastically to a game of peek-a-boo, the most primal of horror situations, in which the primary caretaker disappears from the infant’s field of vision (and thus its world) for a few stress-inducing seconds … only to reappear suddenly, causing a mild startle reaction. Or witness any kid delightedly simulating being chased by a daddy- or mommy-monster in a session of chase play or hide-and-seek. Such activities serve adaptive functions: they give children experience with evasion techniques, they build locomotor skills and muscle tone, and they let children rehearse their own cognitive and emotional responses to situations that feel dangerous but aren’t. Such experience could become vital later in life, when they face truly dangerous situations or must confront and overcome their own fear.

Halloween has the potential to bring us into contact with our evolutionary heritage by confronting us with reflections of evolutionarily ancient, fear-inducing stimuli. Halloween is here to stay, so we might as well embrace it. When darkness falls, the monsters stir. That’s true of prehistory no less than of horror films—and on the last day of dark October, they all come out to play.
_________________________________

Post written by Dr Mathias Clasen for the BPS Research Digest. Clasen is assistant professor in literature and media at Aarhus University. He has published on evil in horror fiction, zombies, vampires, and the psychological functions and effects of horror across media, and is currently working on a project on post-apocalyptic science fiction in a biocultural perspective. Several of his writings are available at Academia.edu and Horror.dk.

--further reading--
31 terrifying links for Halloween
The lure of horror

Thursday, 30 October 2014

The psychology of "mate poaching" - when you form a relationship by taking someone else's partner

According to one estimate, 63 per cent of men and 54 per cent of women are in their current long-term relationships because their current partner "poached" them from a previous partner. Now researchers in the US and Australia have conducted the first investigation into the fate of relationships formed this way, as compared with relationships formed by two unattached individuals.

An initial study involved surveying 138 heterosexual participants (average age 20; 71 per cent were women) four times over nine weeks. All were in current romantic relationships that had so far lasted between 0 and 36 months. Men and women who said they'd been poached by their current partner tended to start out the study by reporting less commitment to their existing relationship, feeling less satisfied in it, committing more acts of infidelity and looking out for more alternatives. What's more, over the course of the study, these participants reported progressively lower levels of commitment and satisfaction in their relationships. They also showed continued interest in other potential romantic partners and persistent levels of infidelity. This is in contrast to participants who hadn't been poached by their partners - they showed less interest in romantic alternatives over time.

The researchers led by Joshua Foster attempted to replicate these results with a second sample of 140 heterosexual participants who were surveyed six times over ten weeks. Again the participants who said they'd been poached by their partners tended to report less commitment and satisfaction in their current relationships, and more interest in romantic alternatives. However, unlike the first sample, this group did not show deterioration in their relationship over the course of the study. The researchers speculated this may be because the study was too short-lived or because deterioration in these relationships had already bottomed out.

It makes intuitive sense that people who were poached by their partners showed less commitment and satisfaction in their existing relationship. After all, if they were willing to abandon a partner in the past, why should they not be willing or even keen to do so again? This logic was borne out by a final study of 219 more heterosexual participants who answered questions not just about the way their current relationship had been formed, but also about their personalities and attitudes.

Foster and his team summarised the findings: "individuals who were successfully mate poached by their current partners tend[ed] to be socially passive, not particularly nice to others, careless and irresponsible, and narcissistic. They also tend[ed] to desire and engage in sexual behaviour outside of the confines of committed relationships." The last factor in particular (measured formally with the "Socio-sexual Orientation Inventory-revised") appeared to explain a large part of the link between having been poached by one's partner and having weak commitment to the new relationship.

Across the three studies, between 10 and 30 per cent of participants said they'd been poached by their current partners. This shows again that a significant proportion of relationships are formed this way, the researchers said, and that more research is needed to better understand how these relationships function. "We present the first known evidence [showing] specific long-term disadvantages for individuals involved in relations that formed via mate poaching," they concluded.

_________________________________

Foster, J., Jonason, P., Shrira, I., Keith Campbell, W., Shiverdecker, L., & Varner, S. (2014). What do you get when you make somebody else’s partner your own? An analysis of relationships formed via mate poaching. Journal of Research in Personality, 52, 78-90. DOI: 10.1016/j.jrp.2014.07.008

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Wednesday, 29 October 2014

Friendly, conscientious people are more prone to "destructive obedience"

In Milgram's shock experiments, a surprising number of people obeyed a scientist's instruction to deliver dangerous electric shocks to another person. This is usually interpreted in terms of the power of "strong situations". The scenario, complete with lab apparatus and scientist in grey coat, was so compelling that many people's usual behavioural tendencies were overcome.

But a new study challenges this account. Recognising that many participants in fact showed disobedience to the scientist in Milgram's studies, Laurent Bègue and his colleagues have investigated what it is about an individual's character that influences the likelihood he or she will obey or not. Specifically, the researchers measured the Big Five personality factors of participants taking part in a quiz-show adaptation of the traditional Milgram situation.

Seventy-six adults (40 men) played the role of questioner in a pilot episode of a French TV show. A quiz host urged the participants to apply increasingly intense electric shocks to a quiz contestant each time the contestant answered a question incorrectly. In the standard version of the set-up, in which the host remained present, 81 per cent of participants obeyed instructions to administer the highest-level shock of 460 volts, marked "xxx".

Eight months later, the participants who played the role of questioner (and electrocutioner) were contacted again, ostensibly as part of a separate investigation, and asked if they would answer some survey questions about their personality and political beliefs. Thirty-five men and thirty women who'd taken part in the TV quiz agreed to answer these questions. The results showed that people who scored more highly on the personality traits of agreeableness and conscientiousness were more likely to be obedient in the Milgram-style situation. Meanwhile, describing oneself as left wing went hand in hand with greater disobedience, and, for women only, a history of having taken part in strikes or other acts of rebellion was also associated with more disobedience.

The researchers acknowledged there is a slim possibility that the TV quiz experience shaped participants' later personality scores. This issue aside, they said their findings showed how "destructive obedience" might actually be facilitated by "dispositions [agreeableness and conscientiousness] that are consensually desirable elsewhere with family and friends." Conversely, behaviours that may be considered disruptive in other contexts (such as political activism) "may express and even strengthen individual dispositions that are both useful and essential to the whole society, at least in some critical moments," they said.

_________________________________

Bègue, L., Beauvois, J., Courbet, D., Oberlé, D., Lepage, J., & Duke, A. (2014). Personality Predicts Obedience in a Milgram Paradigm. Journal of Personality. DOI: 10.1111/jopy.12104

--further reading--
More on Milgram in the Digest archive.

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Tuesday, 28 October 2014

What I don’t hear can’t hurt me: insecure managers avoid input from employees

Organisations do better when there are clear communication channels that allow staff to point out ways the company can improve. Similarly, teams who freely share ideas and concerns are more tight-knit and motivated. And their managers gain better awareness of problems, and a share of the praise for any improvements that pay off. So encouraging employee voice should be a no-brainer, especially for any manager who feels unsure of their ability to deliver solo. Yet according to new research, these insecure managers are the ones least likely to listen to and act on staff input.

Nathanael Fast and colleagues began with a survey of 41 managers and their 148 staff within a multinational oil company. Managers who rated themselves lower on managerial self-confidence (e.g. they disagreed with statements like “I am confident that I can perform effectively on many different tasks”) tended to have staff who were less likely to speak out, and who said this was because their manager did not encourage it. Why? A follow-up experiment aimed to find out.

One hundred and thirty-one employed participants (84 women) read an imaginary scenario in which they were the manager of an airline that was experiencing a rise in customer complaints. The scenario then described a meeting where the participant began announcing a solution. But before they had finished, an employee – a maintenance chief named Spencer – offered an alternative he argued was better for the airline in the long term.

The researchers found that whether participants heeded Spencer's advice depended on their confidence, which was manipulated at the start of the scenario. Some participants were told that they were performing impressively, others were told that people were questioning their competence. Those in the latter condition expressed lower faith in the maintenance officer’s expertise and showed less willingness to either implement his proposal or to seek help in the future from him or his colleagues.

The underlying cause appears to be the threat these employee ideas pose to low-confidence managers' egos. As people are loath to admit to such insecurities, the researchers didn’t directly measure them. Instead, they showed they could cancel the effect of low confidence by asking participants to complete a positive affirmation: a short writing exercise reminding themselves of their other positive qualities. As this intervention worked, it suggests that the root cause of managers ignoring staff advice was their own defensiveness and desire to protect their managerial status.

Accepting unsolicited feedback can be challenging for anyone. But “The Manager” is by definition on top of things, so gaps in awareness can be particularly threatening for people in that role. Self-confidence makes it easier to take that medicine and enjoy its benefits in the long term. But those anxious about their capability may be afraid of being unmasked, and turn away from sources of insight, at their own cost.

Here we see how the harms caused by self-doubt can spill over into a wider climate. Organisations could help new managers put aside unrealistic expectations of their need to be omniscient, and to recognise the benefits of putting the entire team brain to work. After all, better to have the Spencers of this world on your side than against you.

_________________________________

Fast, N., Burris, E., & Bartel, C. (2014). Managing to Stay in the Dark: Managerial Self-Efficacy, Ego Defensiveness, and the Aversion to Employee Voice. Academy of Management Journal, 57 (4), 1013-1034. DOI: 10.5465/amj.2012.0393

--further reading--
Self doubt turns bosses into bullies

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Monday, 27 October 2014

The psychology of violent extremism - digested

Today the UK and its allies are at war with an extremist group based in Syria and Iraq that calls itself the Islamic State (IS; a name rejected by mainstream Muslim organisations). The group declared a caliphate in June this year and is seeking to expand its territory.

Amnesty International has accused IS of war crimes including ethnic cleansing, torture, abductions, sexual violence and the indiscriminate killing of civilians. Prime Minister Cameron has branded the group "evil" and says they "pervert the Islamic faith as a way of justifying their warped and barbaric ideology."

Many of the fighters of the Islamic State are Western citizens. Indeed, this week there were reports that a fourth jihadist from Portsmouth, England, has died fighting for the Islamic State.

Never has it been more urgent that we understand why people are drawn to extremist beliefs and to violent extremist organisations. Here the Research Digest provides a brief overview of the psychological research and theories that help explain the lure of extremism.

The Need to Belong
A 2006 survey and interviews with British Muslims (cited by Andrew Silke, 2008) uncovered an important finding: people who felt their primary identity was Muslim, rather than British, held more sympathetic views towards the concept of jihad and martyrdom. Indeed, according to Randy Borum (2014) writing in Behavioural Sciences and the Law, a key psychological vulnerability of those drawn to extremism is their need to feel they belong. "In radical movements and extremist groups, many prospective terrorists find not only a sense of meaning," he writes, "but also a sense of belonging, connectedness and affiliation." A related idea is that extremist groups and their ideologies help people cope with uncertainty about themselves and the world.

Who Becomes an Extremist?
In 2006 Edwin Bakker published a review of hundreds of jihadi terrorists in Europe based on media and court reports. Of the 242 people Bakker identified, most were in their late teens or twenties, and just five were women. According to Silke (2008), most Islamist extremists are also from upper or middle-class backgrounds and tend to be well educated.

Most Extremists Are Not Mentally Ill
According to Borum (2014), "research suggests that knowledge of mental illness has little to offer professionals with operational responsibilities for preventing and dealing with terrorism." Silke (2008) agrees: "... the vast majority of research on terrorists has concluded that the perpetrators are not psychologically abnormal."

Extremism is Fuelled By a Group Process Known as "Risky Shift"
Many people are originally introduced to extremist ideologies through close-knit groups of friends. Within small groups of this kind, a classic psychological effect known as "risky shift" (or "group polarisation") frequently occurs. This is the tendency for groups to arrive at more extreme positions than any individual members would have done on their own.

Marginalisation and Perceived Injustice
Many would-be violent extremists bear grievances, sometimes a sense of humiliation (either personal or on behalf of their in-group) and a desire for revenge. At the same time, they feel that their needs and interests are not recognised by mainstream authorities. It's notable that in the UK and other Western countries, Muslims are massively under-represented in national parliaments. A 2009 paper, "Patterns of Thinking in Militant Extremism", analysed the mindset of many extremist groups around the world (based on internet and printed material), including the IRA and the Muslim Brotherhood. Two key beliefs recurred: that the established authorities are illegitimate, and that change can only be achieved through extreme and unconventional means.

Dehumanisation of Enemies
A shocking feature of the behaviour of many violent extremists is their total disregard for the value of other human lives. A relevant concept here is the way that people are able to "dehumanise" their enemies or those they see as unimportant - that is, to see them as somehow less than human. This ugly feature of human psychology has been shown in the context of brain responses to homeless people and drug addicts, and in connection with gang violence.

Existential Influences
For many people, extremist religious movements offer existential comfort. "... [E]xtremists and many so-called fundamentalists in all religions, use one of the most basic and often most destructive forms of defense," writes Gibbs (2005): "they repress the anxiety of nonbeing, splitting the self and filling the void with self-protective belief systems and structures ..." Also relevant here is "Terror Management Theory", which states that we respond to reminders of our mortality by entrenching our beliefs and deepening our cultural allegiances. A 2006 study found that Muslim Iranian students reminded of their own mortality subsequently expressed more support for their peers who believed in the legitimacy of suicide attacks against the US.

Violent Scriptures
It's well known that passages of the Koran and the Bible contain calls for violence. Theologians explain that these passages are not meant to be taken literally and need to be considered in context. Nonetheless, there remains the possibility that violent scripture incites aggression. A 2007 study put this to the test. Bushman and his colleagues found that students exposed to violent scripture subsequently exhibited more aggression, especially if they were religious believers.

Excitement, Danger and the Search for Meaning
"...the quest for personal significance constitutes a major motivational force that may push individuals toward violent extremism," write Arie W. Kruglanski et al in a 2014 paper. Silke (2008) similarly points out that in many communities, "joining a terrorist group increases the standing of a teenager or youth considerably." It's also important to recognise the lure of danger and excitement, especially to young disenfranchised men. Silke quotes a former IRA member reminiscing about his time as a terrorist: "I lived each day in a heightened state of alertness. Everything I did, however trivial, could seem meaningful."
_________________________________
   
Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Doing the "happy walk" made people's memories more positive

Walking in a happier style could help counter the negative mental processes associated with depression. That's according to psychologists in Germany and Canada who used biofeedback to influence the walking style of 47 university students on a treadmill.

The students, who were kept in the dark about the true aims of the study, had their gait monitored with motion capture technology. For half of them, the more happily they walked (characterised by larger arm and body swings, and a more upright posture), the further a gauge on a video monitor shifted to the right; the sadder their gait, the more it shifted leftwards. The students weren't told what the gauge measured, but they were instructed to experiment with different walking styles to try to shift the bar rightwards. This feedback had the effect of encouraging them to walk with a gait characteristic of people who are happy.

For the other half of the students, the gauge direction was reversed, and the sadder their gait, the further the gauge shifted to the right. Again, these students weren't told what the gauge measured, but they were instructed to experiment with their walking style and to try to shift the gauge rightwards as far as possible. In other words, the feedback encouraged them to adopt a style of walking characteristic of people who are feeling low.
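
In code terms, the feedback logic amounts to a signed mapping from gait style to gauge position. Here is a hypothetical sketch of that mapping; the score scale, function name and values are our invention for illustration, not the authors' actual software:

```python
# Hypothetical reconstruction of the biofeedback gauge logic (our sketch,
# not the authors' code). A motion-capture-derived gait "happiness" score
# drives the on-screen gauge, with the sign flipped in one condition.

def gauge_deflection(happiness_score: float, reversed_condition: bool) -> float:
    """Map a gait happiness score in [-1, 1] (sad to happy) to a gauge
    position in [-1, 1] (full left to full right)."""
    return -happiness_score if reversed_condition else happiness_score

# Participants only see the gauge and try to push it rightwards, so the
# feedback covertly shapes their gait towards a happy or sad walking style.
print(gauge_deflection(0.6, reversed_condition=False))  # 0.6: gauge right
print(gauge_deflection(0.6, reversed_condition=True))   # -0.6: gauge left
```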

After four minutes of gait feedback on the treadmill, both groups of students were asked how well each of forty positive and negative emotional words described their own personality. This quiz took about two minutes, after which the students continued for another eight minutes trying to keep the gait feedback gauge deflected to the right. The students' final and crucial task on the treadmill was to recall as many of the earlier descriptive words as possible.

The striking finding is that the students who were unknowingly guided by feedback to walk with a happier gait tended to remember more positive than negative self-referential words, as compared with the students who were guided to walk with a more negative style. The happy walkers recalled an average of 6 positive words and 3.8 negative words, compared with the sad walkers' averages of 5.47 positive words and 5.63 negative words. The students who achieved the happiest gait style recalled three times as many positive words as those who achieved the saddest.

"Our results show that biased memory towards self-referent negative material [a feature of depression] can be changed by manipulating the style of walking," said the research team led by Johannes Michalak. The observed effects of gait on memory were not accompanied by any group differences in the students' self-reported mood at the end of the study, suggesting a direct effect of walking style on emotional memory processes.

The results build on past research suggesting that pulling a happy facial expression can lift people's mood. There could be exciting practical implications for helping people with depression, but the researchers acknowledged some issues that need to be addressed. For example, the current study involved a small non-clinical sample, and the researcher who delivered the forty emotional words to the walking students was not blind to the gait condition they were in, raising the possibility that he or she inadvertently influenced the results in some way. It's also notable that there was no data from a baseline control group whose gait was not influenced; it would have been useful to see how such a group performed on the memory test.
_________________________________

Michalak, J., Rohde, K., & Troje, N. (2015). How we walk affects what we remember: Gait modifications through biofeedback change negative affective memory bias. Journal of Behavior Therapy and Experimental Psychiatry, 46, 121-125. DOI: 10.1016/j.jbtep.2014.09.004

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Saturday, 25 October 2014

Link feast

Our pick of the best psychology and neuroscience links from the past week or so:

Brain Games Exploit Anxieties About Memory Loss For Profit – Scientists
A group of over 70 psychologists and neuroscientists has written an open letter warning that the claims of brain training companies are unsubstantiated, and that playing the games could divert people from healthier activities.

Free Journal Articles on the Psychology of Violence and Aggression
A digital giveaway from the publishers Psychology Press.

Beware, Playing Lots of Chess Will Shrink Your Brain!
A new study compares the brain structure of chess grandmasters and amateurs.

Coma Songs
From BBC Radio 3: A meditation on the cultural representation of comas through music, poetry and interviews with the families of people who have suffered a brain injury.

"Just Because Something Mentions the Word 'Brain' Doesn't Mean It's Necessarily Valid Neuroscience"
A video of Professor Dorothy Bishop's recent conference talk on the increasingly popular field of "educational neuroscience".

Are Women Better Decision Makers?
A round-up of recent research findings suggests that, in stressful situations, women make better decisions than men.

Brain Baloney Has No Place in the Classroom
Pete Etchells reports on a worrying new study that found strong endorsement of neuromyths by teachers around the world.

Social Anxiety: Why The Mundane Can Be Terrifying
Guardian blogger Dean Burnett with some personal reflections on extreme shyness.

The Real Crisis in Psychiatry is That There Isn’t Enough of It
President of the Royal College of Psychiatrists Simon Wessely lampoons the idea that psychiatrists are agents of government control, and argues instead that the real problem with psychiatry is a lack of funding and services.

This is What Developing Acute Schizophrenia Feels Like
Moving, graphic first-person account of a young man's descent into a psychotic episode and his subsequent recovery.
_________________________________
   
Post compiled by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Friday, 24 October 2014

Publication bias afflicts the whole of psychology

In the last few years the social sciences, including psychology, have been taking a good look at themselves. While instances of fraud hit the headlines, pervasive issues are just as important to address, such as publication bias: the phenomenon whereby non-significant results never see the light of day, whether because editors reject them or because savvy researchers recast their experiments around unexpected results and leave the disappointments unreported. Statistical research has shown the extent of this misrepresentation in pockets of social science, such as specific journals, but a new meta-analysis suggests the problem may infect the entire discipline of psychology.

A team of psychologists based in Salzburg looked at “effect sizes”, which provide a measure of how much experimental variables actually change an outcome. The researchers randomly sampled the PsycINFO database to collect 1000 psychology articles across the discipline published in 2007, and then winnowed the list down to 395 by focusing only on those that used quantitative data to test hypotheses. For each main finding, the researchers extracted or calculated the effect size.

Studies with lots of participants (500 or more) had an average effect size in the moderate range (r = .25). But studies with smaller samples tended to have formidable effect sizes, as high as r = .48 for studies with under 50 participants. This resulted in a strong negative relationship between number of participants and size of effect, when statistically the two should be unrelated. As studies with more participants make more precise measurements, r = .25 is the better estimate of a typical psychology effect size, so the higher estimates suggest some sort of inflation.
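
To see how such inflation can arise mechanically, consider a minimal simulation (our sketch, not the paper's analysis). It assumes the same true effect of r = .25 everywhere, "publishes" only statistically significant studies, and then checks the relationship between sample size and published effect size; all parameter values are illustrative assumptions:

```python
# Minimal, illustrative simulation of publication bias (our sketch, not the
# paper's method): a true effect of r = .25, a p < .05 filter, and the
# standard Fisher-z approximation for the sampling error of a correlation.
import numpy as np

rng = np.random.default_rng(42)

true_r = 0.25
ns = rng.integers(20, 501, size=20000)       # study sample sizes
se = 1 / np.sqrt(ns - 3)                     # standard error of Fisher z
z_obs = rng.normal(np.arctanh(true_r), se)   # each study's observed Fisher z
r_obs = np.tanh(z_obs)                       # back-transformed to r

published = np.abs(z_obs) / se > 1.96        # crude two-tailed p < .05 test of r = 0

# Among "published" studies only, small samples show inflated effects,
# producing the negative correlation the meta-analysis observed:
print(np.corrcoef(ns[published], np.abs(r_obs[published]))[0, 1])
```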

The authors, led by Anton Kühberger, argue that the literature is short on modest effect sizes, thanks to the non-publication of non-significant findings (rejection by journals would be especially plausible for non-significant smaller studies), and over-supplied with spurious large effects, due to researchers retrospectively constructing their papers around surprising effects that were only stumbled across thanks to inventive statistical methods.

The analysts rejected one alternative explanation. To detect a powerful effect, a small sample is sufficient, so researchers who anticipate a big effect on the basis of an initial "power analysis" might deliberately plan on small samples. But only 13 per cent of the papers in this report mentioned power, and the pattern of correlation in those specific papers appears no different from that found in the ones that never mention power. Moreover, the authors of the original 1000 articles were surveyed as to what they expected the relationship between effect size and sample size to be. Many respondents expected no relationship, and even more expected that studies with more participants would have larger effects. This suggests that an up-front, principled power-analysis decision is unlikely to have been driving the main result.
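
For intuition about this alternative explanation, here is what such a power analysis looks like in practice; a rough sketch using the standard Fisher-z approximation, with the effect sizes taken from the figures above and everything else our illustration:

```python
# Rough power-analysis sketch (ours): participants needed to detect a
# correlation with 80% power at two-tailed alpha = .05, via Fisher's z.
import math

def n_required(r, z_alpha=1.96, z_beta=0.8416):
    return math.ceil(((z_alpha + z_beta) / math.atanh(r)) ** 2) + 3

print(n_required(0.48))  # ~32: a big anticipated effect justifies a small study
print(n_required(0.25))  # ~124: a moderate effect needs far more participants
```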

Kühberger and his co-analysts recommend that in future we give more weight to how precise study findings are likely to be, by considering their sample size. One way of doing this is to report a statistic that takes sample size into account, the “confidence interval”, which describes an effect size not as a single value but as a range within which we can be confident the true effect size falls. As we all want to maintain confidence in psychological science, it’s a recommendation worth considering.
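
As a concrete sketch of the recommended statistic, here is one standard way to compute a 95% confidence interval for a correlation, via the Fisher z-transformation (the example values, r = .25 with n = 50 and n = 500, are ours):

```python
# Minimal sketch (ours) of a 95% confidence interval for a correlation,
# computed via the standard Fisher z-transformation.
import math

def r_confidence_interval(r, n, z_crit=1.96):
    z = math.atanh(r)                  # Fisher z-transform of r
    se = 1 / math.sqrt(n - 3)          # standard error of z
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

print(r_confidence_interval(0.25, 50))   # roughly (-0.03, 0.49): imprecise
print(r_confidence_interval(0.25, 500))  # roughly (0.17, 0.33): much tighter
```

Note how the small study's interval spans everything from no effect to a large one, which is exactly why the authors urge readers to weight findings by their precision.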

_________________________________

Kühberger, A., Fritz, A., & Scherndl, T. (2014). Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size. PLoS ONE, 9 (9). DOI: 10.1371/journal.pone.0105825

--further reading--
Questionable research practices are rife in psychology, survey suggests
Serious power failure threatens the entire field of neuroscience
Made it! An uncanny number of psychology findings manage to scrape into statistical significance
Fake data or scientific mistake?

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
