Social psychology is reeling from its second research scandal in less than a year, after Erasmus University Rotterdam announced the withdrawal of two articles by one of its senior social psychologists. The problematic papers were identified by a ‘Committee for Inquiry into Scientific Integrity’ (chaired by Rolf Zwaan, a psychologist in the University’s Brain and Cognition lab), which was set up to investigate concerns raised about the work of Dirk Smeesters. Among the Inquiry’s recommendations was a call for greater regulation of the field of marketing and, ‘to a lesser extent’, social psychology.
Smeesters, who was Professor of Consumer Behaviour and Society in the Rotterdam School of Management, was found guilty by the Inquiry of ‘data selection’ and failing to keep suitable data records. Smeesters resigned his post after admitting to using a ‘blue dot technique’ whereby, after obtaining a null result, he omitted participants who failed to read the instructions properly (7 to 10 per study, he claims), thus lifting the findings into statistical significance – a procedure he failed to detail in his affected papers. However, Smeesters blamed the unavailability of his raw data on nothing more heinous than a computer crash and a lab move. The Inquiry said it ‘doubted the credibility’ of these reasons.
The affected papers pertained to social priming and past selves and appeared in the Journal of Personality and Social Psychology, published by the APA, and the Journal of Experimental Social Psychology, published by Elsevier. A third affected paper had only reached the submission stage. The Inquiry found no evidence of wrongdoing by Smeesters’ co-authors, although there’s no doubt they are suffering from the fallout (at least one of them has posted his feelings online).
These latest revelations come in the wake of the case of Diederik Stapel, a senior social psychologist at Tilburg University, who last year admitted to fabricating the results behind several dozen published studies (see December news, 2011). Smeesters has kept a low profile since the scandal broke, but he surfaced late in June to tell the Dutch newspaper Algemeen Dagblad that he was ‘no Stapel’ – his data was not fabricated; he had made a scientific mistake. Stapel and Smeesters reportedly never worked together.
Concerns were first raised about Smeesters’ work by Uri Simonsohn, a social psychologist at The Wharton School, University of Pennsylvania. Simonsohn has developed a statistical technique for detecting massaged data, details of which are contained in an as yet unpublished paper with the working title ‘Finding Fake data: Four True Stories, Some Stats, and a Call for Journals to Post All Data’ (criticisms of the technique have surfaced online). Simonsohn contacted Smeesters to request his raw data, and later reported his findings to Smeesters’ head of school, which ultimately led to the Inquiry.
According to the Inquiry’s report (pdf), Simonsohn’s technique flags dubious data by checking whether the means of experimental groups drawn from the same population vary less than sampling error would predict – honest samples should produce means that differ by chance, so means that are implausibly similar arouse suspicion. With the aid of two statistical experts, the Erasmus University Inquiry applied Simonsohn’s algorithm to 22 of Smeesters’ 29 papers published or submitted since 2007 – those for which the necessary data were available – which led to the identification of the three suspect papers (the technique was also applied to a random selection of four comparable control papers by others in the field and no anomalies were found).
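The intuition behind the technique can be illustrated with a toy simulation (a sketch of the general idea only, not Simonsohn’s actual algorithm; the function name and parameters are hypothetical). If k groups of n participants are honestly sampled from one population with standard deviation σ, each group mean scatters with standard error σ/√n – so one can estimate how often chance alone would produce group means at least as tightly clustered as those observed:

```python
import random
import statistics

def too_similar_pvalue(group_means, n_per_group, pop_sd, sims=10000, seed=0):
    """Estimate how often honestly sampled group means would cluster at
    least as tightly as the observed ones (hypothetical illustration)."""
    rng = random.Random(seed)
    observed_spread = statistics.pstdev(group_means)
    k = len(group_means)
    se = pop_sd / n_per_group ** 0.5  # standard error of each group mean
    count = 0
    for _ in range(sims):
        # Simulate k honest group means; spread is location-invariant,
        # so centring them on zero is harmless.
        sim_means = [rng.gauss(0, se) for _ in range(k)]
        if statistics.pstdev(sim_means) <= observed_spread:
            count += 1
    return count / sims

# Four condition means from groups of 25, population SD of 10:
# means this close (5.0, 5.1, 4.9, 5.05) almost never arise by chance,
# whereas a spread like (3, 5, 7, 6) is entirely ordinary.
p_tight = too_similar_pvalue([5.0, 5.1, 4.9, 5.05], n_per_group=25, pop_sd=10.0)
p_normal = too_similar_pvalue([3.0, 5.0, 7.0, 6.0], n_per_group=25, pop_sd=10.0)
```

A very small p-value here says only that the means are more alike than sampling error predicts; as the online criticisms of the technique note, that is a prompt for scrutiny, not proof of fraud.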
Concerns were also raised about data anomalies in a fourth paper published by Smeesters and co-authors in the Journal of Consumer Research. In relation to this paper, the Inquiry stated that it had found a file on Smeesters’ network disk that should not have been there, given his description of how the data were collected. The Inquiry states it ‘cannot rule out that Smeesters used the … file to manipulate the raw data before sending these’ to his data-analyst.
This isn’t the first time the whistleblower Simonsohn has taken an interest in research integrity. Last year he co-authored a paper ‘False-positive psychology’ in Psychological Science, in which he and his colleagues demonstrated the ease with which false-positive results can be obtained by indulging in research practices that occupy a grey area of acceptability, such as adding more participants to a subject pool in search of a significant finding. A paper published in May this year in Psychological Science (but detailed on the Research Digest blog last December) surveyed 6000 US psychologists about practices in this ‘grey zone’ and found that 58 per cent admitted excluding data post-hoc and 35 per cent had doubts about the integrity of their own research. Smeesters told the Inquiry that he doesn’t feel guilty because many authors in his field knowingly omit data to achieve significance.
Early in July, Simonsohn gave an interview to Nature in which he claimed to have identified a third case of scientific misconduct that’s yet to be made official, and a fourth that’s not been acted upon. He said he was motivated to act in these cases by the fact that ‘it is wrong to look the other way’, but he stressed he hadn’t taken justice into his own hands – he was careful to pass things over to the appropriate authorities. ‘If it becomes clear that fabrication is not an unusual event,’ he said, ‘it will be easier for journals to require authors to publish all their raw data. It’s extremely hard for fabrication to go undetected if people can look at your data.’