If you want to improve your tennis swing, learn how to repair your car, or master the piano, you’re likely to seek the help of an expert tutor. Similarly, many people who want to sharpen up their critical thinking skills turn to one of the many books written by philosophers to help lay people identify and avoid the biases and failures of logic that cause us to be, in the words of psychologist Dan Ariely, "predictably irrational".
But what if philosophers are just as susceptible to bad – or at least not entirely rational – thinking as the rest of us?
Evidence that this might be the case began to emerge in 2012, when Eric Schwitzgebel and Fiery Cushman reported a study that presented professional philosophers and two comparison groups (academic non-philosophers and non-academics) with a series of moral scenarios, varying the order in which the scenarios were read to see whether that order made any difference to how they were judged, from moral to immoral. The guiding idea was that if scenario A is judged worse than scenario B on rational, philosophical grounds, then it shouldn’t matter whether you read about A before B, or vice versa.
It was known that ordinary people’s moral judgments are affected by the order in which they encounter moral dilemmas, and Schwitzgebel and Cushman confirmed this. For example, many people would consider a drunk driver who passes out, loses control and kills a pedestrian to be more morally blameworthy than a similarly intoxicated driver who hits a tree. Yet people who read about the tree case before the killed pedestrian case are more likely to rate the two scenarios as equivalent in terms of moral blameworthiness than when they encounter them in the opposite order. Surprisingly, Schwitzgebel and Cushman found that professional philosophers were swayed by order too, even those with training in ethics (although, curiously, the order effect went in the opposite direction).
This was a disappointing result for the philosophers, but to be fair, it’s possible that irrational order effects might be lessened among philosophers with expertise in the specific scenarios used in the experiments, or who report stable opinions on these scenarios. Philosophers might also be less influenced by irrelevant factors like order of presentation, or the specific framing of dilemmas, if the scenarios only differ in phrasing and not content. Finally, these symptoms of irrationality might abate if philosophers are instructed to take time to reflect on the scenarios, and to consider alternative phrasings of them.
Now Schwitzgebel and Cushman have tested these possibilities, as reported in Cognition. The pair recruited, via email, 497 philosophers (97 per cent of whom held PhDs, with 21 per cent describing themselves as professors with a specialisation in ethics), and 921 non-philosophers (of whom 87 per cent held a PhD in subjects broadly representative of academia as a whole).
To further probe order effects, participants were presented with variants of the famous "trolley dilemma" in different orders. In addition, Schwitzgebel and Cushman looked at whether philosophers were susceptible to framing effects – another well-documented kind of irrationality – by having them make decisions about options for tackling a contagious disease or a nuclear threat that were logically identical but differed in how they were phrased, or framed. (See box: "What scenarios tripped up philosophers?" for more details.)
Half of the participants were put in a "reflection condition", in which they were encouraged, before being presented with the moral dilemmas or framing scenarios, to take time to think carefully about these cases, and afterwards were forced to wait at least 15 seconds before giving their answers. They were also asked to consider alternative ways of framing these scenarios, to provide an opportunity to overcome their immediate reactions. In other words, the researchers did all they could to encourage the philosophers to draw on their knowledge and experience.
None of this made any difference. Echoing their earlier findings, Schwitzgebel and Cushman found that, once again, philosophers were as vulnerable to order and framing effects as everyone else, even under the favourable circumstances of the reflection condition, and despite some of the participants considering themselves to have specific expertise in ethics and framing effects.
Schwitzgebel and Cushman confess to being surprised by these results. The findings not only cast doubt on the supposed expertise of philosophers, even on issues within their speciality, but are also dispiriting for attempts to help ordinary people overcome their mental biases. Taking time to think about dilemmas, considering alternative ways of construing situations, gaining expertise in the issues at hand, and having a solid grounding in logical reasoning are all often touted as ways to achieve this end. Yet these results suggest that none of these factors – all exemplified by at least some of the philosophers in this study – makes much of a difference.
It’s possible that it simply takes an exceptional philosopher to overcome their mental biases and irrationality. But as Schwitzgebel and Cushman conclude, “if there is a level of philosophical expertise that reduces the influence of factors such as order and frame upon one’s moral judgments, we have yet to find empirical evidence of it.”
Schwitzgebel, E., & Cushman, F. (2015). Philosophers’ biased judgments persist despite training, expertise and reflection. Cognition, 141, 127-137. DOI: 10.1016/j.cognition.2015.04.015
Post written by Dan Jones (@MultipleDraftz) for the BPS Research Digest. Dan Jones is a freelance writer based in Brighton, UK, whose writing has appeared in The Psychologist, New Scientist, Nature, Science and many other magazines. He blogs at www.philosopherinthemirror.wordpress.com. Check out his previous contributions to the BPS Research Digest.