People Who Trust Science Are Less Likely To Fall For Misinformation — Unless It Sounds Sciencey

By Matthew Warren

“Trust the science” is the kind of refrain commonly uttered by well-meaning individuals looking to promote positive, scientifically backed change, such as encouraging action against climate change or improving uptake of vaccines. The hope is that if people are encouraged to trust science, they will not be duped by those promoting the opposite agenda — one which often flies in the face of scientific evidence. But are people actually less likely to fall for misinformation when they have trust in science?

Yes and no, according to a new study in the Journal of Experimental Social Psychology. Thomas C O’Brien and colleagues from the University of Illinois Urbana-Champaign find that people with greater trust in science are generally less likely to believe misinformation. But when that misinformation is presented with scientific-sounding content to back it up, they become more easily duped by it.

In the first study, 532 online participants read an article about the “Valza virus”, which stated that the virus had been created as a bioweapon in a government lab and subsequently covered up. The article was written informally, in a style meant to imitate real online posts made by conspiracy theorists. In one condition, the article cited scientists who said that studies in their own lab proved that there had been a conspiracy; in the other, the article instead quoted activists.

Participants then answered questions about how much they believed the article (e.g. to what extent they agreed it was “credible” or “probably true”) and whether they felt it should be shared with a class studying current affairs. (They also answered other questions about their comprehension of the text, in order to obscure the purpose of the study.) Finally, participants rated their own trust in science, responding to statements like “scientific theories are trustworthy”, and answered questions that probed their understanding of scientific methodology.

The team found that, overall, people with a greater trust in science and/or a stronger understanding of methodology were less likely to believe the conspiracy theory. But the article’s content made a difference: people who had a high trust in science were more likely to believe the article if it had quoted scientists than if it had not. Similarly, this group was more likely to say that the article should be shared with the current affairs class if it had apparently scientific content. For people with lower levels of trust in science, whether or not the article had cited scientists made no difference to their beliefs or intentions to share it.

A subsequent study looked at people’s belief in a conspiracy theory about genetically modified foods, and this time used real articles and pictures from websites. All participants read that genetically modified foods caused tumours, and that this was being covered up. In one condition, they saw articles that referred to an actual scientific paper which supported the theory (and which, unknown to participants, has since been heavily criticised and retracted). In the other condition, they again read arguments by activists, without any scientific content.

The results were the same: although, overall, people who trusted science were less likely to believe this conspiracy theory, their levels of belief increased when the article contained supposedly scientific content.

In a final study, the team found that people were less likely to believe the conspiracy about genetically modified foods when they had first been asked to think about times when it is important to critically evaluate evidence, compared to when they had been asked to think of times when science had benefited humanity.

The research suggests that trust in science can actually increase people’s vulnerability to pseudoscience, the authors write. Broad campaigns to promote trust in science may therefore not be that useful; instead, it may be more beneficial to promote critical analysis skills.

That said, there is clearly some benefit to trusting science: people with low trust in science tended to believe the conspiracy theories regardless of their content, while those with high trust only showed a tendency to believe them if they contained some (pseudo)scientific content. So it seems likely that those broad campaigns could still have some positive effects. Still, to help people spot misinformation it seems important to teach them to critically evaluate evidence and understand that science isn’t infallible.

Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation

Matthew Warren (@MattBWarren) is Editor of BPS Research Digest