People are “consistently inconsistent” in how they reason about controversial scientific topics

By Christian Jarrett

There are various issues on which there is a scientific consensus but great public controversy, such as anthropogenic climate change and the safety of vaccines. One previously popular explanation for this mismatch was that an information deficit among the public is to blame. Give people all the facts and then, according to this perspective, the public will catch up with the scientists. Yet time and again, that simply hasn’t happened.

A new paper in Thinking and Reasoning explores the roots of this problem further. Emilio Lobato and Corinne Zimmerman asked 244 American university students and staff whether they agreed with the scientific consensus on climate change, vaccines, genetically modified (GMO) foods and evolution; to give their reasons; and to say what would convince them to change their position.

Past research has already done a good job of identifying the individual characteristics – such as having an analytical thinking style and being non-religious – that tend to correlate with accepting the scientific consensus, but this is the first time that researchers have systematically studied people’s open-ended reasoning about controversial scientific topics. The results show that for many people, there are certain issues for which the truth is less about facts and more about faith and identity.

Lobato and Zimmerman found that the most common response people gave for their positions on the controversial issues was simply to re-state or qualify their belief – what the researchers called a non-justification – which made up 34 per cent of all responses. Among the actual justifications, the most common kind, making up 33 per cent of all responses, was to cite evidence – a promising result. However, 20 per cent of justifications were subjective, involving reference to one’s cultural identity, personal experience or fallacious reasoning.

The specific kinds of subjective justification given tended to vary according to topic. For instance, sceptical attitudes toward the scientific consensus on genetically modified foods (as safe) tended to involve fallacious reasoning of a conspiratorial bent (such as “I am hesitant to believe there are NO concerns because the multinational agricultural corporations such as Monsanto have profits as the basis of their existence so any information they put out is suspect”), or they invoked the naturalistic fallacy (such as, “GMO foods are unnatural and therefore not safe for our bodies”). In contrast, subjective justifications for disbelieving evolutionary theory were more likely to make reference to the importance of one’s cultural or religious identity.

Probably the most significant finding, though, was people’s inconsistency in how they reasoned about the four different scientific topics. For example, while many participants did cite data and evidence to justify their stance on some occasions, only 27 participants (11 per cent) did this consistently across all the topics. “It seems that people are consistently inconsistent in how they reason about scientific topics,” the researchers said.

It was a similar story when it came to the kinds of reasons participants said would cause them to change their minds. While it was promising that new evidence or data was mentioned more often than any other reason, 45 per cent of participants still explicitly denied, at least once, that anything could change their mind on a particular topic; 17 per cent said this of more than one topic. And while 80 per cent of participants indicated that new evidence or data would change their mind on a particular topic, not a single participant took this position on all four controversial topics. Again, the main message is people’s inconsistency in how they reason about different scientific topics.

Confirming prior research on individual characteristics that correlate with scientific reasoning, participants with more of an analytical reasoning style (as measured by agreement with questionnaire items like “I enjoy problems that require hard thinking”) and a stronger liberal political orientation were more likely to agree with the scientific consensus on the four topics, and to make reference to evidence when reasoning about their position.

Lobato and Zimmerman cautioned that it remains to be seen whether their findings would generalise to scientific topics that have not (yet) been politicised. Of course the current results are also based on a narrow US sample and may be different elsewhere. Notwithstanding these limitations, the researchers said their findings could have implications for scientific advocates: they suggest it may be beneficial to discuss and present topics in a way that reduces any association with one’s audience’s socio-political identity; and they highlight the importance of better training and education in an analytical thinking style.

“Being able to tailor education about science to the manner in which people think about science may improve scientific literacy,” the researchers concluded, “but doing so requires more research into why people hold the beliefs they do about science.”

Examining how people reason about controversial scientific topics

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

17 thoughts on “People are ‘consistently inconsistent’ in how they reason about controversial scientific topics”

  1. It would be interesting to know whether anyone cited the fact that science is an ongoing quest to discover more – rarely is any study the final say on a topic. In reality, science is a bunch of hypotheses waiting to be confirmed/disproved by other scientists, as any good scientist would have to agree. Science is great at giving us clues, but it doesn’t really give us solid facts. I like to see science as a guideline from which others can do further research. It is a fascinating topic for all of us who want a starting point from which we can explore our own world.

    1. I think some more precision in terminology would be of help here, so as to not fall prey to the faulty thinking that is evidenced by subjects in this study.

      Science is not “a bunch of hypotheses waiting to be confirmed/disproved by other scientists”. That is a circularly defined self-referential system.

By the way, even if “good scientists” “have to” agree with that definition, it would not establish it. Meanings of words are to some extent determined by consensus. Principles of organized systems are not, so much (constructivist quibbling set aside for the moment).

Science is more a MEANS of “confirming/disproving” hypotheses (though those are brittle terms that a scientist would not use; instead, hypotheses are not confirmed but supported, and are not disproven but unsupported and eventually discarded).

      That is, science is more carefully thought of as a method of “doing research”, as you put it. Better said, it is a means of gaining knowledge and understanding that is reliable and verifiable. The scientific method may not invariably give us “solid facts”, but it leads us as close to them as we can get.

There is a confusion here between science as a discipline, and the colloquial usage offered by Google as “a systematically organized body of knowledge”, which I could elaborate on as the ever-improving results of scientific work that are nevertheless sometimes flawed (due to errors or shortcomings of the work more than the method itself) and never complete.

      In other words, science is NOT the same as knowledge or concepts derived from best attempts at science. You can throw out the latter through better attempts. But because science as a method is self-improving, you cannot throw out the baby of science itself with the bathwater of “scientific” results. Any criticism of science as a method that is valid is (eventually) taken in and becomes part of science.

      One may use science as a “guideline”, but the question is how one will then “do further research”. If we “explore our own world” without relying on science, the tendency of humans is to look for “evidence” (for which we often mistake mere opinion of others) that confirms our own views.

      The inability to recognize our own ignorance, bias, limited perceptions, and faulty reasoning, along with a democratic ethos that is generally admirable but incorrectly applied, leads us to believe that our own naive “research” is equivalent or superior in validity to that of experts. But it just isn’t so.
