With science in mind, morality increases. Credit: Stockbyte/PunchStock/Getty Images

An article by Scientific American.

Public opinion towards science has made headlines over the past several years for a variety of reasons — mostly negative. High-profile cases of academic dishonesty and disputes over funding have left many questioning the integrity and societal value of basic science, while accusations of politically motivated research fly from left and right. There is little doubt that science is value-laden. Allegiances to theories and ideologies can skew the kinds of hypotheses tested and the methods used to test them. These, however, are errors in the application of the method, not flaws in the method itself. In other words, public opinion towards science in general may be relatively unaffected by the misdeeds and biases of individual scientists. In fact, given the undeniable benefits scientific progress has yielded, associations with the process of scientific inquiry may be quite positive.

Researchers at the University of California, Santa Barbara set out to test this possibility. They hypothesized that there is a deep-seated perception of science as a moral pursuit — its emphasis on truth-seeking, impartiality and rationality privileges collective well-being above all else. Their new study, published in the journal PLOS ONE, argues that the association between science and morality is so ingrained that merely thinking about it can trigger more moral behavior.

The researchers conducted four separate studies to test this. The first sought to establish a simple correlation between the degree to which individuals believed in science and their likelihood of enforcing moral norms when presented with a hypothetical violation. Participants read a vignette describing a date rape and were asked to rate the “wrongness” of the offense before answering a questionnaire measuring their belief in science. Indeed, those reporting greater belief in science condemned the act more harshly.

Of course, a simple correlation is susceptible to multiple alternative explanations. To rule these out, Studies 2–4 used experimental manipulations to test whether inducing thoughts about science could influence both reported and actual moral behavior. All made use of a technique called “priming,” in which participants are exposed to words relevant to a particular category in order to increase its cognitive accessibility. In other words, showing you words like “logical,” “hypothesis,” “laboratory” and “theory” should make you think about science, and any effect the presentation of these words has on subsequent behavior can be attributed to the associations you have with that category.

Participants first completed a word-scramble task in which they had to unscramble either some of these science-related words or words that had nothing to do with science. They then either read the date-rape vignette and answered the same questions about the severity of the transgression (Study 2), reported the degree to which they intended to perform a variety of altruistic actions over the next month (Study 3), or played a behavioral-economics task known as the dictator game (Study 4). In the dictator game, the participant is given a sum of money (in this case $5) and told to divide it however they please between themselves and an anonymous other participant. The amount given to the other participant is taken as an index of altruistic motivation.

Across all these different measures, the researchers found consistent results. Simply being primed with science-related thoughts increased a) adherence to moral norms, b) real-life future altruistic intentions, and c) altruistic behavior towards an anonymous other. The conceptual association between science and morality appears strong.


Though this finding replicates across different measures and methods, one variable might limit the generalizability of the effect. There is some evidence that attitudes towards science vary across political parties, with conservatives having grown less trusting of science over the past several decades. The researchers did include measures of religiosity in their studies, and it did not affect the relationship between science and morality, but ideally they would also have controlled for political affiliation. It’s not a stretch to imagine that undergraduate students at the University of California, Santa Barbara disproportionately represent liberals. If so, the relationship between science and morality found here might be stronger among self-described liberals.

That said, there’s also reason to believe that the general public, liberal or conservative, can draw a distinction between the scientific process and its practitioners. In the same way that people might mistrust politicians yet still see nobility in the general organizing principles of our political structure, we could hold charitable views of science independent of how individual scientists conduct it.

These results might seem encouraging, particularly to fans of science. But one possible cost of assigning moral weight to science is the degree to which it distorts the way we respond to research conclusions. When faced with a finding that contradicts a cherished belief (e.g. a new study suggesting that humans have, or have not, contributed to global warming), we are more likely to question the integrity of the practitioner. If science is fundamentally moral, then how could it have arrived at such an offensive conclusion? Blame the messenger.

How can we correct this thought process? A greater emphasis on, and better understanding of, the method might do the trick. It’s significantly harder to deny the import of challenging findings when you have the tools necessary to evaluate the process by which scientists arrived at their results. That new study on global warming is tougher to dismiss when you know (and care enough to check) that the methods used are sound, regardless of what you think the authors’ motivations might be. In the absence of such knowledge, the virtue assigned to “science” might also be a motivational force for ideological distortion, the precise opposite of impartial truth-seeking.