Science-vs.-Religion Rigidity May Help Perpetuate Creationism, Other Antiscience Beliefs

EXPLAINED: Why Anti-Creationism Arguments Often Fail

By Kyle Hill

The Earth is flat. A full moon leads to more crime. Humans were created less than 10,000 years ago.

If you made your way through even the most general of science educations, the above statements should strike you as suspect. Having a Copernican worldview challenged by such a statement, for example, may encourage you to take a quick look at various sources of information to stabilize your psyche. When contradictory information shakes our foundations, how do we respond?

A few weeks ago I wrote a critique of an evaluation of a video by science educator Bill Nye that sparked a debate about how best to communicate a scientific position to a resistant public. If nearly half of Americans hold a creationist worldview, is Bill Nye’s video against teaching creationism to children an effective way to voice the scientific position to a wide audience? (Nye’s video had 4.5 million views at the time of this writing.) I feel an informed approach is lacking in discussions like these. Communication research offers a rich science of persuasion. Here I want to deal with perhaps the most interesting case: persuading an ideologically entrenched audience that a worldview is incorrect.

Information Processing and Defensive Motivations

A goal of good science communication is to inform attitudes. How the public judges the safety of vaccinations, the nutritional value of genetically modified foods, or the veracity of evolution depends on sound explication. Especially today, when people can instantly get any information they want, it is essential for science communicators to understand how the information people are finding is changing their attitudes.

Research in the information processing paradigm asks how information, filtered through our cognition, shapes attitudes. Whether we are trying to reach an informed decision, develop our values, or simply learn, information processing theories describe the routes we take. The most successful theories in this field divide human information processing into two modes: we process information either heuristically or systematically (also called System 1 and System 2, or peripheral and central processing). In heuristic processing we rely on cues and gut instincts. For example, if we read an article by a NASA scientist and do not have the capacity to evaluate the content ourselves, the authority of the author could be enough to superficially judge the information as accurate. Conversely, if we did have the capacity to evaluate the arguments in the NASA article, we could systematically process the information: looking deeply into the arguments and internalizing new ideas while updating old ones.

People are misers of mental effort. If we don’t have the interest or the capacity to look into the arguments, if the message isn’t personally relevant, if we judge that we already know all we need to know about a topic, there is no reason to spend precious mental resources. Thus, heuristic processing is our default mode (think of superficially surfing the web). Looking critically at information takes motivation. This is where information processing research informs effective science communication.

Researchers have found that people who systematically process information form longer-lasting attitudes that are more resistant to counter-arguments [1]. For example, if an article provides enough information and instills sufficient interest for the reader to systematically process it, the reader’s ensuing judgments will be based on the actual content of the article [2]. Judgments based on a heuristic scan of the article will instead rest on peripheral cues such as message length, message source, or emotion, rather than on judgment-relevant information [2]. Because we don’t want to waste mental effort, systematic processing is subject to many more constraints, such as time, degree of personal relevance, and the general clarity of the message. That makes fostering systematic processing crucial for contentious topics like creationism and evolution.

Seeking and processing accurate information is one thing, but encountering worldview-contradicting information is another. Even though the science on the issue of evolution versus creationism is one-sided, it doesn’t seem to matter. Refuting creationism is fundamentally different from merely disseminating scientifically accurate information. The reason that creationism cannot be dispatched in the same way belief in a flat Earth can, for example, is that creationism entails an entire worldview with its own set of values distinct from concerns of scientific accuracy. Therefore, even systematically processing accurate information about evolution may do little to change creationist attitudes. The motivation is what matters here.

The desire to form attitudes that square with the available facts is straightforward from an information processing perspective. When determining whether genetically modified foods are as nutritious as conventional foods, for example, we expend as much mental effort as is needed to come to a confident conclusion [1]. We process until we feel we have formed an accurate attitude about genetically modified foods. However, with a topic like creationism, the standard of what counts as sufficient information changes dramatically. Encountering a critique of a worldview instills not a motivation to be accurate but a defensive motivation.

Researchers have found that “the sufficiency of a defensive processing strategy is determined, not by its ability to increase confidence in the objective accuracy of the conclusion, but by its ability to increase confidence in a preferred conclusion [my emphasis] that is consistent with material interests or self-defining beliefs” [3]. When a creationist worldview is shaken, the desire to be in accordance with the scientific evidence can fly out the window, biology be damned. Information processing becomes biased in favor of supporting and therefore maintaining belief. And science is the ultimate worldview shaker. Biased systematic processing may also explain why creationism is so hard to root out: effortful processing favoring a particular worldview makes the view more resistant to counter-arguments.

Intelligently Designing Our Messages

In my critique of the Bill Nye article by Marc Kuchner, I stated that the suggestions offered by the “business communications expert” were hollow and insincere. I also claimed that because Nye’s video was a candid expression and not meant to be a primer on evolution, the critical style could be effective for some people. Looking at the persuasion literature, perhaps my critique was not as nuanced as it needed to be. For some, Nye’s message was surely tantamount to blasphemy (indeed, creationists responded quickly), and the resulting motivation to defend a worldview could bias evaluations of evolution.

On the other hand, Nye’s characterization of creationism could be seen as implying that those of the “creation persuasion” hold views that do not reflect reality. A call for accuracy in our attitudes may encourage a thorough look at the evidence for each position, which is all one can hope for. But disentangling these motivations can be tricky. Just what kind of communication transforms a desire to be accurate into a desire to support prior beliefs? It depends on how much one knows about the topic, how much one needs to know, and one’s capacity to evaluate the arguments, among other things.

How should we communicate science that potentially contradicts a worldview? Communication research suggests an effort to switch motivations. Anecdotally, the most fruitful conversation I’ve had in this debate was with a creationist who simply wanted a clearer explanation of evolution than his religious education had offered. The desire to have accurate attitudes allowed him to dispassionately consider the evidence. It was my job as a communicator to give him the basics of the theory so he could systematically process what I was saying. What resulted was an enduring embrace of the science. By crafting a message that fostered a desire to be accurate, rather than one of defense, I had changed a mind. My case study of one should hardly inform science communication as a whole, but it shouldn’t be dismissed either.

The research on risk communication offers a similar conclusion. Self-affirmation theory [4] proposes that our thoughts and behaviors are motivated by the desire to maintain self-worth or self-integrity. When threatening information is encountered, people tend to respond defensively in order to maintain this positive self-image. For example, a coffee drinker who thinks he is a “healthy” person may discredit information that claims drinking coffee has health risks. However, if a message can affirm a person’s image through other means, like bolstering important values, the need to respond defensively to threatening information is reduced [5]. This may be the lynchpin in communicating possibly worldview-shaking messages:

Salient, self-affirming thoughts should make it easier to be objective about other, self-threatening information; they should reduce the pressure to diminish the threat inherent in this information. In this way, self-affirming thoughts may be an effective means of reducing thought distorting defense mechanisms such as denial and rationalization (Steele, 1988, p. 290).

Resolving the evolution versus creationism debate may then be about appealing to shared values, like a desire to have beliefs supported by good reasons or evidence. Communicators can affirm an audience’s self-integrity by pointing this out. This approach dovetails nicely with shifting from a defensive to an accuracy motivation. Indeed, this sort of appeal has been found to reduce the defensive processing of messages and increase their acceptance [6]. But it gets more complicated. In one study, having participants reflect on small acts of kindness they had recently completed had a similar self-affirming effect. By bolstering self-image (i.e., “I’m a good/smart person”), one becomes more receptive to threatening information and processes it in a less biased way [6]. This surely has implications for the language used in potentially threatening messages. Lowering someone’s self-image by calling him or her, in effect, “not good” or “stupid” could easily trigger a defensive cognitive stance. It needs to be made clear when arguing for evolution that its acceptance does not reduce a person’s integrity or self-worth, though many fundamentalist mindsets will claim otherwise.

It could very well be that the emphasis on “science versus religion” has poisoned the well by encouraging belief defense over accuracy. If the frame can somehow be shifted, and science communicators are diligent in providing the public with the best information available, all we can ask is that the public think deeply about that information and let it inform their attitudes.

References:

1. Eagly, A., & Chaiken, S. (1993). The psychology of attitudes. San Diego, CA: Harcourt Brace Jovanovich.

2. Chen, S., & Chaiken, S. (1999). The heuristic-systematic model in its broader context. In S. Chaiken & Y. Trope (Eds.), Dual-process theories in social psychology (pp. 73–96). New York: Guilford.

3. Giner-Sorolla, R., & Chaiken, S. (1997). Selective use of heuristic and systematic processing under defense motivation. Personality and Social Psychology Bulletin, 23, 84–97.

4. Steele, C. (1988). The psychology of self-affirmation: Sustaining the integrity of the self. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 21, pp. 261–302). New York: Academic Press.

5. Sherman, D., Nelson, L., & Steele, C. (2000). Do messages about health risks threaten the self? Increasing the acceptance of threatening health messages via self-affirmation. Personality and Social Psychology Bulletin, 26(9), 1046–1058.

6. Reed, M., & Aspinwall, L. (1998). Self-affirmation reduces biased processing of health-risk information. Motivation and Emotion, 22(2), 99–132.
