I want to add a few thoughts to Emily's post earlier today on the DailyKos/Research 2000 poll of Republicans and how they might check for a skew in the sample that some argue would result from "sane Republicans" hanging up after taking offense to the questions. Another potential problem, called out today by Republican pollster Alex Lundry, is not as easy to check: The possibility of a skew in respondents' answers caused by what pollsters call "acquiescence bias."
Acquiescence bias is the tendency of some respondents to select affirmative answers when the choice is whether to affirm or reject the statement presented (including "agree or disagree," "favor or oppose," and "yes or no" formats). The topic has been the subject of decades of study and debate among social scientists, and even though pollsters continue to rely on agree-disagree questions, academic survey researchers mostly agree that this format tends to produce more apparent agreement than questions that offer a choice between two competing statements.
Here is an example from Schuman and Presser's classic text, Questions and Answers in Surveys (p. 221), based on an experiment first conducted by the NORC General Social Survey in 1974: They asked a random half-sample to agree or disagree with this statement: "Most men are better suited emotionally for politics than women." Slightly less than half (47.0%) agreed, while 53.0% disagreed.
They asked the other random half-sample to choose between two statements (and included a middle choice):
Would you say that most men are better suited emotionally for politics than are most women, that men and women are equally suited, or that women are better suited than men in this area?
Fewer (33.1%) said that men were better suited, 4.3% said women were better suited than men, and 62.6% said the sexes were equally suited. Researchers at the University of Michigan's Survey Research Center replicated the experiment three times between 1974 and 1976, producing similar results: the agree/disagree format consistently yielded greater agreement that "men are better" (ranging from 44.3% to 45.5%) than the forced-choice format (ranging from 32.5% to 38.3%).
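The size of that format effect is easy to check against sampling error. Here is a minimal sketch of a two-proportion z-test on the original 1974 split; note that the half-sample sizes are an assumption for illustration (the Ns are not reported above), so treat the exact z value as rough:

```python
import math

# Reported shares agreeing that "men are better suited" (Schuman & Presser)
p_agree_disagree = 0.470   # agree/disagree format
p_forced_choice = 0.331    # forced-choice format

# Half-sample sizes are an assumption for illustration only;
# the text above does not report the GSS half-sample Ns.
n1 = n2 = 750

# Two-proportion z-test for the difference between the two formats
p_pooled = (p_agree_disagree * n1 + p_forced_choice * n2) / (n1 + n2)
se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / n1 + 1 / n2))
z = (p_agree_disagree - p_forced_choice) / se

print(f"gap = {p_agree_disagree - p_forced_choice:.1%}, z = {z:.1f}")
```

Under that assumed sample size, a nearly 14-point gap is far too large to be a chance difference between the two half-samples.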
Another strategy to reduce this bias is to try to balance the direction of the statements, as recommended in Presser et al., Methods for Testing and Evaluating Survey Questions (p. 440):
Acquiescence bias can be reduced by balancing scales so that the affirming response half the time is in the direction of the construct and half the time is in the opposite direction (e.g. six agree/disagree items on national pride, with the patriotic response matching three agree and three disagree responses).
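That balancing recommendation amounts to a simple scoring rule: reverse-code the negatively worded items before combining them, so a respondent who agrees with everything lands in the middle of the scale rather than at one end. A minimal sketch, with invented item names and answers (nothing here comes from an actual battery):

```python
# Six hypothetical agree(1)/disagree(0) items on national pride; the
# "patriotic" answer is agree on three items and disagree on three.
patriotic_answers = {
    "proud_of_country":    1,
    "anthem_moves_me":     1,
    "flag_means_a_lot":    1,
    "often_ashamed":       0,  # reverse-keyed
    "country_often_wrong": 0,  # reverse-keyed
    "better_elsewhere":    0,  # reverse-keyed
}

def pride_score(responses):
    """Share of items answered in the patriotic direction (0.0 to 1.0)."""
    return sum(
        1 for item, answer in responses.items()
        if answer == patriotic_answers[item]
    ) / len(patriotic_answers)

# A pure "yea-sayer" who agrees (1) with every statement scores 0.5,
# not 1.0: the balanced keying cancels the acquiescence.
yea_sayer = {item: 1 for item in patriotic_answers}
print(pride_score(yea_sayer))  # -> 0.5
```

With an unbalanced battery (all six items keyed in the same direction), that same yea-sayer would score 1.0, indistinguishable from a genuinely patriotic respondent.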
With those recommendations in mind, consider the questions asked on the DailyKos/Research 2000 survey, in the order in which the results were presented. The first eight present all of the more sensational, ludicrous assertions (most of which pertain to President Obama). Seven of the eight ask respondents to affirm or reject the extreme statement:
- Should Barack Obama be impeached, or not?
They then ask 15 issue questions that do mix up the order somewhat. Eight ask respondents whether they agree with a liberal policy position, five ask about a conservative policy position, and two (the questions about Christ and about marriage as a partnership) force a choice between two statements:
- Should Congress make it easier for workers to form and join labor unions?
I don't want to overstate the consensus of pollsters -- academic or otherwise -- on this issue. Many highly regarded survey researchers continue to rely on agree/disagree questions, often because of their simplicity and brevity or because such questions are part of a long-standing time series that the pollster would rather not disrupt (good example of the latter here; for more discussion see Javeline, 1999).
So while it would be a bit unfair to condemn Research 2000 for relying on question formats that pollsters and academics continue to rely on, Lundry has a point. Acquiescence bias probably exaggerates the amount of agreement measured for some of the more ludicrous assertions about Barack Obama tested on the Kos poll.
Update: As Alex Lundry notes below, his comments about acquiescence bias earlier today came after reading a message sent by Stanford graduate student Josh Pasek to AAPOR's members only listserv. With Josh's permission, here is a portion of that message:
Given that 10-20% of respondents tend to agree with any statement (likely due to social norms), I went through the survey mentally subtracting 15 percentage points from every "yes" answer. That does leave some shocking numbers -- particularly as acquiescence tended to indicate support for gay rights, sex education, etc. -- but suggests that Birthers, for instance, may be outnumbered in the party (a slight consolation at best). I'm not saying this to suggest that the opinions being expressed even with a correction are reasonable, but I worry that not addressing this kind of issue is the reason so many people out there are skeptical of survey results in the first place.
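Pasek's back-of-envelope correction is easy to mechanize. A quick sketch of the arithmetic, using hypothetical question labels and percentages (the actual Kos toplines are not reproduced here):

```python
# Pasek's rough rule of thumb: subtract ~15 points from each "yes"
# share to offset acquiescence (he cites a 10-20% range). The labels
# and numbers below are hypothetical placeholders, not Kos results.
ACQUIESCENCE_OFFSET = 15  # percentage points

hypothetical_yes_shares = {
    "question_a": 63,
    "question_b": 42,
    "question_c": 12,
}

corrected = {
    q: max(share - ACQUIESCENCE_OFFSET, 0)  # floor at zero
    for q, share in hypothetical_yes_shares.items()
}

for q, share in corrected.items():
    print(f"{q}: {hypothetical_yes_shares[q]}% -> {share}%")
```

The point of the exercise, as Pasek notes, is not that the corrected numbers are precise, only that a constant offset of this size can move a bare majority below 50% while still leaving some striking results intact.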