Some thoughts about tonight's South Carolina primary, including some follow-up on yesterday's post on the variation in pre-election poll results:
1) Reverse Bradley/Wilder? Noam Scheiber sees evidence of a potential "reverse Bradley/Wilder effect" in South Carolina. His theory is that live-interviewer surveys may be understating Barack Obama's support among African-Americans. Scheiber's post is worth reading in full, but here is the gist:
If Obama consistently did better among black voters in automated polls, which eliminate the "social discomfort" that might discourage them from telling (presumably white) interviewers they support him, we'd have evidence for this hypothesis.
So what do the polls say? They say I might be onto something:
In the three most recent automated polls in South Carolina (PPP, SurveyUSA, and Rasmussen), Obama takes 67, 73, and 68 percent of the black vote, while Hillary takes 13, 18, and 16. In the three most recent live-interviewer polls (Zogby, Mason-Dixon, and ARG), Obama takes 55, 59, and 61 percent of black voters, while Hillary takes 18, 25, and 25.
So, among black voters, that's an average lead of 69-16 for Obama in automated polls, but only 58-23 in live-interviewer polls--a huge difference (53-point lead in the former; 35-point lead in the latter). It's not exactly definitive--I'm only using three data points in each case, and there are other methodological differences between the polls--but it does strongly suggest that some black voters are reluctant to tell human pollsters they support Obama, but feel comfortable saying it to a machine.
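For readers who want to check Scheiber's arithmetic, the averages above can be reproduced in a few lines of Python (the poll numbers are taken directly from the quoted passage):

```python
# Black-voter support in the three automated vs. three live-interviewer
# South Carolina polls quoted above (PPP/SurveyUSA/Rasmussen vs.
# Zogby/Mason-Dixon/ARG).
automated = {"Obama": [67, 73, 68], "Clinton": [13, 18, 16]}
live = {"Obama": [55, 59, 61], "Clinton": [18, 25, 25]}

def avg(values):
    return sum(values) / len(values)

# Round each candidate's average first, as Scheiber does (69-16 and 58-23).
auto_lead = round(avg(automated["Obama"])) - round(avg(automated["Clinton"]))
live_lead = round(avg(live["Obama"])) - round(avg(live["Clinton"]))
print(auto_lead, live_lead)  # prints: 53 35
```

The caveat in the quoted passage stands: three polls per mode is a small sample, and the polls differ in more ways than just interviewer mode.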
2) Lying to Robots? Mickey Kaus floats a whole new theory, that voters are just as likely to lie to "robots," perhaps even more so:
I used to think talking to a robotic phone answerer was pretty close to a "secret ballot"--what was the robot going to do to me, anyway? But machines do a whole lot these days--they track your musical tastes, follow your movements, raise or lower your credit ratings. Now a robot can conceivably do a lot to me, at least in the paranoid part of my imagination activated when I get an unsolicited call. At best, it's probably generating a list to sell someone! I don't want it to know my real innermost thoughts, including my political thoughts, especially my un-PC political thoughts. These days, I'd be much more paranoid about pushing a button that says "I'm voting against beloved minority candidate X" than telling a live operator the same thing. Sorry, Rasmussen! The traditional truth-revealing advantage of robo-calling may be the artifact of a transitional era in info-technology.
This is an interesting theory that, at least for the moment, lacks supporting evidence. Survey methodologists have been studying "interviewer effects" for decades, and have found consistent evidence that "self-administered" surveys (those using paper or a computer rather than an interviewer) produce more reports of "sensitive" behaviors (sexual activity, drinking, drug use). If the growing presence of computers in our lives has made respondents less truthful on self-administered surveys, no one has yet proven it.
3) Will The SC Exit Poll Resolve These Questions? Noam Scheiber and my colleague Charles Franklin (and many others) will be looking at vote-by-race tabulations in the South Carolina exit poll. But readers of both wonder if exit polls are susceptible to the same effects as telephone polls. After quoting Franklin, Kaus writes:
Of course, people can lie to exit pollsters too! If you're a black South Carolinian and want to help Hillary as much as you can, you'll walk into the booth, vote for her, then walk out and tell the exit poll person you voted for Obama. There may also be non-Machiavellian peer pressure in black precincts to tell the exit pollsters the same thing (which, perversely, might hurt Obama in tomorrow night's press spin by making it look as if he received an ethnic bloc vote). In white areas similar pressure might encourage voters to falsely tell exit pollsters they voted for Edwards or Clinton.
One of Scheiber's commenters reaches a similar conclusion:
But aren't the exit polls all done by human beings, not machines? How will you know how African-Americans really voted if they tell you on the way out that they voted for Clinton?
The problem with both arguments is that voters don't "tell" exit pollsters anything. Interviewers hand respondents a paper form, which they fill out privately and drop into a "ballot box."
On the other hand, the characteristics of exit poll interviewers (race, gender and age) may have some influence on whether voters agree to participate in the survey. Historically, exit pollsters have depended mostly on younger interviewers and have, as a result, had the hardest time gaining cooperation from older voters. While exit pollsters attempt to correct for such "non-response bias" by weighting, some distortions may remain.
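A minimal sketch of how that weighting correction works, with invented numbers (the group shares and vote splits below are hypothetical, not from any actual exit poll):

```python
# Hypothetical non-response weighting: older voters cooperate less often,
# so they are underrepresented among completed interviews.
sample_share = {"under_45": 0.50, "45_plus": 0.50}  # mix of completed interviews
voter_share = {"under_45": 0.35, "45_plus": 0.65}   # true mix of the electorate

# Each respondent is weighted by (true share) / (sample share), so
# underrepresented groups count for more than one interview apiece.
weights = {g: voter_share[g] / sample_share[g] for g in sample_share}

# Suppose a candidate wins 60% of under-45s and 40% of 45-plus voters.
support = {"under_45": 60.0, "45_plus": 40.0}
unweighted = sum(sample_share[g] * support[g] for g in support)             # 50.0
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)  # 47.0
print(unweighted, weighted)
```

The caveat in the paragraph above still applies: weighting corrects only the dimensions the pollster weights on, so any non-response bias correlated with something else can survive the adjustment.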
So on the question of racial polarization in today's vote, it would be helpful to attempt to verify the exit poll findings with actual results in heavily African-American precincts.
4) More on Race and the Exit Poll - Speaking of validating the exit polls with hard numbers, NBC's Chuck Todd shares a helpful email he received from Mason-Dixon's Brad Coker after Todd noted that the 2004 exit poll estimated the South Carolina Democratic primary electorate as 47% black, as compared to 55% in the most recent Mason-Dixon poll. Coker is skeptical of the exit polls as an "overall demographic indicator." He writes:
To get a real handle on what the African-American vote is likely to be, one only needs to look at real numbers. The South Carolina Secretary of State's office published the following statistics on South Carolina's 2004 and 2006 state Democratic primary elections. These are based on real voters, not a survey sampling.
According to the state's statistics, the '04 Dem primary for president attracted an electorate that was 58% non-white and 42% white; in the '06 Dem primary for governor, the ratio was 60-40 in favor of black voters.
These hard numbers show a much higher percentage of African-American voters in South Carolina's state primary races for Governor and U.S. Senate, so I don't think it is a stretch to expect a similar turnout in a presidential primary that features a major African-American contender. If anything, 55% black might actually end up being a bit on the low side. I will be very surprised if a clear majority of today's Democratic primary voters are not African-American.
5) "Fraudstorm Advisory" - Our friend Mark Lindeman points out in a DailyKos diary that South Carolina "votes on ES&S iVotronics, paperless Direct Recording Electronic (DRE) voting machines" that are "exquisitely, ridiculously vulnerable to attack." As such, he has some helpful information on the shortcomings of those machines and advice on how to interpret the results in light of the inevitable speculation of vote fraud. Most relevant to our topic is his discussion of exit polls:
The exit poll "results" are not what they seem. If form holds, very shortly after the polls close at 7 PM, several networks will post preliminary tabulations based on exit poll estimates. Even before that, once the quarantine is broken, rumors may fly about what the exit polls show. Please be advised: even if you unaccountably believe that exit poll interviews are practically foolproof, these tabulations (or rumors) will not tell you the interview results! The early projections will be based on a combination of interview data and prior expectations. Given the variability in the pre-election polls, who knows what "prior expectations" will be?
Also be advised that in the 2004 general election, the estimated margin of error for the South Carolina exit poll -- assuming that the poll was otherwise unbiased -- was about 8 points on the margin between Kerry and Bush. (This margin of error cannot be figured in advance, and it can't be figured based on the number of respondents alone. It depends on the variability across the precincts in the exit poll sample.)
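Lindeman's point about precinct-level variability can be sketched numerically. In a clustered design like an exit poll, a rough standard error for the margin comes from the spread of precinct-level results, not from the raw respondent count; the precinct margins below are invented purely for illustration:

```python
import math
import statistics

# Invented Kerry-minus-Bush margins (in points) for ten sampled precincts.
precinct_margins = [-12, -5, 3, -20, 8, -15, -2, 10, -25, -8]

n = len(precinct_margins)
mean_margin = statistics.mean(precinct_margins)
# Treat precincts, not individual respondents, as the sampling units.
se = statistics.stdev(precinct_margins) / math.sqrt(n)
moe = 1.96 * se  # 95% margin of error on the margin; roughly +/-7 points here

print(f"mean margin {mean_margin:.1f}, MOE +/-{moe:.1f}")
```

With more sampled precincts, or with less spread across precincts, the margin of error shrinks; that is why, as Lindeman notes, it cannot be figured in advance from the number of respondents alone.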