More NH Clues: The CBS Panel Survey

One of the theories about what went wrong for the polls in New Hampshire is that the apparent post-Iowa "bounce" for Barack Obama never really occurred. Perhaps the surge for Obama was just the artifact of some sort of sampling or other methodological distortion that created the false impression that some New Hampshire voters were moving to Obama (or away from Clinton) in the wake of the Iowa Caucuses. While it certainly does not resolve the New Hampshire mystery, there is one piece of forensic evidence on this point that most of us have overlooked: The "panel back" survey of New Hampshire Democrats conducted by CBS News.

Unlike the other pollsters, who contacted fresh samples of New Hampshire households over the final weekend, CBS did something different. They re-contacted 417 likely New Hampshire Democratic primary voters they had already previously interviewed in November, and were able to re-interview 323. This design, which pollsters typically call a "panel back," allows for an examination of individual change. In other words, instead of comparing the aggregate results of two totally separate samplings, the CBS pollsters were able to look for changed opinion among individual respondents.
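To make the design concrete, here is a minimal sketch, using made-up respondent records (CBS has not released its microdata), of how a panel-back analysis cross-tabulates each voter's two answers to measure individual-level movement rather than simply comparing two toplines:

```python
from collections import Counter

# Hypothetical respondent-level panel records: each tuple is one re-interviewed
# voter's (November preference, January preference). The real CBS panel would
# contain 323 such records.
panel = [
    ("Clinton", "Clinton"), ("Clinton", "Obama"), ("Clinton", "Edwards"),
    ("Obama", "Obama"), ("Obama", "Obama"), ("Edwards", "Obama"),
]

# Cross-tabulate the November choice against the January choice to see
# individual-level movement between the two waves.
transitions = Counter(panel)
for (november, january), count in sorted(transitions.items()):
    print(f"{november:8s} -> {january:8s}: {count}")

# Share of panelists who changed their preference between waves.
changed = sum(count for (nov, jan), count in transitions.items() if nov != jan)
print(f"Changed preference: {changed / len(panel):.0%}")
```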

The CBS panel-back study, completed on the Saturday and Sunday before the New Hampshire primary, found a large individual-level shift from Clinton to Obama (and Edwards), but virtually no shift away from Obama:

26% of likely Democratic primary voters have changed their preference since November. [...]

The New York Senator lost almost one in five of her November voters to Obama, and 10% of her voters have gone to Edwards. Obama, meanwhile, has kept 95% of the individual voters he had in November.

Those shifting preferences helped move the race, as measured by the two CBS surveys, from a 20-point Clinton lead over Obama in November (39% to 19% among those later re-interviewed) to a seven-point Obama advantage (35% to 28%) over the final weekend.
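For readers who want to see the arithmetic, here is a rough sketch of how those individual-level retention and defection rates translate into the weekend numbers. It is my own approximation and ignores movement from Edwards supporters, other candidates' supporters, and undecided voters, which is where the rest of Obama's reported 35% would have to come from:

```python
# November shares among the re-interviewed panel (from the CBS report).
clinton_nov, obama_nov = 0.39, 0.19

# Reported rates: Clinton kept roughly 70% of her November voters (losing about
# 20% to Obama and 10% to Edwards), while Obama kept 95% of his.
clinton_from_own_voters = clinton_nov * 0.70                            # ~27 points
obama_from_own_plus_defectors = obama_nov * 0.95 + clinton_nov * 0.20   # ~26 points

print(f"Clinton, from her own November voters: {clinton_from_own_voters:.0%}")
print(f"Obama, from his November voters plus Clinton defectors: "
      f"{obama_from_own_plus_defectors:.0%}")
```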

But that's not all. The design of this survey also provides the only data I am aware of to test another theory: Were Clinton supporters, having grown "dispirited" and "disillusioned by her decline in Iowa," simply "undercounted" by the pollsters, as political scientists Bob Erikson and Chris Wlezien theorize here on Pollster.com?

One check would be the response rate, as reported by CBS polling director Kathy Frankovic in her latest column:

The January response rate for the November Obama and Clinton voters was nearly the same, 74 percent for November Obama supporters and 68 percent for November Clinton voters.

But what about a likely voter screen? CBS polling analyst Anthony Salvanto emailed me to add that 71% of Obama's November supporters responded to the second survey and said they were still "definitely" or "probably" planning to vote in the Democratic primary, as compared to 64% of Clinton's November supporters.

Were these differences statistically significant? Ah, there's the rub. The sample sizes involved are small and lack the statistical power to determine whether the differences in response and intent to vote were real. Extrapolating from the November data suggests that CBS attempted to re-contact roughly 154 Clinton supporters and 92 Obama supporters from November. The margin of sampling error around each subgroup is in the +/- 8-10% range. So neither difference above is statistically significant (for the statistically fluent: I get p-values in the .20 to .30 range, though your mileage may vary).
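For the statistically curious, here is roughly how that calculation goes: a simple pooled two-proportion z-test using my extrapolated subgroup sizes above. The exact p-values depend on the test you choose, which is why your mileage may vary:

```python
from math import erf, sqrt

def two_proportion_p_value(p1, n1, p2, n2):
    """Two-sided p-value for a pooled two-proportion z-test."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Convert the z-score to a two-sided p-value via the normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# My extrapolated subgroup sizes, not exact CBS counts.
n_clinton, n_obama = 154, 92

# Response rates: 68% of Clinton's vs. 74% of Obama's November voters.
print(two_proportion_p_value(0.68, n_clinton, 0.74, n_obama))  # roughly 0.3

# Still "definitely" or "probably" voting: 64% vs. 71%.
print(two_proportion_p_value(0.64, n_clinton, 0.71, n_obama))  # roughly 0.26
```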

Of course, significant or not, CBS did use the response variation in weighting their January data, although they saw "little" change in the Clinton-Obama margin as a result:

Before publication of the results, we adjusted (“post-stratified”) the results to account for that small difference in response by previous candidate preference (which is normally done in panel surveys). Correcting for that small difference in response changed little.

My own back-of-the-envelope estimate is that their non-response adjustment added maybe a point to Clinton's support and took a point away from Obama. Either way, their weighted result still showed Obama leading by seven percentage points (35% to 28%), and their individual level data showed a significant shift away from Clinton. Thus, individual-level shifts in opinion, rather than an enthusiasm gap, explained virtually all of the Obama weekend "surge" on this survey.
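For those who want to see the mechanics, here is a minimal sketch of that kind of non-response adjustment: January respondents are weighted so that the November preference groups return to their original proportions, and the January toplines are then recomputed with those weights. The respondent counts are illustrative, and for simplicity the sketch includes only the November Clinton and Obama groups:

```python
# January answers, grouped by each respondent's November preference.
# (Hypothetical counts, not CBS's actual microdata.)
january_by_november_group = {
    "Clinton_Nov": {"Clinton": 74, "Obama": 21, "Edwards": 10},  # 105 responded
    "Obama_Nov":   {"Obama": 65, "Clinton": 2, "Edwards": 1},    # 68 responded
}

# Known November group sizes among everyone CBS attempted to re-contact.
november_targets = {"Clinton_Nov": 154, "Obama_Nov": 92}

total_responses = sum(sum(g.values()) for g in january_by_november_group.values())
total_targets = sum(november_targets.values())

weighted = {}
for group, choices in january_by_november_group.items():
    responded = sum(choices.values())
    # Weight = group's share of the target population / group's share of respondents.
    weight = (november_targets[group] / total_targets) / (responded / total_responses)
    for candidate, count in choices.items():
        weighted[candidate] = weighted.get(candidate, 0.0) + count * weight

weighted_total = sum(weighted.values())
for candidate, share in sorted(weighted.items(), key=lambda kv: -kv[1]):
    print(f"{candidate}: {share / weighted_total:.0%}")
```

Because Clinton's November voters responded at a slightly lower rate, they receive weights a bit above 1 and Obama's a bit below 1, which is how the adjustment nudges roughly a point toward Clinton without changing the overall picture.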

Kathy Frankovic's column also uses the same response rate data to question at least one interpretation of the so-called Bradley-Wilder theory:

The theory is that the respondent (white or black) might not want an interviewer to think they aren’t voting for a black candidate. They might think the interviewer will take offense, or believe the respondent to be racist.

Taken to its extreme, this theory predicts that respondents who think they have socially unacceptable opinions -- or situationally unpopular opinions -- simply won’t answer a questionnaire.

As Frankovic points out, "the theory would predict that those not voting for Barack Obama would be less likely to complete an interview." However, as the data above indicate, Obama supporters were just as likely to complete an interview as Clinton supporters, if not more so.

PS: If you like the notion of "panel-back" surveys, you will have more to chew over soon, as the Gallup Organization is apparently calling back respondents to its final New Hampshire survey. Susan Page, whose beat includes the Gallup polls sponsored by USA Today, made the following comment on MSNBC's Tim Russert Show this past Sunday (my transcript):

Page: I think it's going to be some time before we know [what happened in NH]. We're going back in the field to reinterview the people we interviewed in our poll that had [Obama] up thirteen points to ask who changed their mind, who didn't go to vote who said they were going to vote, maybe who did go to vote who told us they weren't going to vote who made it through our likely voter screen.

Russert: Are you going to publish that?

Page: Oh absolutely. We want to know why the poll was off, and we don't want to repeat the error.

I am assuming that other New Hampshire pollsters may have similar re-contact studies in the works.
