A who’s-who of pollsters and researchers is gathered in Boston for their annual conference today, and two-thirds of Team Pollster are there to bring you the highlights. (The remaining third is back in DC, bringing you this newsletter.) This is the HuffPost Pollster update for Friday, May 17, 2013.
ELECTION-YEAR RESULTS RAISE QUESTIONS FOR POLLSTERS
A Google Trends search for “poll” shows public interest peaking every election cycle. The attention paid to the horse race inspires some questions, a few of which HuffPollster raised on Thursday night in his capacity as moderator of the plenary session of the annual conference of the American Association for Public Opinion Research (AAPOR):
If a critical story of the election is the ongoing demographic transition of the electorate in terms of race and ethnicity, and pre-election polls miss or understate that trend, have we told an accurate story? If non-response bias interacts with our likely voter models to create or exaggerate swings in voter preferences, can we accurately identify the “truly consequential events” in the campaign? If measurement error clouds our reading of voter turnout or who is really undecided, have we increased our understanding of voter behavior? If surveys tell a campaign that it leads by a comfortable margin but that candidate ends up losing – as happened just this week in British Columbia – did that campaign make the right strategic decisions? How much does this sort of very public failure begin to undermine confidence, among those who pay our way, in the data we produce, no matter what kind of survey we conduct?
The National Journal's Ron Brownstein explained that given the increase of the non-white portion of the electorate to 28 percent in 2012, according to the network exit polls, Barack Obama was able to win the election by receiving 80 percent of minority votes and 39 percent of the vote from whites: "Let's stop for a minute and consider how extraordinary what I just said is, and why this election was such a rock through the window for so many Americans. In 2008, Barack Obama was the first candidate ever to lose whites by double digits and win. He lost them by 12 points. Four years later, he lost them by fully 20 points and won. In fact, Mitt Romney matched the best performance ever in the history of polling [since 1952] for a Republican challenger among white voters....Mitt Romney won a higher percentage of white voters in 2012 than Ronald Reagan won in 1980, and he lost."
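As a quick sanity check, the exit-poll figures Brownstein cites do add up to a winning coalition. A minimal back-of-the-envelope sketch (treating the exit-poll numbers as exact and ignoring third-party votes):

```python
# Exit-poll figures cited by Brownstein for 2012:
nonwhite_share = 0.28          # non-white portion of the electorate
white_share = 1 - nonwhite_share
obama_nonwhite = 0.80          # Obama's share of the minority vote
obama_white = 0.39             # Obama's share of the white vote

# Weighted average across the two groups gives Obama's implied overall share.
obama_total = nonwhite_share * obama_nonwhite + white_share * obama_white
print(f"Implied Obama share: {obama_total:.1%}")  # roughly 50.5%
```

Even at a 20-point deficit among whites, a 28 percent non-white electorate breaking 80-20 is enough to clear a majority of the two-way vote.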
UCLA political scientist Lynn Vavreck summarized research she conducted with GWU's John Sides as showing an "equilibrium" in which the top-line results everyone is familiar with "didn't move very much," yet they found evidence of a "dynamic equilibrium" in which many voters shifted preferences in offsetting ways. She argued that pollsters should conduct more panel surveys, in which pollsters interview the same respondents repeatedly over time: "We can do more than just describe the equilibrium, if we run some panel surveys....we can actually explain the equilibrium. And I think people will be interested in that. And in that way, we move from potential 'game changing moments,' like Lindsay Lohan's endorsement of Mitt Romney, and we move toward talking about things that are actually moving right now."
Dan Wagner, the Chief Analytics Officer for the 2012 Obama presidential campaign, explained the now much-discussed decision by the campaign to invest heavily in analytics and modeling: "Our story really starts back in 2010 in terms of opinion research, which was essentially a disaster for Democrats in terms of predicting the outcome. We grossly overestimated Democratic support. We missed the differences between self-ID'd and registered independents. And Democrats largely missed the enthusiasm gap completely. But the models, which had been based on huge samples with a modeled electorate, were largely right. They said that we were going to lose, and we were going to lose by a huge margin. We rebuilt polling from the ground up. We reassessed methodology. We built an approach to sampling internally. We built a unified infrastructure for our data."
AAPOR plans to publish a video recording of the plenary speeches in the coming weeks.
MORE FROM AAPOR, IN 140 CHARACTERS OR LESS
Tom Guterbock of UVa: cell phone poll interview costs now 1.5x landline, down from 2x in 2010 #aapor
— Mike Mokrzycki (@mikemokr) May 17, 2013
Big #AAPOR takeaway: panel data less responsive to supposed game change events than one-shot cross-sectional polls.
— Michael McDonald (@ElectProject) May 17, 2013
FRIDAY'S 'OUTLIERS' - Links to more news at the intersection of polling, politics and political data:
-Nate Cohn dismisses the possibility of a “Blue Texas” as wishful thinking. [TNR]
-Pollster questions the surveys used to gauge the military’s sexual assault problem. [Bloomberg]
-AAPOR releases report on non-probability sampling. [AAPOR]
-Alex Lundry is surprised by GOP's reluctance to embrace listed sample polling. [Twitter]
-Lovestats blogger Annie Pettit liveblogs multiple AAPOR panels. [Lovestats]
-Today in fake polling: “According to a recent Gallup poll...Nearly 96 percent of respondents said they just wanted to make it clear that they think that everything currently going on in Washington is definitely bad and shouldn’t be happening, unless all of it is actually okay, in which case they are fine with it.” [The Onion]