08/03/2009 09:49 am ET Updated May 25, 2011

'Huffpollstrology' and Response Rates

My National Journal column for this week, on a largely overlooked Huffington Post feature ("Huffpollstrology") that asked national media pollsters for their response and refusal rates last fall, is now posted online.

The subject of response rates is a tough one to approach in an 800-to-900-word column; inevitably, something important gets left out. With more space and time I would have worked in a mention of the two experimental projects on response rates conducted by the Pew Research Center in 1997 and 2003. Each involved parallel surveys with identical questionnaires: one used Pew's standard procedures, while the other used a much more rigorous methodology to obtain the highest response rate possible. In both years, the differences in results across a wide variety of demographic traits and political attitudes were negligible. In addition to the Pew write-ups, both experiments led to articles in Public Opinion Quarterly (in 2000 and 2006).

For those looking to read further on non-response bias, Public Opinion Quarterly also devoted a special issue to the subject that remains free to all online. I summarized it here.

I also gathered all of the responses from pollsters that the Huffington Post ran last fall. I will post those either later today or tomorrow, along with some explanation of the guidelines published by the American Association for Public Opinion Research (AAPOR) on how to calculate and report response rates. A rough sketch of one of those calculations appears below.
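For readers curious about what the AAPOR guidelines involve, here is a minimal sketch of AAPOR's Response Rate 1 (RR1), the most conservative of its standard formulas: completed interviews divided by all sampled cases, including those of unknown eligibility. The function and variable names are my own illustration, not AAPOR's notation, and the figures in the example are invented for arithmetic purposes only.

```python
# A minimal sketch of AAPOR's Response Rate 1 (RR1), the most
# conservative of the standard response-rate formulas. Variable
# names are illustrative, not AAPOR's official notation.

def aapor_rr1(complete, partial, refusal, non_contact, other, unknown):
    """RR1 = completes / (interviews + non-interviews + unknown-eligibility cases)."""
    denominator = (complete + partial) + (refusal + non_contact + other) + unknown
    return complete / denominator

# Hypothetical example: 300 completed interviews out of 1,500 sampled cases.
rate = aapor_rr1(complete=300, partial=25, refusal=400,
                 non_contact=600, other=75, unknown=100)
print(f"RR1 = {rate:.1%}")  # -> RR1 = 20.0%
```

Because every case of unknown eligibility stays in the denominator, RR1 is the floor of the AAPOR rates; the other formulas (RR2 through RR6) relax those assumptions and produce higher figures from the same dispositions.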

Finally, I sent Arianna Huffington some questions for the column, which she answered via email. I am reproducing the exchange here in full:

Q: Generally, what did you learn from the Huffpollstrology project's queries to pollsters about their response rates? Were you surprised by the level of cooperation you received (or the lack thereof), or by the information provided?

Plummeting response rates have been the dirty little secret of the polling industry for years now. They've often dropped below 20 percent, making the core polling principle of "equal probability of selection" something of a joke. But polling companies often refuse to release these numbers - and when they do release them, they often bury them at the end of a poll in tiny print. So when we launched HuffPollstrology, we decided that we would also put response rates front and center. We wanted to delve into the gray area of how polls are conducted.

Q: Do you think the response rate information that pollsters provided was helpful to Huffington Post readers? Why/why not?

Absolutely. The media continue to let polls dominate their political coverage - and yet are reluctant to let the public know how much skepticism it should bring to its consumption of polling results. Not just because of results-skewing response rates but also variables like undecided voters and margins of error. So it was important to remind readers that poll results need to be taken with a grain of salt, not treated like they were just brought down from the mountaintop by Moses.

Q: Do you plan to repeat this project in the future, and if so, what, if anything, might you do differently?

HuffPollstrology was our way of putting the media's obsession with polls into what we consider the proper context -- that is, alongside astrology and betting lines. Asking for and highlighting response rates was only one aspect of the project. Moving forward, given the media's addiction to polls and polling, we will continue digging deeper into response rates and other polling methodology - and, sometime before the 2010 election, we'll decide whether to bring HuffPollstrology back.