How to Improve Pollster Disclosure


In Part II of this series on how to answer the question "Can I trust this poll?" I argued that we need better ways to assess "likely voter" samples: What kinds of voters do pollsters select, and how do they choose or model the likely voter population? Regular readers will recall how hard it can be to convince pollsters to disclose methodological details. In this final installment, I want to review the past efforts and propose an idea to promote more complete disclosure in the future.

First, let's review the efforts to gather details of pollster methods carried out over the last two years by this site, the American Association for Public Opinion Research (AAPOR) and the Huffington Post.

  • Pollster.com - In September 2007, I made a series of requests of pollsters that had released surveys of likely caucus goers in Iowa. I asked for information about their likely voter selection methods and for estimates of the percentage of adults represented by their surveys. A month later, seven pollsters -- including all but one of the active AAPOR members -- had responded fully to my requests, five provided partial responses and five answered none of my questions. I had originally planned to make similar requests regarding polls for the New Hampshire and South Carolina primaries, but the responses trickled in so slowly and required so much individual follow-up that I limited the project to Iowa (I reported on the substance of their responses here).
  • AAPOR - In the wake of the New Hampshire primary polling snafu, AAPOR appointed an Ad Hoc Committee to investigate the performance of primary polls in New Hampshire and, ultimately, in three other states: South Carolina, California and Wisconsin. The committee made an extensive request of pollsters, asking not only for the items the AAPOR code requires pollsters to disclose but also for more complete information, including individual-level data for all respondents. Although AAPOR allowed pollsters more than a year to respond, only seven of the 21 provided information beyond minimal disclosure, and despite the implicit threat of AAPOR censure, three organizations failed to provide even the minimal information mandated by AAPOR's ethical code (see the complete report).
  • HuffingtonPost - Starting in August 2008, as part of its "Huffpollstrology" feature, the Huffington Post asked a dozen public pollsters to provide response and refusal rates for their national polls. Six replied with response and refusal rates, two responded with limited calling statistics that did not allow response rates to be calculated, and four refused to respond (more on Huffpollstrology's findings here).
Continue reading on Pollster.com
