The Pollster.com Disclosure Project

Over the last few months I have written a series of posts that examined the remarkably limited methodological information released about pre-election
polls in the early presidential primary states (here, here
and here,
plus related items here). The gist is that these surveys often
show considerable variation in the types of "likely voters" they select yet
disclose little about the population they sample beyond the words "likely
voter." More often than not, the pollsters release next to nothing about how tightly
they screen or about the demographic composition of their primary voter samples.

Why do so many pollsters disclose so little? A few continue
to cite proprietary interests. Some release their data solely through their
media sponsors, which in the past limited the space or airtime available for methodological
details (limits now largely moot given the Internet sites maintained by virtually
all media outlets and pollsters). And while none say so publicly, my sense is
that many withhold these details to avoid the nit-picking and second-guessing that
inevitably comes from unhappy partisans hoping to discredit the results.

Do pollsters have an ethical obligation to report methodological details about whom they sampled? Absolutely (and more on that below). As we have learned, most will disclose these details on request, as per the ethical codes of
the American Association for Public Opinion Research (AAPOR) and the National Council on Public
Polls (NCPP). Regular readers will know that
we have received prompt replies from many pollsters in response to such
requests (some pertinent examples here,
here,
here
and here).

The problem with my occasional ad hoc requests is that they
arbitrarily single out particular pollsters, holding their work up to scrutiny (and
potential criticism) while letting others off the hook. My post
a few weeks back, for example, focused on results from Iowa polls conducted by the American
Research Group (ARG) that seemed contrary to other polls. Yet as one alert
reader commented,
I made no mention of a recent Zogby poll with results consistent with ARG. And speculating about details withheld from public view (as I did, incorrectly, in the first ARG post), however tempting, is even less fair to the pollsters and our readers.

So I have come to this conclusion: Starting today, we will begin formally requesting answers to a limited but fundamental set of methodological questions for every public poll on the primary election released, for now, in a limited set of contests: Iowa, New Hampshire and South Carolina, plus national surveys.
We are starting today with requests emailed to the Iowa pollsters and will work
our way through the other early states and national polls over the next few
weeks, expanding to other states as our time and resources allow.

These are our questions:

  • Describe the questions or procedures used to select or define likely voters or likely caucus goers (essentially the same questions I asked of pollsters just before the 2004 general election).
  • The question that, as Gary Langer of ABC News puts it, "anyone producing a poll of 'likely voters' should be prepared to answer": What share of the voting-age population do they represent? (The specific information will vary from poll to poll; more details on that below.)
  • We will ask pollsters to provide the results of demographic questions and key attribute measures among the likely primary voter samples. In other words, what is the composition of each primary voter sample (or subgroup) in terms of gender, age, race, etc.?
  • What was the sample frame (random digit dial, registered voter list, listed telephone directory, etc.)? Did the sample frame include or exclude cell phones?
  • What was the mode of interview (telephone using live interviewers, telephone using an automated, interactive voice response [IVR] methodology, in-person, Internet, mail-in)?
  • And in the few instances where pollsters do not already provide it, what was the verbatim text of the trial heat vote question or questions?

Our goal is to both collect this information and post it
alongside the survey results on our poll summary pages, as a regular ongoing
feature of Pollster.com. Obviously, some pollsters may choose to ignore some or all of our requests, but if they do, our summary table will show it. We are starting
with Iowa, followed by New
Hampshire, South
Carolina and the national surveys, in order to keep
this task manageable and to determine the feasibility of making such requests
for every survey we track.

Again, keep in mind that the ethical codes of the
professional organizations of survey researchers require that pollsters
adequately describe both the population they surveyed and the "sample frame"
used to sample it. The Code of
Ethics
of the American Association for Public Opinion Research, for
example, lists "certain essential information" about a poll's methodology that
should be disclosed or made available whenever a survey report is released. The
relevant information includes:

The exact wording of questions asked . . . A
definition of the population under study, and a description of the sampling
frame used to identify this population . . . A description of the sample
design, giving a clear indication of the method by which the respondents were
selected by the researcher . . . Sample sizes and, where appropriate,
eligibility criteria [and] screening procedures.

The Principles of
Disclosure
of the National Council on Public Polls (NCPP) and the Code of Standards and Ethics
of the Council of American Survey Research Organizations (CASRO) include very
similar disclosure requirements.

We should make it clear that we could ask many more
questions that might help assess the quality of the survey or help identify methodological
differences that might influence the results. We are not asking, for example, about response rates, the method used to
select respondents within each household, the degree to which the pollster
persists with follow-up calls to unavailable respondents, or the time of day at which they conduct interviews. We have limited our requests to try to make
it easier for pollsters to respond while also focusing on the issues that seem
of greatest importance to the pre-primary polls.

What can you do? Frankly, we would appreciate your support. If
you have a blog, please post something about the Pollster Disclosure Project
and link back to this entry (and if you do, please send us an email so we can
keep a list of supportive blogs). If not, we would appreciate supportive
comments below. And of course, criticism or suggestions on what we might do
differently are also always welcome.

(After the jump - a more exhaustive list of the questions that
we will use to determine the percentage of the voting-age population
represented by each sample)

Appendix - Primary
Voter Sample as a Percentage of Adults

We aim to report the percentage of the voting-age population
represented by each primary sample. In most cases, primary surveys have
reported results from samples of likely Democratic and Republican primary
voters, although a few have reported results for only one party.

1) If the pollster has collected and reported results from a
sample of adults, this calculation is relatively easy: What is the size of each
primary voter subgroup (the respondents that answered the primary vote
questions for each party) as a weighted
percentage of the full sample? More specifically, we need to know:

What is the weighted n of respondents asked the Democratic primary trial heat questions (or the size of the subsample as a weighted percentage of all interviews)?

What is the weighted n of respondents asked the Republican primary trial heat questions (or the size of the subsample as a weighted percentage of all interviews)?
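
For readers who want to see the case-1 arithmetic, here is a minimal sketch in Python. All of the numbers are hypothetical and stand in for figures a pollster would report; no actual poll is represented.

    # Case 1, sketched with entirely made-up numbers.
    # For simplicity, the weighted counts are treated as given.
    total_adults = 1000      # weighted base: all completed adult interviews
    weighted_n_dem = 240     # weighted n asked the Democratic trial heat questions
    weighted_n_rep = 210     # weighted n asked the Republican trial heat questions

    # Each primary subsample as a share of the adult (voting-age) sample:
    dem_share = weighted_n_dem / total_adults   # 0.24
    rep_share = weighted_n_rep / total_adults   # 0.21

    print(f"Democratic primary sample: {dem_share:.0%} of adults")  # 24%
    print(f"Republican primary sample: {rep_share:.0%} of adults")  # 21%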

2) If the pollster screened out some contacts in order to
report results from all registered voters or all likely voters, we also need to
know:

How many otherwise eligible adults
were terminated on screen questions concerning voter registration, votes cast in
previous general elections or intent to vote in the upcoming general election?

3) If the pollster screened for and reported the results for
likely primary voters only (regardless of the type of sample used), we need to
know:

How many otherwise eligible adults
were terminated on screen questions concerning intent to vote in the primary
election (or caucus) or past votes cast in primary elections (or caucuses)?
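
Cases 2 and 3 change the denominator: contacts terminated on these screens still belong to the adult base from which the sample was drawn. Another sketch, again with invented counts and ignoring weighting for simplicity:

    # Cases 2 and 3, sketched with hypothetical counts.
    completed_likely = 600        # completed interviews with likely primary voters
    screened_registration = 150   # terminated on registration/general-election screens (case 2)
    screened_primary = 450        # terminated on primary intent/vote-history screens (case 3)

    # All otherwise-eligible adults contacted:
    adult_base = completed_likely + screened_registration + screened_primary

    incidence = completed_likely / adult_base
    print(f"Likely primary voters: {incidence:.0%} of eligible adults contacted")  # 50%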

4) Some pollsters use "over sampling" to fill a quota of
completed interviews of primary voters of each party, such as 600 Democrats and
600 Republicans. They call a random sample of all adults or all registered
voters until they fill the quota for likely primary voters in one party's
primary, then screen out likely primary voters of the first party until they
fill the quota for the second (in other words, once they interview 600 Democrats, they terminate likely Democratic primary voters until they interview 600 Republicans). In this case, we need to know:

How many otherwise eligible likely
primary voters were terminated due to a filled quota for primary voters, and in
which party primary did they intend to vote?
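
The wrinkle in case 4 is that respondents terminated only because a quota had already been filled did qualify as likely primary voters, so they belong in the numerator for their party. A sketch with hypothetical counts:

    # Case 4, the quota adjustment, sketched with made-up numbers.
    completed_dem = 600          # Democratic quota filled
    completed_rep = 600          # Republican quota filled
    quota_terminated_dem = 180   # likely Democratic voters turned away after the quota filled
    non_qualifiers = 900         # adults screened out as unlikely primary voters

    adult_base = completed_dem + completed_rep + quota_terminated_dem + non_qualifiers

    # Quota-terminated Democrats count as qualified Democratic primary voters:
    dem_share = (completed_dem + quota_terminated_dem) / adult_base  # ~34%
    rep_share = completed_rep / adult_base                           # ~26%

    print(f"Likely Democratic primary voters: {dem_share:.0%} of adults contacted")
    print(f"Likely Republican primary voters: {rep_share:.0%} of adults contacted")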

5) If the pollster sampled from a list of registered voters,
we also need to know the size of the target population they sampled in order to
determine the percentage of the voting-age population represented by each
primary sample.

If the pollster drew a sample of
all registered voters or all registered partisans, the size of the target
population should be available through the statistics reported by the
official registrar of voters (usually the Secretary of State) in each state.

If the pollster used prior vote
history or other criteria to narrow the target population, its size will
not be available through public sources. If so, we need to know the size (or
"count") of the sampled target population as determined by the sample vendor.
