My column this week examines whether the success of automated polling, often known by the acronym IVR (for Interactive Voice Response), in predicting the outcomes of this year's elections extends to polls on other issues, especially health care reform. Please click through and read it all.
The column quotes the pollsters at the three most prominent firms that conduct automated polling, SurveyUSA, Rasmussen Reports and Public Policy Polling (PPP). Since I quoted each only briefly in the article, and since their comments were all far more extensive and on-the-record, I am sharing them here verbatim.
I asked each to respond to this passage of a polling review from former George W. Bush deputy chief of staff Karl Rove:
Automated polling firms like SurveyUSA and Rasmussen have drawn criticism in the health care debate for showing Americans significantly more opposed to reform than traditional pollsters who use live interviewers. Yet on Tuesday, automated polling firms like Rasmussen were significantly more accurate than conventional competitors. Voters who stay on the phone to answer the questions of an automated pollster may more accurately represent the electorate in off-year elections when turnout is lower and only the most enthusiastic voters are likely to turn out. If so, Democrats who face re-election next year should start worrying--automated pollsters' results showing a majority of Americans opposed to health care reform may be the most prescient look at what lies in store for next year's midterms.
Scott Rasmussen, Rasmussen Reports
First, I am pleased that Karl Rove noted how "automated polling firms like Rasmussen were significantly more accurate than conventional competitors" in polling the New Jersey Governor's race.

Only part of that success can be attributed to the automated methodology. Much of it has to do with the way that we measured the support of nominal supporters of Daggett and undecided voters. Our survey model helped us project Daggett's actual vote total more closely than our competitors did.
As a result, I continue to believe that you can do a good automated poll or a good operator-assisted poll. You can also do a bad poll using either method. Automated systems clearly have an advantage when it comes to consistency in tracking polls, but there may be areas where operator-assisted polls have an advantage as well.
As for the health care debate, the methodology issue has little to do with it because all polls show a plurality or majority opposed to the health care plan working its way through Congress. On the Pollster.com site, the average results show 49.6% opposed and 41.8% in favor, a gap of just under 8 points. Our latest polling at Rasmussen Reports shows 45% in favor and 52% opposed, a 7-point gap.
I do believe Democrats should be concerned because the health care debate has become a lose-lose situation for them. But it's not because automated polls show a different result. It's because all polls send the same message. The health care issue is complex and very challenging to measure. But the overall messages from polling using both automated systems and operator-assisted approaches are quite similar. Most Americans are at least somewhat happy with their own coverage and quality of care. Anything that would force them to change is going to create political problems. Competition and choice are seen as good things. And there is a strong desire to reduce the cost of health care along with a skepticism about the ability of our political process to accomplish that goal.
Jay Leve, SurveyUSA
Recorded-voice telephone polls are not inherently superior.
Recorded-voice telephone polls are not inherently inferior.
True: when asked yes/no questions about personal conduct - such as: "Do you have unprotected sex?" or "Do you drink alone?" - respondents who answer by pressing a button or checking a box report higher incidences than respondents who must "confess" to a human.
But: I don't think you can argue, on an issue as complicated as health-care, that mode trumps. I could draft two health-care questions today, and produce conflicting results tomorrow, one that shows support for reform, the other that shows opposition. And I could do that regardless of whether the research was conducted by US mail, mall intercepts, headset operators, professional announcers, or email.
Too many poll watchers are mode-fixated. Often, mode is the least of it.
Tom Jensen, Public Policy Polling (PPP)
IVR polls were more accurate than live interviewers in New Jersey and Virginia at calling the horse race. That does not mean IVR is superior to live interviewers on every kind of question that ever gets polled. It does mean that IVR polls should be taken as seriously as any other polls on most measures of public opinion--they deserve to be a part of the discussion. They should not be ignored on issues like health care and Obama's approval.
That said, I think Rasmussen's Republican-friendly numbers on things like Obama's approval and health care are more a result of his polling likely voters, presumably for the midterm elections, than an IVR vs. live interviewer thing. We saw last Tuesday that GOP voters are a lot more fired up right now, so it's not surprising they're more likely to pass an off-year voter screen. We model our monthly national approval polls on a presidential-year electorate because of the 2012 horse race polling we do, and we find Obama with numbers more similar to the live interviewer national pollsters than to Rasmussen's. That's a sampling issue rather than a mode issue.
There are good live interviewer polling outfits and bad ones. There are good IVR polling outfits and bad ones (particularly the sort of fly-by-night ones that aren't a consistent presence on the polling scene). What I want to see is not for everyone to think that IVR polls are superior, but for people to judge individual polling companies on their actual merits and not how they conduct their interviews.
I'm not sure if that gets to the heart of what you're looking for, and if you have any specific questions I'm happy to answer, but those are my overall feelings--no individual poll should be treated as if it's the one and only accurate one, but all polls with a track record of accuracy, so long as they're transparent about their methodology, deserve to be taken seriously.
Follow Mark Blumenthal on Twitter: www.twitter.com/MysteryPollster