HuffPost Politics: The Blog


Mark Blumenthal

An Automated Pollster Interviews by Cell Phone


Will this be the year that "cell phone only" voters wreak havoc on the results of pre-election polls? And does the cell phone only problem doom pollsters that depend on automated, recorded-voice methodologies? Two recent polls from SurveyUSA suggest the answers are not as obvious as some may think.

Let's start with the first question. SurveyUSA, a company that has been conducting recorded-voice surveys for local television news stations for nearly twenty years, has recently released two statewide surveys based on dual samples of both landline and mobile phones. In both cases, including cell-phone-only voters interviewed over their cell phones made little difference in the results. Their recent Washington poll, for example, shows Democratic Senator Patty Murray leading by a not-statistically-significant four-point margin (37% to 33%) over challenger Dino Rossi in a combined sample of landline and mobile phones. Murray's lead would have been a virtually identical five-point margin (39% to 34%) had they interviewed by landline phone only.
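To see why a four-point lead can fail to reach statistical significance, consider the standard error for the difference between two candidates' shares drawn from the same sample. The sketch below assumes a sample of roughly 600 likely voters, since the exact sample size is not quoted above; with the actual n the arithmetic would shift slightly, but the logic is the same.

```python
import math

def lead_significant(p1, p2, n, z=1.96):
    """Test whether the lead (p1 - p2) between two candidates polled
    in the same sample of size n is statistically significant at the
    95% level. Uses the standard error for the difference of two
    multinomial proportions: sqrt((p1 + p2 - (p1 - p2)**2) / n)."""
    diff = p1 - p2
    se = math.sqrt((p1 + p2 - diff ** 2) / n)
    margin = z * se  # half-width of the 95% confidence interval
    return diff, margin, abs(diff) > margin

# Murray 37% vs. Rossi 33%; n = 600 is an assumption for illustration.
diff, margin, significant = lead_significant(0.37, 0.33, 600)
# The margin on the *difference* works out to roughly 6.7 points,
# larger than the 4-point lead, hence "not statistically significant."
```

Note that the margin of error on a lead is wider than the familiar margin quoted for a single percentage, which is why leads that look comfortable often are not.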

Similarly, a North Carolina survey released just yesterday shows Republican Richard Burr leading Democrat Elaine Marshall by ten points (46% to 36%) in the combined sample interviewed over both landline and cell phones. Burr would also have led by a 10-point margin (47% to 37%) had they interviewed all respondents via landline phones only.

These are just two surveys, of course. A more comprehensive assessment of national data gathered by the Pew Research Center earlier this year found that, "weighted estimates" from a large landline sample "tend to slightly underestimate support for Democratic candidates when compared with estimates from dual frame landline and cell samples in polling for the midterm congressional elections this year." But if that slight understatement is real, it may not produce many "significant" differences, either statistically or substantively, in individual statewide surveys.

What is more interesting here, however, is that an automated pollster managed to conduct a "dual frame" survey at all. The underlying story gets us closer to an answer to the issue of the impact of cell phones on automated surveys.

Some background: Pollsters have a harder time interviewing Americans on their cell phones because of the provision in the Telephone Consumer Protection Act of 1991 (TCPA) that places restrictions on unsolicited calls to mobile phones. As explained by the Marketing Research Association:

The TCPA forbids calling a cell phone using any automated telephone dialing system (autodialer) without prior express consent. This rule applies to all uses of autodialers and predictive dialers, including survey and opinion research.

Virtually all pollsters use some form of "autodialer" to place calls to landline respondents, so virtually all are affected by the TCPA's restrictions on calls to cell phones. With the exception of CBS News (the only operation I know of where interviewers still hand-dial each number), live-interviewer pollsters rely on a computerized interviewing system that dials the phone so interviewers don't have to. Some also use "predictive dialers" that place calls and connect a respondent to an interviewer only once a live person answers (the source of that annoying pause familiar to anyone who has answered a call from a telemarketer). Finally, all recorded-voice pollsters use an automated dialing system for their complete process, though they could theoretically begin with a live interviewer and then hand off, with the respondent's consent, to an automated interview.

So when live-interviewer pollsters want to interview respondents on their cell phone, their interviewers need to place the calls manually. Their process becomes less efficient and more expensive, but they do not face a total barrier.

Pollsters that use a recorded-voice methodology face a much bigger problem. Yet somehow, SurveyUSA managed to interview voters in North Carolina and Washington over their cell phones. How did they do it? They used live interviewers:

Cellphone numbers were dialed one at a time, cellphone respondents were interviewed by call center employees. Landline respondents heard the recorded voice of a SurveyUSA professional announcer.

In North Carolina, SurveyUSA used more expensive live interviewers to conduct 404 out of 1,000 interviews, although only 250 of those were in cell-phone-only households (see their methodology statement for more details).

So while this approach amounts to a technical solution to the challenge of reaching cell-phone-only households, it creates a huge challenge to the underlying business model of automated pollsters like SurveyUSA. Consider the chart below, prepared by SurveyUSA CEO Jay Leve for a presentation last year. It suggests that in this case, their costs were somewhere between triple and quadruple what they would have been had they done all interviews using a recorded voice methodology.
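The tripling or quadrupling of costs is consistent with simple blended-cost arithmetic. The sketch below uses the North Carolina interview counts from the release; the live-to-automated per-interview cost ratio is an assumption for illustration, since the actual figures appear in Leve's chart rather than in the text.

```python
def cost_multiplier(n_live, n_auto, live_to_auto_ratio):
    """Total cost of a mixed-mode survey relative to conducting every
    interview by recorded voice (per-interview automated cost = 1)."""
    total = n_live * live_to_auto_ratio + n_auto
    all_automated = n_live + n_auto
    return total / all_automated

# North Carolina: 404 live-interviewer completes, 596 recorded-voice.
# If a live interview costs ~6x an automated one (assumed ratio),
# the blended survey costs roughly triple an all-automated one.
multiplier = cost_multiplier(404, 596, 6)
```

Under this arithmetic, a ratio in the range of six to eight reproduces the "triple to quadruple" figure, which shows how quickly a minority of live interviews can dominate the budget.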

[Chart: SurveyUSA interviewing costs by mode, prepared by CEO Jay Leve]

Any other lessons here?

First, this issue provides another demonstration that not all automated surveys are created equal. In this case, SurveyUSA actually conducted more of a "mixed mode" poll, combining both recorded-voice and live interviewers.

Second, not all "dual frame" surveys that combine landline and cell phone samples are created equal either. Pollsters have to decide whether to use the cell phone samples to reach just the "cell phone only" households, or whether to also include the "cell phone mostlys." And either way, they need to decide how to weight the combined samples, often without reliable estimates of the percentage of cell-phone-only households at the state level (see the SurveyUSA release and the Pew Research report for more detail).
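The weighting decision described above can be sketched as a simple frame blend. The numbers below are hypothetical (the 22% cell-phone-only share and the candidate estimates are assumptions for illustration, not figures from either release); the point is that the blended result moves with the cell-only share estimate, which is exactly the quantity pollsters often lack at the state level.

```python
def blend_estimates(landline_est, cell_only_est, cpo_share):
    """Combine a landline-frame estimate with a cell-phone-only
    estimate, weighting each frame by its estimated share of the
    population. cpo_share is the estimated fraction of cell-phone-only
    households, which often must be guessed at the state level."""
    return (1 - cpo_share) * landline_est + cpo_share * cell_only_est

# Hypothetical example: a candidate at 39% among landline respondents
# and 34% among cell-only respondents, with cell-only households
# assumed to be 22% of the population.
blended = blend_estimates(0.39, 0.34, 0.22)
```

If the assumed cell-only share is wrong, the blended estimate is biased in proportion to the gap between the two frames, which is why the weighting choice matters most when landline and cell respondents differ.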

Third, as is true for many aspects of poll methodology, pollsters could do a better job disclosing the procedures and methods they use to interview Americans over their cell phones and combine those results with interviews conducted via landline phones. CBS News, for example, tells us only that the numbers for their just-released survey "were dialed from random digit dial samples of both standard land-line and cell phones." The release for the NBC News/Wall Street Journal poll tells us that their sample of 1,000 adults included "200 reached by cell phone," but nothing more. There are exceptions, of course -- most notably the Pew Research Center -- but they are few and far between.