
Mark Blumenthal, Polling Expert, Answers Your Questions (LIVE Q&A)


With Barack Obama's reelection announcement and ever more frequent public statements from Republican primary contenders, the 2012 election cycle is kicking into gear. As with every cycle, there will be a large quantity of polls, using a wide variety of methodologies, questioning the public on almost every election-related topic imaginable.

Mark Blumenthal, the senior polling editor at the Huffington Post, will be helping election watchers sort through all of these polls. Blumenthal has worked in the political polling business for more than 20 years, conducting and analyzing political polls and focus groups for Democratic candidates and market research surveys for major corporations. He also founded Pollster.com, a polling website acquired by the Huffington Post in July 2010, and served as a polling analyst at the National Journal.

Today, from 2:30 to 3:30 p.m. EST, Mark will be answering your questions about polling. If you want to ask Mark a question, leave a comment below or tweet your question using the hashtag #pollchat.

Like this Q&A? Follow HuffPostLive on Twitter and Facebook to learn about the next Q&A.

Mark Blumenthal Q&A (live blog)


And that's about all the "bandwidth" I have today. I promise I'll check in here tomorrow and answer any really good questions as time allows.

Thanks to my HuffPost helpers -- and they know who they are -- for pulling this off. Looking forward to doing this again soon!


I've got time for one more (before I pass out cold on my keyboard):

Globality asks, in the comments:

"How would you go about polling a national issue

How large of a sample. How geographic­ally dispersed. How to phrase the question. How to include race, religion, etc.."

Um...you want the elevator version?

And sorry to be a wise guy. It's a good question that gets at what I've been trying to do since I started blogging about polling back in 2004: try to take the mystery out of an often mysterious topic. How *do* you interview 600 or 1000 people and "scientifically" estimate the opinions of a few hundred thousand or 230 million?

The answer isn't easy to bang out in a live blog, but the most important thing is that we start with a random sample. That's the easy part. Actually achieving a random sample in practice is harder, and has been getting harder over the last few decades. Some would argue that it has become impossible and that we need to toss out the concept of true "random sampling" altogether (see these two columns from 2010 for the deep dive on that one).
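
A quick illustration of the math behind that 600-or-1,000-interview claim: under idealized simple random sampling, the margin of error depends on the number of interviews, not on the size of the population being estimated. Here is a minimal sketch in Python using the standard 95 percent formula (real polls apply weighting and design effects that widen these figures):

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample.

    Uses the textbook formula z * sqrt(p * (1 - p) / n); p = 0.5 is
    the worst case. Real-world polls are rarely this clean.
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# Sample size, not population size, drives the error bound.
for n in (600, 1000, 2000):
    print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} points")
# n = 600:  +/- 4.0 points
# n = 1000: +/- 3.1 points
# n = 2000: +/- 2.2 points
```

The hard part, as noted above, is that the formula assumes every person had an equal chance of being interviewed, which is exactly the assumption that has become harder to satisfy in practice.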

But that argument, and your question, are an ongoing theme of our work at HuffPost Pollster. I hope you'll tune in if you're not already a regular reader.


Dan Fissel has a great question in the comments:

“Why do pollsters use special word choices to skew the results of a poll, or to get the right answer they want? Isn't it better to have an accurate result from a poll, or do polls have that much effect on polls?”

It sounds like you'll disagree, but I don't think many media pollsters intentionally manipulate their wording to produce skewed results. But we all have biases, conscious and unconscious, and these can seep into judgments about question wording.

A few years ago, there was a big fight over questions about the Terri Schiavo case. I got an email from an old friend, now a poli-sci prof, who had what I think is the best answer to this question. In short, we often wrongly assume:

that there is a "right" or "unbiased" way to ask a question about any given public issue. There is no such thing. Everyone who works within the polling field is well aware that small changes in wording can affect the ways in which respondents answer questions. This approach leads us into tortuous discussions of question wording on which reasonable people can differ... The answer is NOT to find a single poll with the "best" wording and point to its results as the final word on the subject. Instead, we should look at ALL of the polls conducted on the issue by various different polling organizations. Each scientifically fielded poll presents us with useful information. By comparing the different responses to multiple polls -- each with different wording -- we end up with a far more nuanced picture of where public opinion stands on a particular issue.

I also wrote about this notion more recently here.
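
To make that "look at all of the polls" advice concrete, here is a toy sketch (every number is invented for illustration) of how you might combine several differently worded polls into one rough summary while keeping an eye on how much the wording moves the results:

```python
# Hypothetical results from polls asking about the same issue with
# different question wording. All numbers are invented for illustration.
polls = [
    {"pollster": "Poll A", "support_pct": 52, "n": 1000},
    {"pollster": "Poll B", "support_pct": 47, "n": 800},
    {"pollster": "Poll C", "support_pct": 55, "n": 600},
]

# A sample-size-weighted average gives one rough summary...
total_n = sum(p["n"] for p in polls)
weighted_avg = sum(p["support_pct"] * p["n"] for p in polls) / total_n

# ...but the spread across wordings is itself part of the story:
# a wide range suggests opinion is sensitive to how the question is asked.
spread = max(p["support_pct"] for p in polls) - min(p["support_pct"] for p in polls)

print(f"Weighted average support: {weighted_avg:.1f}%")  # 51.1%
print(f"Range across wordings: {spread} points")         # 8 points
```

The point isn't the particular average; it's that the range across wordings tells you something the "best" single poll never could.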


Artanis1 asks via comments:

In regards to the presidential election: with polling already in full swing, how accurate is it really? There is so much time from now until the actual election; is there any history that shows taking polls at this point provides any kind of realistic view of the results for next year?

(I mean to put the primary polls aside, I know they are much closer)

Great question that, unfortunately, depends on a few things. Polls on the presidential general election conducted in the last few weeks of the campaign have been amazingly accurate -- freakishly so, to be honest -- given all the challenges that surveys have been facing in recent years. As you move backward in time, they tend to get more variable and, as you anticipate, less accurate. Still, once the nominees are known, and except right around the conventions, they're pretty consistent during the election year.

Primaries are less so, although as Nate Silver has noted recently, even early presidential primary polling tends to be more accurate when there's a dominant, well-known frontrunner who begins with a big early lead. That may be especially true in the Republican primaries, many of which were still winner-take-all until 2008.

But take note: both will be different next year. There will be no dominant frontrunner, and none of the Super Tuesday races will be winner-take-all.


Seems like these questions could use a headline, yes? Adding one starting now (told you this was an experiment)...

Ted Bryan asks, via the comments:

"What's your overall read on all of the recent polling about whom Americans would blame for a shutdown? I've seen some polling results that seem contradict­ory."

I've written my last two pieces on that subject, here and here.

The short version: The questions on this subject are producing some unexpected and not necessarily intuitive variation based on minor differences in wording and (perhaps) question order. Usually when that happens, it means lots of people don't have real attitudes -- i.e., in this case, many probably have no idea who they would blame if a shutdown they haven't heard much about yet actually happens.


Via the comments from Joe Lenski (who, for those unfamiliar, is the guy who runs the national exit polls conducted for the television networks and the AP every election year):

“When are we going to start seeing charts on pollster.com for the 2012 Republican Presidential Primaries -- both national and the early states?”

Short and slightly evasive answer: very soon. We certainly have enough polls at this point to allow for charts of the national head-to-heads and a few early states, especially New Hampshire.

Our brilliant and always overworked tech team has been busy with this merger you may have read something about, so we've been a little overdue on a few new features. But we're on it.

Meanwhile, Joe, want to take some questions on exit polls?


By the way...apologies for the inevitable typos. Regular readers from the Pollster.com and MysteryPollster days know I'm a terrible proofreader. And this live blog isn't being copy edited.

Ok...next...Robert Nix asks via the comments:

“One out of four homes in America no longer has a landline. Do most pollsters still use mostly landlines, or are they using a mix? How does it feel to be on the other side of the questions?”

Feels great. Wish I could type (and think) faster, though...

When it comes to national media polls, it's pretty much de rigueur to do parallel samples of landline and cell phones. The result from 2010 that I mentioned a few questions back is the main reason. The automated (they don't like to be called "robo") pollsters are an exception, because federal telecom law specifically prohibits calling any cell phone with an automated dialing system. Pollsters have to have their callers hand-dial cell phones.

At the statewide level, we're seeing more and more pollsters shift to the same approach, but the shift has been much slower, partly because the marginal cost of hand-dialing cell phones is pretty big, especially if the pollster is trying to screen for "cell phone onlies." Also, there are far more pollsters doing robo surveys at the state level.

One exception has been the automated pollster SurveyUSA, which did parallel surveys in 2010 in some states (CA and WA if I remember right) that were automated to landlines and live caller to cell phones.


dpearl asks, via the comments:

“In the past you have been a supporter of pollsters signing on to professional association standards regarding transparency and the like. Can you briefly review what those standards are? Also, can HuffPost/Pollster make a point of recognizing those pollsters that follow such standards -- for example, by putting an asterisk next to survey reports that are in alignment with them?”

Past and present. And briefly? Ah, there's the rub.

For my money, the most important existing standards are part of the ethical codes of the American Association for Public Opinion Research (AAPOR) and the National Council on Public Polls (NCPP). The links in the previous sentence take you to their codes.

What they call for, in essence, is for pollsters to disclose enough so that the rest of us can understand how and when they did the poll and who it represents. The NCPP standards call for disclosure of the things most of us are used to seeing -- sample size, survey dates, margin of error, sponsorship -- but also two things that many pollsters omit from their reports: the mode (live interviewer, automated, internet, etc.) and the sample "frame" (random digits, a list, a panel). AAPOR mandates release of the response rate, which virtually no public pollsters include in their reports.

AAPOR is trying to increase disclosure big-time with its new Transparency Initiative. I wrote a column on this in 2010. I'm a big-time supporter.

Yes...we very much want to recognize pollsters that meet and exceed these standards. I'm hoping to share many more developments on this in the coming year.


Ok...since I'm obviously desperate to unmask @researchrants and since, well, he/she's the only one asking questions at the moment, I'll take his/her second question:

"Obligatory 'how much do cell phone interviews matter, given that cell phone only respondents don't much vote; question."

That used to be the conventional wisdom, particularly after the 2004 election. The number of Americans with a cell phone but no landline phone was still small enough, and tended to skew toward non-voters (lower income, urban) enough...that it didn't matter.

But with the cell-only population growing to roughly 25% as of last year, it seems to matter. The national generic vote was measured more accurately in 2010 by polls that included cell phone samples, and the difference between the two kinds of polls was pretty close to what the Pew Research studies suggested it should have been.

But an odd thing: At the state and House district level, where virtually all of the polls were landline only, we didn't see a similar skew. In fact, at the CD level, polls skewed toward the Democrats.
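
As a toy illustration of the coverage concern (all numbers invented), here is why a landline-only sample can miss the mark when the roughly one-in-four cell-only households differ from everyone else:

```python
# Invented numbers: suppose cell-only households (about 25% of the
# population) lean differently than landline households do.
cell_only_share = 0.25
support_landline = 0.48    # hypothetical candidate support, landline households
support_cell_only = 0.56   # hypothetical candidate support, cell-only households

# What a perfect census of everyone would find:
true_support = ((1 - cell_only_share) * support_landline
                + cell_only_share * support_cell_only)

# What an otherwise perfect landline-only sample would find:
landline_only_estimate = support_landline

print(f"True support:           {true_support:.1%}")           # 50.0%
print(f"Landline-only estimate: {landline_only_estimate:.1%}")  # 48.0%
# The 2-point gap is coverage bias: more landline interviews won't
# fix it, because the missing group is never reached at all.
```

That gap only appears, of course, if the cell-only group actually votes differently, which is why the 2010 results at the state and district level are the interesting wrinkle.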


@researchrants asks via twitter: "Has any work been done on whether R's with strong Tea Party backing (Bachmann, Angle, etc.) outperform poll expectations?"

First of all, welcome RR. For those who don't know him, "Rants" is a mysterious but knowledgeable anonymous blogger who works in market/survey research. Worth following.

I can't say we've tried to study that systematically, but anecdotally, it did seem like Tea Party candidates outperformed in GOP primaries in 2010. I'm thinking mostly of Angle and O'Donnell, as I'm sure you are. It's an interesting question, and one we need to watch.

I recall that in 2004, SurveyUSA's Jay Leve pointed out that "internet" candidates like Howard Dean seemed to slightly over-perform in automated polls.

By the way, RR, a clue? Just give us a hint...East Coast? West? University affiliation?


I'll start by saying hello, everyone, and welcome to this first, somewhat experimental live blog Q&A! I'm hoping to answer as many questions as I can in an hour, or before I pass out from exhaustion, whichever comes first. Hopefully the former. Here we go....



Filed by Jake Bialer