09/19/2007 08:45 pm ET | Updated May 25, 2011

Public Opinion Surveys and Pollcats: If It Stinks, It's Probably a Skunk

The following piece was produced through OffTheBus, a citizen journalism project hosted at the Huffington Post and launched in partnership with NewAssignment.Net. For more information, read Arianna Huffington's project introduction. If you'd like to join our blogging team, sign up here. If you're interested in other opportunities, you can see the list here.

"Bad wording is often the deliberate result of interested parties whose aim is to generate specific responses. One of the best tests of a poll is your reaction to it. Does it seem fair and unbiased? This 'smell test' is not foolproof, however. Seemingly innocent variations in phrasing such as 'aid to needy' vs. 'public welfare programs' can produce very different results." --- Daniel Yankelovich, 2002 PBS interview with Bill Moyers

Face it. There are polls and there are polls. The notion of a perfect poll, one that accurately and fairly measures and reflects the opinions of "the majority of likely voters" everywhere, every time, is just that -- a notion. And it's a dangerously far-fetched one. All polls are not created equal.

I recently had the chance to communicate with a real, live person who has a weighty opinion about what "some polls" are like. Rick Beaule, a schoolteacher from Pennsylvania, worked his way through college; in 1992-93 that meant a job at Intersearch, a market research firm based in Horsham, PA. The company was contracted to conduct surveys of various types -- a quality-control survey for Cigna Healthplan, a survey of TV coverage of the '92 Winter Olympics. Then there was the other one. It was political.

"It was the fall of '92, I believe, when a survey came in for a political race in the Philadelphia area," Mr. Beaule wrote. "At first the survey seemed as straightforward as the [others], asking for a general rating of each of the two candidates, but I soon began to notice differences.

"Among the most striking were long paragraphs...that we were to read before asking a rating question. Often these paragraphs would contain words analogous to the following: 'If I were to inform you that Candidate X voted to raise taxes six times during his tenure in the legislature, how would that affect your [opinion] of the candidate? Positively, somewhat positively, not at all, somewhat negatively or very negatively?'

"Following that was another such question preceded by a paragraph detailing pay raises the candidate had supposedly voted for himself.

"[Then] was a long paragraph detailing something the other candidate had done that was positive, followed by a rating question about [him].

"Finally the survey asked the overall ratings of the two candidates again.

"It was plain to me that this survey was not intended to obtain opinions so much as to sway them. This seemed unethical to me. I went to my supervisor who sent me to the branch manager who stated that this was what we were hired to do and that the survey had been vetted by experts."

Mr. Beaule requested -- and was granted -- permission to be removed from participating in that survey.

A female voter from the Midwest tells me she's been polled this primary season. Another telephone opinion poll. How did she feel about Hillary Clinton? Did she support HRC for the Democratic nomination? After responding that Clinton was not her first choice of candidates, as she remembers it, she was asked the following question: "Given that Senator Clinton supported allowing women who have just given birth to stay in the hospital for a minimum of 48 hours [after delivery], would you say your opinion of her is now much more favorable, more favorable, less favorable or much less favorable?"

Now, what sane woman in America would say her opinion was anything but "much more favorable" to a question like that? A childless, menopausal insurance company exec, maybe? Where's the positive percentage in such a weighted question? Whose "favorables" might rise accordingly?

"I felt that was kind of like asking if 'you still beat your wife,'" this voter went on. "I asked her [the pollster] questions about some of her questions but she didn't respond, which made me suspect the call was being monitored. She talked very fast, too."

A Virginia voter shares her experience: "I was polled during Democrat Tim Kaine's race for governor. I was asked a lot of questions about religion -- like 'Would I be okay having a non-Christian governor?' Kaine is Catholic, which was apparently not Christian according to the crazy Southern Baptist zealots around here."

The Milwaukee Journal Sentinel Online reports that Oregon-based Moore Information called hundreds of Wisconsin voters about the '04 presidential race. Among the questions asked was this one: "Whose position do you think is closer to the truth -- those 'veterans who served with John Kerry' and say that he does not deserve the medals that he received, or John Kerry who disagrees with the veterans that he served with and who appear in the [swiftboaters] ad?"

This kind of push polling raises the questions: Who paid for this poll? Who stands to gain from this kind of "question"? Am I being manipulated?

Some polls are about as "fair and balanced" as Fox News, as "No Spin..." as Bill O'Reilly. They're not in the business of asking what we think; they're engineered to tell us what to think. And there's money to be made for doing it. Maybe they think we're a few watermelons shy of a truckload. Well, we're smarter than they believe we are -- and we've all got noses. If something doesn't smell right, we know it. It's time to compare notes, ask questions and demand answers.

Scratch and sniff, America.
