12/22/2014 03:20 pm ET Updated Dec 06, 2017

Ambivalent Attitudes: The Public and Opinion Polls

Pundits have few kind words for politicians who consult public opinion polls when formulating policy. On the other hand, pollsters themselves consider polls no less than the voice of the people. But where does the public stand? A review of polls about polling, from the Roper Center for Public Opinion Research archive:

Polls: A Good Thing!

The first poll about polling, by the Office of Public Opinion Research, was conducted in 1944. Among the 57 percent who had heard of a "public opinion poll," 75 percent thought they were a good thing for the country. Just 2 percent thought they were bad, and 21 percent had no opinion. This pattern of response remained fairly stable over many decades.


Despite this positive overall appraisal of the role of public opinion surveys, the public remains significantly skeptical of polling. In a 1999 Gallup survey, just 4 percent said they had a great deal of trust in polls, and 34 percent a fair amount. Sixty-one percent said they had not much trust or none at all. Still, in 2001, the Kaiser Family Foundation found 84 percent agreed that "polling is far from perfect, but it is one of the best means for communicating what the public is thinking."

But how accurate are they?

Americans' somewhat contradictory attitudes may originate in reservations about polling accuracy. In the 1944 Office of Public Opinion Research poll, 55 percent of those who had heard of a public opinion poll believed polls were pretty nearly right about policy issues most of the time; 12 percent said not right at all. A third didn't know. When Gallup asked that question again in 1996, 64 percent thought polls were pretty nearly right.


Similarly, the proportion of Americans saying that polling's record in predicting elections has been "pretty nearly right most of the time" stayed relatively stable from 1944 through 1956, despite the well-publicized 1948 error in predicting Dewey over Truman. When the question was asked again in 1996, there was little change. But a series of Gallup questions in 1965, 1975, and 1988 found somewhat lower numbers thought election polling's record of accuracy had been excellent or good, with very low numbers saying excellent. Although it is possible that these questions corresponded to a period of lower-than-usual confidence in election polling, it is as likely that, given the significant consequences of even a one or two percentage point error in predicting election outcomes, "pretty nearly right" may seem only "fair" to some Americans.


Does the public understand how polls work?

Whether the public has faith in polling results or not, one thing is clear: they don't understand how polling works. In 1948, Gallup asked respondents to describe in their own words how a public opinion poll was conducted. Just 15 percent said polls relied upon a representative sample, while 37 percent mentioned some sort of interviewing without mentioning representativeness. Forty-one percent admitted they didn't know. The idea of a representative sample did not become much clearer with time. When asked in a 2005 Gallup poll whether a random sample of 1,000 US adults accurately reflected the views of the nation's population, just 30 percent thought it did; 68 percent believed it did not. And a 2001 Kaiser Family Foundation poll asked how important the difference was between a poll of randomly selected telephone numbers and a poll of people who called an 800 number advertised on TV. Only 37 percent considered this difference in methodology very important, and 31 percent somewhat important; 28 percent considered it not too or not at all important.

The public is equally confused about what pollsters mean by margin of error. A 2007 Harris poll asked respondents which types of error were included in the term. While sampling error was considered part of the margin of error by the largest proportion of respondents (69 percent), significant numbers also thought "margin of error" covered question-wording errors (66 percent), errors in developing a representative base or weighting errors (45 percent), mistakes by interviewers (45 percent), and question-placement errors (40 percent).
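The confusion is understandable, because the reported figure reflects only sampling error. As a rough illustration (not drawn from any of the surveys above), the standard 95 percent margin of error for a simple random sample can be sketched like this, using the conservative assumption that the true proportion is 50 percent:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% sampling margin of error for a simple random sample of size n.

    Uses z = 1.96 (the 95% confidence level) and, by default, p = 0.5,
    the worst-case (maximum-variance) proportion.
    """
    return z * math.sqrt(p * (1 - p) / n)

# The sample size the 2005 Gallup question asked about:
moe = margin_of_error(1000)
print(f"+/- {moe * 100:.1f} percentage points")  # about +/- 3.1
```

For the 1,000-person sample the 2005 Gallup question described, this works out to roughly plus or minus 3 percentage points. Note the diminishing returns: quadrupling the sample to 4,000 only halves the margin, and none of this accounts for the question-wording, weighting, or interviewer errors that respondents in the Harris poll lumped into the term.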

Should politicians follow the polls - or their guts?

Despite reservations about polling accuracy and the scientific validity of survey methodology, the public actually believes that leaders should pay more, not less, attention to polling results. Gallup has found that, since 1996, majorities think the country would be better off if leaders paid more attention to public opinion polls.


In 2001, the Kaiser Family Foundation found a significant discrepancy between the amount of attention people believed politicians actually paid to polls and the amount people thought they should pay. Twice as many Americans said that government officials should pay a great deal of attention to polls as believed they actually did. While most in the same poll believed officials in Washington should pay a great deal of attention to polls on a range of domestic matters, fewer said the same about foreign policy. In specific foreign policy dilemmas, however, perceptions of whether a politician should pay more attention to the polls may depend on the particular leader and the state of public opinion. A Fox News/Opinion Dynamics poll of registered voters in 2007, for example, found 48 percent thought that George W. Bush should follow the polls on Iraq, while just 36 percent said he should follow his gut.


In general, a politician's gut may be less valued by the public than is commonly believed. A Fox/Opinion Dynamics poll in May 2000 found that 19 percent said elected officials should pay the most attention to their own knowledge and conscience, about the same number as said they should pay the most attention to experts (20 percent), constituents who contact them (22 percent) — and polls (18 percent).