My point in this post is not to sort out which types of polls are more accurate than others, though that might come in a later post. Instead, I'm interested in looking at the average rate of polling error. Although there are a lot of things one could look at to judge the quality of a poll (question wording, sampling method, etc.), my guess is that most people focus on the bottom line--how close the poll finding is to the election outcome--when judging trial-heat polls. For better or worse, this is likely to be the case. Toward that end, I examine the error in Senate, House, and gubernatorial polls taken in the last fifteen days of the 2006 and 2008 campaigns.
The histograms below show the distribution of polling error across offices and years. In all of the figures, I compare the Democratic percent of the two-party vote result to the Democratic percent of the two-party poll result. Data for all of these graphs come from pollster.com.
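The error metric described above can be sketched in a few lines of code. This is a minimal illustration, not the author's actual script; the function and variable names are my own, and the example numbers are hypothetical.

```python
# Hypothetical sketch of the two-party error metric described above.
# 'dem' and 'rep' are raw percentages for the two major-party candidates.

def two_party_dem_share(dem: float, rep: float) -> float:
    """Democratic percent of the two-party total, dropping third parties."""
    return 100 * dem / (dem + rep)

def polling_error(poll_dem: float, poll_rep: float,
                  vote_dem: float, vote_rep: float) -> float:
    """Signed error: the poll's two-party Democratic share
    minus the election result's two-party Democratic share."""
    return (two_party_dem_share(poll_dem, poll_rep)
            - two_party_dem_share(vote_dem, vote_rep))

# Illustrative example: a poll showing 48 D / 44 R, in a race
# the Democrat ultimately won 51-49, overstates the Democrat by
# about 1.2 points on the two-party metric.
print(round(polling_error(48, 44, 51, 49), 1))
```

Using the two-party share on both sides keeps the comparison consistent when polls and results differ in undecideds or third-party support.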
To be sure, there are differences across years and across level of office, but the overall picture is not one of wildly inaccurate polls. Some of the key findings are:
- The statewide polls in Senate and gubernatorial elections are generally more accurate than district-level House polls. House polls in 2008 had the highest error rate.
- Most polls (90% of Senate polls, 88% of gubernatorial polls, and 81% of House polls) are within five points of the actual outcome.
- Overall, the election outcomes fall within the margin of error of individual polls 79% of the time. Theoretically, if sampling error were the only source of error, this should happen 95% of the time.
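The margin-of-error check in the last bullet amounts to simple arithmetic. The sketch below shows the logic under my own assumptions (names and numbers are illustrative, not from the underlying dataset): a poll "covers" the outcome when the result lies within the poll's reported margin of error of its estimate.

```python
# Hypothetical sketch of the margin-of-error coverage check.
# All figures are two-party Democratic shares in percentage points;
# moe is the poll's reported margin of error.

def within_moe(poll_share: float, vote_share: float, moe: float) -> bool:
    """Does the election outcome fall inside the poll's margin of error?"""
    return abs(poll_share - vote_share) <= moe

# Coverage rate over a toy list of (poll_share, vote_share, moe) tuples.
polls = [(52.0, 50.0, 3.0),   # covered: off by 2 with a 3-point MoE
         (52.0, 48.0, 3.0),   # missed: off by 4 with a 3-point MoE
         (47.0, 46.5, 4.0),   # covered
         (55.0, 49.0, 3.5)]   # missed
coverage = sum(within_moe(p, v, m) for p, v, m in polls) / len(polls)
print(coverage)  # 0.5 for this toy sample
```

The reported 79% figure is this coverage rate computed over the real poll sample; the shortfall relative to the theoretical 95% reflects error sources beyond random sampling.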