Sometimes the Magic Works...

"Sometimes the magic works," said Chief Daniel George in the 1970 classic flim Little Big Man, "and sometimes it doesn't." The same can be said about the loess regression trend lines we plot in our charts.

When we plot pre-election poll results from various pollsters on the same charts, the trend lines usually have the helpful characteristic of minimizing the impact of outlier results and pollsters with consistent "house effects" on the overall estimate. In other words, if one of five or ten pollsters produces a consistently different result, their results do not typically skew the overall average significantly so long as the timing of the various polls is more or less random.
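
For readers who want to tinker, here is a minimal sketch of the idea -- not our production code -- using Python's statsmodels lowess smoother on made-up poll numbers. The pollster names, dates, and percentages below are invented purely for illustration:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical poll data (invented for illustration): one row per poll.
polls = pd.DataFrame({
    "pollster": ["Gallup", "Rasmussen", "CBS/NYT", "DailyKos/R2000", "Pew",
                 "Rasmussen", "Quinnipiac", "DailyKos/R2000", "Rasmussen",
                 "ABC/Post", "DailyKos/R2000", "NBC/WSJ", "Rasmussen",
                 "DailyKos/R2000"],
    "days": [3, 6, 9, 12, 15, 17, 20, 23, 27, 30, 33, 36, 38, 41],  # days since inauguration
    "wrong_track": [52, 58, 51, 56, 53, 59, 52, 57, 60, 53, 58, 52, 61, 57],
})

# frac is the share of the data used in each local fit; a larger frac means a smoother line.
trend = sm.nonparametric.lowess(polls["wrong_track"], polls["days"], frac=0.6)
print(trend)  # columns: [days, smoothed "wrong track" estimate]
```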

But for some of the national measures we have been plotting recently -- especially Obama's job and favorable ratings and the question about whether Americans perceive things to be "headed in the right direction" or "off on the wrong track" -- a few pollsters that do daily or weekly tracking are producing results with large house effects. Unfortunately that combination, along with the more sporadic timing of other national surveys, is producing the appearance of trends on some charts that are not really trends.

Last night, for example, Andrew Sullivan linked to two charts that appear to show trends in recent weeks: An uptick in the unfavorable rating for Obama and an increase in the percentage saying that things are off on the wrong track. In both cases, unfortunately, the apparent trends are an artifact of timing and house effects.

Let me explain, starting with the right direction/wrong track chart, which follows. (I am using screen shots rather than our live-embedded version here to preserve the look of the chart at the time of this writing -- follow the link to the live chart to use the filter tools yourself):

What Sullivan noticed was the recent uptick in the red line (wrong track) and downturn in the black line (right direction) at the far right (or "nose") of the trend. Now look what happens when we use our filter tool to remove from the trend the two pollsters -- Rasmussen Reports and DailyKos/Research2000 -- whose weekly tracking results provide nearly half (41 of 96) of the polls plotted in this chart so far during 2009. The recent trend disappears, producing an essentially flat line since mid-April:

So removing just two pollsters -- and particularly the two that contributed all four of the polls released in the last two weeks -- eliminates the apparent trend. One problem we have is that these two pollsters release weekly tracks, while the others poll more sporadically. Worse, virtually all of the national pollsters released surveys just before the Obama administration reached its 100th day in office, and we have experienced something of a poll drought since.
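
The filtering itself is nothing exotic. Continuing with the hypothetical `polls` table from the sketch above, removing the two weekly trackers and refitting might look roughly like this:

```python
# Drop the two weekly trackers and refit, roughly mimicking the chart's filter tool.
trackers = {"Rasmussen", "DailyKos/R2000"}
others = polls[~polls["pollster"].isin(trackers)]

trend_without_trackers = sm.nonparametric.lowess(
    others["wrong_track"], others["days"], frac=0.6
)
```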

But wait. Perhaps those two weekly tracks are catching a more recent trend that we might miss if we rely (for the moment) on the other national tracking surveys that have not produced more surveys in the last few weeks.

To check, let's use the filter tool to select only the surveys from Rasmussen and DailyKos/Research 2000. And just to be safe, I will also turn up the smoothing setting to be especially sensitive to any recent trend:
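
In code, the rough equivalent -- again on the hypothetical data from the first sketch -- is to keep only the two trackers and shrink the lowess frac so each local fit leans more heavily on nearby points, which is roughly what turning the chart's smoothing setting toward "more sensitive" does:

```python
# Keep only the two weekly trackers and use a smaller frac so the fitted line
# responds more to the most recent polls.
only_trackers = polls[polls["pollster"].isin(trackers)]

trend_trackers_only = sm.nonparametric.lowess(
    only_trackers["wrong_track"], only_trackers["days"], frac=0.4
)
```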

The trend is almost exactly the same as in the version with these pollsters removed, but you can also see that the gap between wrong track and right direction is larger on this second chart of just Rasmussen and Research 2000 (11 points) than on the previous chart excluding those two (4 points), with virtually all of the "house effect" coming from the Rasmussen survey.

So when we look at only the weekly trackers or only the other polls separately, we see flat lines over the last few weeks. When we put them together, we see a recent upward movement on "wrong track." Why? Because, when the polls are combined, the weekly trackers drive the "nose" of the trend line, and the trackers -- especially the Rasmussen track -- produce consistently different results. So as the Rasmussen results gain more influence in the trend line, they tend to drive the red line up and the black line down.
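
A toy simulation makes the mechanism concrete. In the sketch below (my own invented numbers, not our data), the underlying "wrong track" figure is perfectly flat, but after a certain date only a pollster with a constant +5 point house effect reports -- and the nose of the fitted line drifts upward anyway:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
days = np.arange(60)

# Flat underlying opinion; after day 45 only the high-house-effect tracker polls.
house_effect = np.where(days > 45, 5.0, 0.0)
observed = 54.0 + house_effect + rng.normal(0, 1.5, days.size)

fit = sm.nonparametric.lowess(observed, days, frac=0.5)
print(round(fit[-1, 1] - fit[0, 1], 1))  # the "nose" rises even though the truth is flat
```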

Now let's repeat the exercise with the Obama favorable rating. First, the standard chart showing all surveys. The recent apparent trend is the sharp upward movement on the red "unfavorable" line:

In this case, the Rasmussen and Daily Kos/Research2000 results are six of the seven surveys conducted in the month of May (the new Gallup result was added this morning, after Sullivan's initial post). If we use our filter tool to remove the weekly trackers, the apparent recent change smooths out, reflecting the more gradual increase in Obama's unfavorable rating since the inauguration:

Again, are the trackers picking up a more recent trend that the other national surveys are missing? Here is what the chart looks like if we include only the Rasmussen and DailyKos/Research2000 polls. Here, we see virtually no trend since late March:

The last chart above also clearly shows the enormous house effect separating (in this case) Rasmussen and DailyKos/Research 2000 surveys, with Rasmussen producing consistently lower favorable and higher unfavorable ratings for Obama.

We have discussed the "why" of house effects, especially the consistent differences in the Rasmussen tracking, in previous posts. This case involves something a little more troubling for us: The way house effects and timing have combined to produce misleading "trends" that are more artifact than real. That is something we need to address in a systematic way.

Update: At the suggestion of a reader, Andrew Sullivan removed only the Rasmussen surveys, with similar results to what I obtained above.
