Now that major media outlets are asking what went wrong in their coverage of the primaries over the past week, you've got to wonder: will they learn their lesson between now and February 5th?
Some disagree that the media even goofed at all. Take, for example, MSNBC's Chuck Todd, who, according to a recent San Francisco Chronicle story, said, "it's wrong to think media members have egg on their faces. They were just doing their jobs. All of the pre-election polls, including private polling conducted by Clinton's campaign, were pointing to a clear Obama victory." But even that seasoned veteran of the airwaves, Tom Brokaw, recently admitted that the media was wrong to let polls shape its election coverage.
But what, if anything, could have been done differently? Take a look at what my fellow citizen journalists at OffTheBus and many others at independent news organizations did over these past weeks. Feet on the ground (or maybe, boots in the snow), we covered events that received little to no coverage from many major news outlets. This happened not just this month, but over the course of the last several months. We watched, listened, and spoke face to face with voters in these early states and beyond. Because when it all comes down to it, it is the voters who ultimately decide it all. At least that's the ideal.
Unfortunately, what passes for time-tested journalism in 2008 is too often based on a much less ambitious method of reporting. Through my own experience covering the Democratic race over the past several months, I've done side-by-side comparisons of citizen journalists' and alternative news outlets' accounts with mainstream media coverage of the exact same event, and I'm left wondering: how could they be so drastically different?
Butch Ward wrote in a Jan. 8 article for the Poynter Institute that journalists, even professionals with years of experience, too often forget the basic premise of what it means to be a journalist:
Cover the news, don't predict it.
"Watching the cable channels cover Tuesday night's results, I was struck by how little anyone told me about why people in New Hampshire voted as they did. At one point, I heard the briefest of snippets on one channel that exit polls showed New Hampshire voters had been most concerned with the economy. And yes, I saw charts that told me Clinton had reclaimed much of the women's vote she had lost to Obama in Iowa.
But no one was telling me why."
"After almost a year of nonstop coverage, why can't someone tell me what the most important players in this election -- the voters -- are thinking?
Where's the journalism?"
Some claim there is unarguable journalistic value in reporting polls, since they provide just that: a record of what voters are saying. But potential voters are only sampled, and what happens if they never show up on caucus or election day? Then you've got trouble.
The American Association of Public Opinion Research describes polling as "a scientific process that attempts to capture information about individual attitudes and behaviors, both of which are subject to variation over time. Events following the conduct of a survey or poll can result in opinion and behavior changes."
So it appears that the very act of reading about the results of a poll, in, say, a news story, could influence those polled after they've already been questioned.

Before basing a story on a poll, then, there are some basic questions worth asking:
1. Who paid for the poll and why was it done?
2. Who did the poll?
3. How was the poll conducted?
4. How many people were interviewed and what's the margin of sampling error?
5. How were those people chosen? (Probability or non-probability sample? Random sampling? Non-random method?)
6. What area or what group were people chosen from? (That is, what was the population being represented?)
7. When were the interviews conducted?
8. How were the interviews conducted?
9. What questions were asked? Were they clearly worded, balanced and unbiased?
10. What order were the questions asked in? Could an earlier question influence the answer of a later question that is central to your story or the conclusions drawn?
11. Are the results based on the answers of all the people interviewed, or only a subset? If a subset, how many?
12. Were the data weighted, and if so, to what?
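The "margin of sampling error" in question 4 is worth making concrete. Here is a minimal sketch using the standard 95%-confidence formula for a simple random sample; the sample size below is a hypothetical illustration, and real phone polls only approximate simple random sampling:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error (as a fraction) for a proportion p
    estimated from a simple random sample of n respondents.
    p=0.5 gives the worst-case (largest) margin."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical pre-primary poll of roughly 600 likely voters (hypothetical):
moe = margin_of_error(600)
print(f"+/- {moe * 100:.1f} points")  # prints "+/- 4.0 points"
```

On numbers like these, a candidate's two- or three-point "lead" is inside the margin of error, which is exactly why a horse-race story built on such a poll can evaporate on election night.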
A second matter that complicated the media's N.H. coverage was the large turnout in New Hampshire itself, coupled with reporters who, too closely focused on polling data, failed to consider that possibility even after the Iowa caucuses showed it could be imminent.
Shane D'Aprile wrote Jan. 9 in Campaigns and Elections that "turnout in the Granite State yesterday was record-setting with more than 500,000 voters coming out to the polls - easily surpassing the average turnout in the state's presidential primary. Many of those voters are not in the normal universe of presidential primary voters, and thus most pollsters didn't anticipate them casting a vote yesterday."
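The likely-voter problem D'Aprile describes can be shown with toy numbers. In this hypothetical sketch (the candidates and percentages are invented, not New Hampshire data), even a perfectly accurate poll misleads when its screen excludes the first-time voters a record turnout brings in:

```python
# Past primary voters, whom a likely-voter screen reaches: B is ahead.
habitual = ["A"] * 45 + ["B"] * 55

# First-time voters drawn out by record turnout, whom the screen misses.
first_time = ["A"] * 65 + ["B"] * 35

# The poll samples only the habitual universe.
poll_share_A = habitual.count("A") / len(habitual)

# The actual record-turnout electorate includes both groups.
electorate = habitual + first_time
actual_share_A = electorate.count("A") / len(electorate)

print(f"poll: A at {poll_share_A:.0%}; result: A at {actual_share_A:.0%}")
# prints "poll: A at 45%; result: A at 55%"
```

The sampling here is flawless; the error is entirely in the definition of who counts as a likely voter, which is why no amount of methodological polish saves a poll of the wrong universe.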
A final and most obvious possibility to consider: human bias. We try to be objective, to be sure. But there is always some larger plan at hand. We judge, no matter how much we don't want to admit it. It makes for good reading when the judgment is well-reasoned and presented as just that: opinion. But, when a news story is based on a position of judgment that leaves out crucial facts, that's when the news can go terribly wrong.
And unfortunately, as John Edwards and the other candidates excluded from the current news trends can attest, what results is a near-complete media blackout.
While you wouldn't want to discount the opportunity for opinion and analysis, which is so valuable to a strong democracy, basing coverage on the results of a poll, failing to look beyond news trends, and not confronting one's own bias does a disservice to journalism. And as we've seen after New Hampshire, doing so can only come back to haunt us.