On Filled-in Questionnaires and the Clinton Pollsters

08/12/2008 04:25 pm ET | Updated May 25, 2011

I want to add one thought to the chorus of commentary on Josh Green's Atlantic Monthly article on the Hillary Clinton campaign, based on a remarkable collection of emails and memoranda he obtained from sources within the campaign. It concerns the first sentence of an April 25 email from newly installed pollster Geoff Garin to the Clinton high command:

Attached is the filled in questionnaire from the North Carolina survey.

Those ten words probably seem utterly mundane to the ordinary reader, even to the ordinary campaign consultant. Pollsters share results with their clients; it's a basic part of the job. Notice also that Garin sent his email at 7:25 a.m. on a Friday morning. The timing and content imply that he was sharing the most critical "top line" results of a tracking survey that had completed calling the night before.** Thus, this email shows Garin passing along results as soon as he had them, for review by other decision makers. Further analysis and internal discussion no doubt followed.

What makes Garin's ordinary act so remarkable is that Mark Penn, the original Clinton pollster and "chief strategist," rarely delivered a "filled in questionnaire" to the Clinton campaign's senior decision makers. I know this because I heard the story a few months ago from a Clinton staffer with first-hand knowledge of what Penn provided to the campaign (who agreed to share the story on condition of anonymity). My source said that Penn would routinely brief strategy sessions without providing the complete results of the poll in advance. Instead, he would present whatever results best made his case (as exemplified by the smattering of numbers that appear in the Penn memoranda that accompany Green's article).

Perhaps Hillary and Bill Clinton received the full data, but the senior staff and consultants did not. Amazingly, I am told, Penn also initially refused to share the full cross-tabular reports (the reams of tables like this one showing results for every question by every subgroup of interest), even though sharing them is standard practice among campaign pollsters. It was not until relatively late in the campaign, at the insistence of then-campaign manager Patti Solis Doyle, that Penn relented, sharing a hard copy of the cross-tabs on condition that Solis Doyle keep it locked in a file cabinet in her office.

One can understand the temptation that a "chief strategist" might have to control the flow of data. If you are convinced you have the right strategy, and you make the final decision, why give others a tool to question your judgment?

The problem with that approach should be obvious. It poisons the environment within which functional campaigns privately hash out disagreements and reach consensus about strategy. The pollster's job in this process is to put the data on the table and to provide analysis and guidance about that data, but also to let other senior staffers examine and question it. When the pollster wears two hats -- pollster and "chief strategist" -- greater conflict, questioning of motives, and campaign "dysfunction" are inevitable.

**One reason I'm confident that this email followed within hours of the completion of calling is that one of the respondents later blogged about his experience (discussed here). The respondent reported having been called a night or two before Garin sent his email.