03/21/2011 05:52 pm ET Updated Dec 06, 2017

Four Market Research Mistakes to Avoid

Somebody in my family was recently sent a direct-mail survey from the National Republican Senatorial Committee (NRSC). The survey's purpose, according to the accompanying letter from NRSC Chairman Senator John Cornyn (R-TX), is to help Republicans in the Senate "fight for the interests and issues that matter to our grassroots base..."

That's a legitimate reason to put a survey in the field, but the questionnaire -- like so many others -- has a variety of flaws that prevent it from accurately collecting and reflecting the views of the respondents. Whether it's for political, business, or academic purposes, proper survey design should help market researchers reveal truths that enable better decision-making. With that in mind, I'd like to look at some of the flaws in this particular survey with the goal of demonstrating how surveys should actually be constructed.

To begin, look at the format of the following questions:


Note that the Very important/Somewhat important/Not important answer format doesn't allow for negative responses. You can either agree that the prompts represent issues that are important to you ("Very important" or "Somewhat important") or state that you are indifferent ("Not important"). There is no way for anybody to express disagreement with the party platform. Suppose you're a Republican voter who thinks we need amnesty for illegal immigrants. Answering "Not important" will not accurately record your opposition to that plank of the platform.

This entire section could have more accurately captured the opinions of the respondents if a five-point Likert scale had been used. For example, the immigration question could have been written as:

The Senate should stop passage of amnesty for illegal immigrants.
__ Strongly Agree
__ Somewhat Agree
__ Neither Agree Nor Disagree
__ Somewhat Disagree
__ Strongly Disagree
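To make the difference concrete, here's a minimal sketch in Python with entirely made-up responses from ten hypothetical voters. The three-point importance format forces opponents of the plank into the same "Not important" bucket as the genuinely indifferent, while the five-point Likert scale keeps direction and intensity separate:

```python
from collections import Counter

# Hypothetical responses from ten voters to the immigration item.
# Under the three-point format, an opponent of the plank has no
# honest choice except "Not important".
importance = ["Very important", "Somewhat important", "Not important",
              "Not important", "Very important", "Not important",
              "Somewhat important", "Not important", "Very important",
              "Not important"]

# The same ten voters on a five-point Likert scale: opponents
# ("Somewhat/Strongly Disagree") are now distinguishable from the
# genuinely indifferent ("Neither Agree Nor Disagree").
likert = ["Strongly Agree", "Somewhat Agree", "Strongly Disagree",
          "Somewhat Disagree", "Strongly Agree",
          "Neither Agree Nor Disagree", "Somewhat Agree",
          "Strongly Disagree", "Strongly Agree", "Somewhat Disagree"]

print(Counter(importance))  # five "Not important" -- but why?
print(Counter(likert))      # four opponents, one truly indifferent
```

In the first tally, five respondents look like they simply don't care; the second tally reveals that four of them actually oppose the plank. Same voters, different instrument.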

Of course, this suggestion assumes that the purpose of the survey is to accurately reflect the opinions of Republican voters. There are, however, some questions on the survey that suggest that may not be the case. For example, look at this item from the same section:


The prompt is worded to intentionally bias the reader. "ObamaCare" is a loaded term that was coined to evoke negative feelings about health care reform.

Here's the most blatant example of bias being built into the questions:


That's not a question -- it's a lecture. Are the creators of the survey anticipating that anybody will check the "No" box?

Here's an important rule to remember about survey design: never ask a question if the response will not influence your decision-making process. Otherwise, the data you collect will be useless. This survey has many questions where the responses are unlikely to have any impact on how the NRSC behaves, either in its legislative agenda or its campaign marketing strategy. So why did they ask that last question? Just as some earlier questions were meant to remind readers how they should feel about key issues, this final question is meant to act as a pledge. The survey is asking you to confirm your party loyalty.

It should come as no surprise that this pledge (disguised as a survey question) is used to segue to the final section of the document:


It may look like a survey -- but this mailing is really a fundraising effort. Combining market research and fundraising into a single mailing is an ethically questionable practice that violates the trust of the participants. When people choose to complete a survey, they believe that they are helping the researcher and having their voice heard. If the survey turns out to be nothing more than a fundraising campaign, then the participants' time and goodwill have been wasted.

Hypothetically, if this poll really did have the dual purpose of collecting data and raising funds, what effect would the donation or fee requirement have on the results? The survey's stated purpose is to take the pulse of all Republicans, but the fundraising component ensures that the sample represents only Republican donors. (There is no check box for completing the survey but not sending money.) This introduces a selection bias and corrupts the results.
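As a rough illustration of that bias -- a hypothetical simulation with invented numbers, not NRSC data -- suppose 30% of the mailing list opposes a given plank, but opponents are far less likely to mail the survey back with a check. Conditioning the sample on donating inflates the measured support:

```python
import random

random.seed(0)

POPULATION = 100_000
P_OPPOSE = 0.30            # assumed true share of opponents on the list
P_RETURN_SUPPORTER = 0.08  # supporters likelier to return survey + check
P_RETURN_OPPONENT = 0.01   # opponents rarely send money to the cause

returned = []  # one bool per returned survey: does respondent oppose?
for _ in range(POPULATION):
    opposes = random.random() < P_OPPOSE
    p_return = P_RETURN_OPPONENT if opposes else P_RETURN_SUPPORTER
    if random.random() < p_return:
        returned.append(opposes)

true_support = 1 - P_OPPOSE
measured_support = 1 - sum(returned) / len(returned)
print(f"true support on the list:      {true_support:.0%}")
print(f"measured support among donors: {measured_support:.0%}")
```

Under these assumed return rates, a list that is 70% supportive looks roughly 95% supportive once only donors are counted -- the sample measures donor enthusiasm, not Republican opinion.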

If you're a Democrat, I hope this case study doesn't make you feel superior: I'm sure there are progressive and liberal organizations that use similar fundraising tactics. If you're a Republican, I hope you're able to look past politics to see that the purpose of market research is to collect accurate data that represents your target population, not to persuade people. It doesn't matter whether a survey is being conducted by political fundraisers or by businesses -- the basic standards of responsible market research must remain absolute.