March 24, 2009

Surveys Less Than Truthful


By Tom Ryan

Ever accentuating the positive, people tend to exaggerate accomplishments,
downplay faults, or outright lie on surveys, according to a new study in
the Journal of Consumer Research.

“The tendency of people to portray themselves in a more favorable light
than their thoughts or actions, called ‘socially desirable responding,’ is
a problem that affects the validity of statistics and surveys worldwide,” said
Ashok K. Lalwani, assistant professor of marketing at the University of Texas
and author of the study, in a statement.

When asked about their own behavior in relation to materialism, compulsive
buying, drug and alcohol addiction, cigarette smoking, shoplifting, gambling,
prostitution, and intolerant attitudes, “people tend to answer in a
less than candid manner,” the study found.

But the researchers also defined two separate forms of “socially desirable
responding” based on a person’s cultural orientations. For example,
people from cultures that have a “collectivist orientation” (China,
Korea, India, Taiwan, Singapore, Japan) are more likely to engage in “impression
management,” which is “a deliberate, strategic presentation of
a socially approved image of the self.”

Impression management is
“a conscious, active and deliberate attempt to fake good behavior in
front of a real or imagined audience,” writes Prof. Lalwani. That need
to give the “right” answer can be reduced by keeping survey participants “cognitively
busy” by playing background music during surveys, he found.

In contrast, consumers with an “individualist cultural orientation” (the
U.S., Canada, France, U.K., Australia, Germany) are more likely to engage
in self-enhancement, which is
“a spontaneous tendency to present an internalized, unrealistically
positive view of the self.”

The professor claims that this behavior “is so unconscious that there
is little that can be done to curtail it.”

Discussion Questions: Do you think people’s tendencies to fabricate to
impress have a significant impact on survey results exploring consumer
behavior? How do you get more truthful answers or adjust survey results
for possible fabrications?

28 Comments
David Rich

Social scientists, at least those candid enough to admit to the failings of their profession, have always admitted that survey respondents’ answers to survey questions need to be carefully interpreted. Some respondents–hopefully–tell the truth (as far as they know or understand it), but other respondents will lie or fabricate answers either to confuse, mislead, impress or please the surveyor or surveying institution.

So the answer to the question is, of course their tendencies to fabricate have an influence on surveys of customer behavior. But, smart marketing researchers have built fail-safes into their questionnaires ever since they figured this out. First, researchers will attempt to somewhat mask their intention for asking questions. This neutralizes those out to confuse or please the surveyor. Next, researchers use procedures to identify or trap those telling lies so that their answers can be excluded from the final survey tally. But most of all, good researchers know the limitations of survey research and therefore are disciplined in the amount and type of information they ask respondents for. Even the most accurately collected survey responses are still only right up to some probability. The real challenge of survey research is to carefully interpret responses and to explain their limitations to the client who has commissioned the information gathering.

Mel Kleiman

I keep seeing the same comments presented over and over again in different ways. It really boils down to one question: is this a survey or is this research? Most of what we get are surveys that we take as research. I think most surveys are done to prove what the surveyor wants it to prove.

Gary Edwards, PhD

The article rightly points out the limitations of opinion research vis-à-vis collecting valid data from consumers. I believe it serves as a clear warning about the kinds of information for which one can reasonably expect honest answers. My experience has been that consumers will give fairly straightforward, candid opinions on what they like or dislike in a retail experience and on which products are more or less desirable.

The issue of providing socially desirable responses, which has been written about in the academic literature for some decades, surfaces as it always has when more sensitive areas are explored. A restaurant asking patrons to comment on portion sizes, or on their desire for the decadent choice versus the healthy one, is fraught with potentially invalid answers. All of these kinds of issues, where the socially desirable response is prone to counter the “truth” of the decisions consumers are known to make, should be approached with caution. It is in the hands of the researcher to pose questions and probe delicate issues so as to uncover the truth, no different in a survey than in any kind of social interaction we have.

Li McClelland

Like others here, I learned how to write poll/survey questions and to conduct surveys and focus groups while in grad school. Once you understand that almost all key questions are written with a bias (either purposeful or unintended), and that nearly all surveys are conducted and paid for with a specific marketing or cause-related purpose that presupposes a POV, it is hard in general to take survey results seriously. Further (as this article suggests), since people being surveyed may stretch the truth for social reasons, or may not have been selected for adequate relevance to the subject being polled (perhaps intentionally), well, Houston, we have a problem!

For all those reasons and more, this article purporting to have done “research” on surveys seems really silly.

Camille P. Schuster, Ph.D.

Validity of responses is always a concern in any kind of research. Just because methods are commonly used doesn’t mean that the validity of concepts, questions, and responses can be ignored. These concepts are critical to the validity of the results so presentations of results need to include a description of how validity was assessed.

Gene Detroyer

One always has to use surveys carefully. My colleagues are right with regard to how a survey is drafted. The way a question is asked oftentimes projects a specific answer. Just check out Lou Dobbs on CNN and you’ll see questions that 90% of the respondents can answer only one way. Not surprisingly, the answers are consistent with Mr. Dobbs’ position.

The other issue is the skew generated by the participants. Just the fact that someone answers a survey adds the issue that these people are different than the general population. How odd people must be who fill out diaries for a pittance of a gift? Forget about the behavioral bias noted in the article, just consider when YOU might or might not respond to a survey.

Perhaps, the best way to get information is to mine for it on the internet, though that also has bias. The best way to use it is as something directional but not definitive.

Dr. Stephen Needel

This is not a new finding–Rosenthal and Rosnow wrote about this in Artifact in Behavioral Research back in 1968. We’ve published a number of articles demonstrating this problem in packaging research. Fortunately, there are a number of research techniques that avoid this problem by observing behavior rather than asking questions–we use virtual reality shopping in our work to get around the self-presentation or social desirability problem.

Warren Thayer

Depends on the survey and how sensitive the info is that is being sought, of course. But it’s a real problem, and one reason why a good research firm is worth its weight in gold if it can get at “reality,” with actionable insights that can be used in predicting consumer buying behavior. Lots of smoke and mirrors out there, but few really good firms that can do this on any kind of regular basis.

David Biernbaum

Surveys are only as reliable as one’s ability to construct and conduct them. As a marketing professional in the consumer goods industry, I am extremely insistent that a relentless amount of thought and planning go into any effort to pursue a meaningful survey, with the objectives defined very clearly. I will also say that, as a marketing person, I would know how to conduct a very skilled survey to get exactly the end results my brand or my client desired. That’s why, before taking survey results seriously, I’m a huge fan of first exploring who the surveyor was and what his or her end goals might have been.

Len Lewis

How do you know everyone here isn’t lying right now?

Michael Tesler

In relation to retailing (not all marketing research), the vacuum of a focus group room, and the desire to get along and seem intelligent to a peer group frequently of the opposite sex, have virtually nothing to do with how people act in the multi-sensory environments that great stores provide, with the variety of attitudes consumers take on when shopping with friends, or with their emotional rather than intellectual reactions to products.

Doron Levy

It really does depend on how the survey is worded. I have seen some great questions that force the survey taker to think about an answer, reducing the chance of fabricating a response. I always tell marketers to look at things through the customer’s eyes. While facilitating some focus groups, I found that articulating a more detailed question tends to lead to a more detailed and thought-out response. Stay away from one-word answers and force the participant to think about their answer. ‘Does this mustard taste good?’ is not the best question for a survey, in my opinion.

Joan Treistman

Many years ago I was asked to indicate my preferred measure for estimating purchase interest. On a five-point scale, my client wanted to know, did I use the top box or the top two boxes as a barometer? My answer: I believe the bottom two boxes. It’s always been the case that you can trust what people say they DON’T like. Forecasting from “likes” is amorphous.
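The box measures she is comparing are simple tallies over the rating distribution. As a rough sketch (the function name and the ratings are hypothetical, not from any study cited here), on a 1-5 purchase-interest scale:

```python
from collections import Counter

def box_scores(ratings):
    """Summarize 5-point purchase-interest ratings (1 = definitely would
    not buy ... 5 = definitely would buy) into the common 'box' measures."""
    n = len(ratings)
    counts = Counter(ratings)  # missing rating values count as 0
    return {
        "top box": counts[5] / n,                         # strongest positive only
        "top two boxes": (counts[5] + counts[4]) / n,     # the usual optimism measure
        "bottom two boxes": (counts[1] + counts[2]) / n,  # the rejecters she trusts
    }

# Hypothetical ratings from ten respondents.
ratings = [5, 4, 4, 3, 3, 3, 2, 2, 1, 5]
print(box_scores(ratings))
# → {'top box': 0.2, 'top two boxes': 0.4, 'bottom two boxes': 0.3}
```

Her point, in these terms: the 0.3 in the bottom two boxes is the most credible number in the dictionary, because disliking carries no social reward.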

As a professional researcher, it’s my obligation to uncover the truth and that’s why methods of research and questionnaires have to be carefully developed. Too bad there are so many who believe anyone can provide a sound methodology, construct a questionnaire, conduct a survey or uncover truth in a focus group. You wouldn’t choose your doctor the same way. If your survey is less than truthful, look in the mirror and ask yourself why.

Phil Rubin

For the work we do on customer marketing–loyalty, relationship and partnership marketing–we often see a very strong correlation between reported data (i.e., answers on surveys) and actual data (e.g., shopper data). Part of this is a function of the audience, part is the study design and part of it is clearly the relationship that respondents have with the category and (if disclosed) the brands involved.

There is a large difference between surveys on social-related topics and those related to customer behavior. Increasingly, customers expect that companies (and especially their marketing departments) know more about them than they usually do, which might account for what we see as often very accurate responses.

Herb Sorensen, Ph.D.

The results of any survey present a “Picasso image” of the truth. For example, Picasso’s painting of a young woman is clearly recognizable as that of a woman. It may even accentuate or communicate features not readily noticeable until the artist has called attention to them. But even a photograph is an inadequate representation of the truth. Not many young men would confuse the photo they carry of a beloved with the beloved herself.

There is nothing wrong with the Picasso, or the photograph. The fault is with non-understanding of their relationship to reality, and the stupid pretense that they ARE reality.

Doug Pruden

Marc Gordon suggests that the only question that gets a totally honest answer is “How likely are you to recommend?” I wonder.

Prof. V. Kumar published research on research in the Harvard Business Review Online last fall demonstrating that that question isn’t delivering the whole truth either. In follow-up research with respondents from a financial services and a telecom study, he discovered that “While 68% of the financial services firm’s customers expressed their intention to refer the company to other people, only 33% followed through.” Even more dramatically, while “81% of a telecom firm’s clients thought they’d recommend the company, merely 30% actually did.”

What’s more, very few of those referrals, in either case, actually generated customers (14% at the financial services firm; 12% at the telecom company).

Maybe it’s not The Only Question we need to ask.

Jason Rushin

How ironic to have a survey on truth in surveys! 😉

Craig Sundstrom

I, myself, have filled out millions of surveys, and I’ve never exaggerated…even once!

Seriously though, researchers have long been aware of this (see comments above) and have come up with means of dealing with it (see also above). Journalists, propagandists and others with various ulterior motives, on the other hand, are a problem, often (excitedly) repeating nonsense verbatim.

Joel Warady

We believe that consumer research has changed forever. Asking questions, and getting responses will not provide you with the information that you need. You will be better served monitoring conversations between people, and listening to what they have to say in real life. Online social network sites can be extremely helpful in this area. And not only will the information be more truthful, but it will be available in a more expedient manner.

Surveys and Focus Groups really are yesterday’s tools. The use of Social Networking will replace these tools.

Dennis Price

I am with the consensus view on this. My daughter works casually in a market research call center, and the stories of how they get people to participate, and then how the juniors change the questions (that the researcher slaved over) at will, just because they think it makes more sense and don’t want to repeat themselves, are hilarious.

John Lofstock

I agree with Mel Kleiman. These surveys tend to have shades of George Costanza–responses rooted in what people want to believe rather than the truth. Costanza took it to a whole new level:

JERRY: What do you do?

GEORGE: I’m an architect.

JERRY: You’re an architect?

GEORGE: I’m not?

James Tenser

Of course the very first response bias that unavoidably plagues every marketing research study regardless of subject or intent is the response, “Yes, I agree to participate.”

Since all research samples are composed of people who say “yes,” we must accept from the outset that all our research results ignore the major part of the study universe that said “no.” The “no”-sayers may be quite different.

It may be argued, therefore, that socially desirable response bias is ensured by the survey sampling process itself. It’s not an artifact but an inherent feature of all survey research.

Now this may seem like a blanket indictment of sample-based research, but we recognize that well-designed studies do in fact bring useful insights–especially when many studies are done over time by different researchers who approach different samples using varying techniques. So long as we recognize that the findings are an approximation of the truth, not the actual truth, and make our decisions accordingly, we can live with some response bias.

The researchers or clients who believe their own results without limitations have fallen into the trap of “professionally desirable reporting bias.”

Marc Gordon

Asking different variations of the same question is a good way to filter out the “desirable responding” that many surveys suffer from.

From my own work developing surveys, the only question that seems to mean something of value while still getting a relatively honest response is “How likely are you to recommend us to your friends?”

Jonathan Marek

This is the second biggest problem with survey based research about consumer behavior. The biggest is that when confronted in a survey with something new, consumers often have no idea of what they will really do out in reality. Oh, they will tell you something. They just won’t do what they thought they would do. Unfortunately, consumer reaction to something new is generally exactly what marketers seek to understand.

BTW, many comments above seem to imply that good survey design can solve these problems. It can’t.

George Whalin

As we look at consumer studies and surveys I am always skeptical of what people say they are going to do at some point in the future. A very recent study says that 75% of participants say they have cut back on what they are buying. A majority of these same consumers are also saying they will maintain this approach to spending when the economy improves in the future.

I don’t believe it since the history of consumer spending shows that people have always begun to spend freely as confidence returns and money becomes available after an economic downturn.

Ben Ball

Len Lewis should get the RetailWire Jerry Lewis award!

There are established techniques for getting around this bias in survey methodology. Anyone who has taken any sort of psychological evaluation has found themselves wondering “wait a minute–didn’t I already answer this question?” The answer, of course, is yes you have–but probably in a slightly different wording. And you will see it again, and again, and again. The evaluator is looking for consistency across your answers, and I am told it works. (At least, that’s why they told me I didn’t get the job.)
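The consistency screen he describes can be sketched in a few lines. Everything below is hypothetical (the item pairings, the 1-point tolerance, the data); it just illustrates the idea of flagging respondents whose answers to reworded versions of the same question disagree:

```python
def consistency_flags(responses, paired_items, max_gap=1):
    """Flag respondents whose 1-5 ratings on reworded versions of the
    same question differ by more than max_gap points.

    responses    -- {respondent_id: {item_id: rating}}
    paired_items -- list of (item_a, item_b) tuples that are rewordings
    """
    flagged = set()
    for rid, answers in responses.items():
        for a, b in paired_items:
            if abs(answers[a] - answers[b]) > max_gap:
                flagged.add(rid)  # inconsistent: answers contradict
                break
    return flagged

# Hypothetical data: Q1/Q4 and Q2/Q5 are rewordings of each other.
responses = {
    "r1": {"Q1": 5, "Q4": 5, "Q2": 2, "Q5": 3},  # consistent
    "r2": {"Q1": 5, "Q4": 1, "Q2": 4, "Q5": 4},  # contradicts on Q1/Q4
}
print(consistency_flags(responses, [("Q1", "Q4"), ("Q2", "Q5")]))  # → {'r2'}
```

Flagged respondents can then be excluded from the tally or down-weighted, which is exactly the "identify or trap those telling lies" fail-safe David Rich mentions above.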

Few if any consumer surveys go to this much trouble and expense to try to identify and correct for bias. It just isn’t worth it as long as the primary use of the survey is to measure one group’s answers against another’s, essentially building in a calibration mechanism. As David Biernbaum said, any marketer who is paying attention has learned how to manipulate survey questions, focus groups and darned near any sort of research to increase the chance of what we euphemistically call a “supporting result.” What matters is how you use what you get.

Judy Deems

I totally agree that people tend to “embellish” on surveys. We are in the midst of a very sluggish economy and about the only thing left is pride. I am presently working as a Demo Rep in a huge “Box” store. I have had my eyes opened. I see this store being busier than any Retail Establishment I have worked in for the past 12 years. I see every class of people looking for the best value. The flow of customers overwhelms me. People are receptive to the products I demo and are eager to learn the features and benefits. They are hungry for answers. We all want to be something we aren’t and surveys give people a chance to do exactly that. I am sure this is taken into consideration and the good and bad are thrown out and the middle ground is the accepted norm.

Robert Heiblim

This is a fact well known by researchers for some time. On the other hand, many users of research may not be as aware of this factor. Without realizing this, companies may draw quite wrong conclusions, but keep in mind that research is in and of itself a business and as with all business, it is built on giving customers “what they want.” This factor is a key to why actual patterns often are far different from what “research” may point to.

Other comments here make the good point that proper design of questions is important. So are filtering the results and applying models for the emotional context of the panel. There are several good companies working in this space, and my direct experience is that they give much different and often more reliable results. At the least, users of research should be open to these alternative views. Users beware, as all survey data draws results from…statistics, and this too is a dismal science.

28 Comments
Oldest
Newest Most Voted
Inline Feedbacks
View all comments
David Rich
David Rich

Social scientists, at least those candid enough to admit to the failings of their profession, have always admitted that survey respondents’ answers to survey questions need to be carefully interpreted. Some respondents–hopefully–tell the truth (as far as they know or understand it), but other respondents will lie or fabricate answers either to confuse, mislead, impress or please the surveyor or surveying institution.

So the answer to the question is, of course their tendencies to fabricate have an influence on surveys of customer behavior. But, smart marketing researchers have built fail-safes into their questionnaires ever since they figured this out. First, researchers will attempt to somewhat mask their intention for asking questions. This neutralizes those out to confuse or please the surveyor. Next, researchers use procedures to identify or trap those telling lies so that their answers can be excluded from the final survey tally. But most of all, good researchers know the limitations of survey research and therefore are disciplined in the amount and type of information they ask respondents for. Even the most accurately collected survey responses are still only right up to some probability. The real challenge of survey research is to carefully interpret responses and to explain their limitations to the client who has commissioned the information gathering.

Mel Kleiman
Mel Kleiman

I keep seeing the same comments presented over and over again in different ways. It really boils down to one question: is this a survey or is this research? Most of what we get are surveys that we take as research. I think most surveys are done to prove what the surveyor wants it to prove.

Gary Edwards, PhD
Gary Edwards, PhD

The article rightly points out the limitations of opinion research vis a vis collecting valid data from consumers. I believe it serves as a clear warning as to what kind of information one can reasonably expect honest answers. My experience has been consumers will give fairly straightforward, candid opinions on what they like or dislike in a retail experience and on which product are more or less desirable.

The issue of providing socially desirable responses, which has been written of in the academic literature for some decades, surfaces as it always has when more sensitive areas are explored. A restaurant challenging patrons to comment on portion sizes for food, or on their desire for what decadent food beckons versus the healthy choice for them to make is fraught with potential invalid answers. All of these kinds of issues where the socially desirable response is prone to counter the “truth” of the decisions consumers are known to make should be approached with caution. It is in the hands of the researcher to pose questions and probe delicate issues as to uncover the truth, no different in a survey than in any kind of social interaction we have.

Li McClelland
Li McClelland

Like others here, I learned how to write poll/survey questions and to conduct surveys and focus groups while in grad school. Once understanding that almost all key questions are written with a bias (either purposeful or unintended), and that nearly all surveys are conducted and paid for with a specific marketing or cause-related purpose which presupposes a POV, then in general it is hard to take survey results seriously. Further, (as this article suggests) since people being surveyed may stretch the truth for social reasons or may not have been selected for adequate relevance to the subject being polled (perhaps intentionally) well, Houston we have a problem!

For all those reasons and more, this article purporting to have done “research” on surveys seems really silly.

Camille P. Schuster, Ph.D.
Camille P. Schuster, Ph.D.

Validity of responses is always a concern in any kind of research. Just because methods are commonly used doesn’t mean that the validity of concepts, questions, and responses can be ignored. These concepts are critical to the validity of the results so presentations of results need to include a description of how validity was assessed.

Gene Detroyer

One always has to use surveys carefully. My colleagues are right with regard to how a survey is drafted. The way a question is asked often times projects a specific answer. Just check out Lou Dobbs on CNN and you’ll see questions that 90% of the respondents can answer only one way. Not surprisingly, the answers are consistent with Mr. Dobbs’ position.

The other issue is the skew generated by the participants. Just the fact that someone answers a survey adds the issue that these people are different than the general population. How odd people must be who fill out diaries for a pittance of a gift? Forget about the behavioral bias noted in the article, just consider when YOU might or might not respond to a survey.

Perhaps, the best way to get information is to mine for it on the internet, though that also has bias. The best way to use it is as something directional but not definitive.

Dr. Stephen Needel

This is not a new finding–Rosenthal and Rosnow wrote about this in Artifact in Behavioral Research back in 1968. We’ve published a number of articles demonstrating this problem in packaging research. Fortunately, there are a number of research techniques that avoid this problem by observing behavior rather than asking questions–we use virtual reality shopping in our work to get around the self-presentation or social desirability problem.

Warren Thayer

Depends on the survey and how sensitive the info is that is being sought, of course. But it’s a real problem, and one reason why a good research firm is worth its weight in gold if it can get at “reality,” with actionable insights that can be used in predicting consumer buying behavior. Lots of smoke and mirrors out there, but few really good firms that can do this on any kind of regular basis.

David Biernbaum

Surveys are as reliable only as one’s ability to construct them, and conduct them. As a marketing professional in the consumer goods industry I am extremely insistent that a relentless amount of thought and planning go in to any efforts to pursue a meaningful survey with the objectives defined very clearly. I will also say that as a marketing person, I would know how to conduct a very skilled survey to get exactly the end-results my brand, or my client so desired, and that’s why, before taking survey results seriously, I’m a huge fan of first exploring who was the surveyor and what might have been his or her end-result goals?

Len Lewis
Len Lewis

How do you know everyone here isn’t lying right now?

Michael Tesler
Michael Tesler

In relation to retailing (not all marketing research) the vacuum of a focus group room and the desire to get along and seem intelligent to a peer group frequently of the opposite sex has virtually nothing to do with how people act in the multi-sensory environments that great stores provide and the variety of attitudes consumers take on when shopping with friends and their emotional rather than intelligent reactions to products.

Doron Levy
Doron Levy

It really does depend on how the survey is worded. I have seen some great questions that force the survey taker to think about an answer reducing the chance of fabricating a response. I always tell marketers to look at things from the customer’s eyes. While facilitating some focus groups, I found that articulating a more detailed question tends to lead to a more detailed and thought-out response. Stay away from one word answers and force the participant to think about their answer. ‘Does this mustard taste good?’ is not the best question for a survey, in my opinion.

Joan Treistman
Joan Treistman

Many years ago I was asked to indicate my preferred measure for estimating purchase interest. On a five point scale, my client wanted to know, did I use top box or top two boxes as a barometer? My answer: I believe the bottom two boxes. It’s always been the case that you can trust what people say they DON’T Like. Forecasting from “likes” is amorphous.

As a professional researcher, it’s my obligation to uncover the truth and that’s why methods of research and questionnaires have to be carefully developed. Too bad there are so many who believe anyone can provide a sound methodology, construct a questionnaire, conduct a survey or uncover truth in a focus group. You wouldn’t choose your doctor the same way. If your survey is less than truthful, look in the mirror and ask yourself why.

Phil Rubin
Phil Rubin

For the work we do on customer marketing–loyalty, relationship and partnership marketing–we often see a very strong correlation between reported data (i.e., answers on surveys) and actual data (e.g., shopper data). Part of this is a function of the audience, part is the study design and part of it is clearly the relationship that respondents have with the category and (if disclosed) the brands involved.

There is a large difference between surveys on social-related topics and those related to customer behavior. Increasingly, customers expect that companies (and especially their marketing departments) know more about them than they usually do, which might account for what we see as often very accurate responses.

Herb Sorensen, Ph.D.
Herb Sorensen, Ph.D.

The results of any survey present a “Picasso image” of the truth. For example, Picasso’s painting of a young woman is clearly recognizable as that of a woman. It may even accentuate or communicate features not readily noticeable, until the artist has called attention to them. But even a photograph is an inadequate representation of the truth. Not many young men would confuse the photo they carry of a beloved as the beloved themselves.

There is nothing wrong with the Picasso, or the photograph. The fault is with non-understanding of their relationship to reality, and the stupid pretense that they ARE reality.

Doug Pruden
Doug Pruden

Marc Gordon suggests that the only question that gets a totally honest answer is “how likely are you to recommend?”. I wonder.

Prof V. Kumar published research on research in the Harvard Business Review Online last Fall demonstrating that that question isn’t delivering the whole truth either. In follow-up research with respondents from a financial services and a telecom study he discovered that “While 68% of the financial services firm’s customers expressed their intention to refer the company to other people, only 33% followed through.” Even more dramatically, while “81% of a telecom firm’s clients thought they’d recommend the company, merely 30% actually did.”

What’s more, very few of those referrals, in either case, actually generated customers (14% at the financial services firm; 12% at the telecom company).

Maybe it’s not The Only Question we need to ask.

Jason Rushin
Jason Rushin

How ironic to have a survey on truth in surveys! 😉

Craig Sundstrom
Craig Sundstrom

I, myself have filled out millions of surveys, and I’ve never exaggerated…even once!)

Seriously though, researchers have long been aware of this (see comments above) and have come up with means of dealing with it (see also above). Journalists, propagandists and others with various ulterior motives, on the other hand, are a problem, often (excitingly) repeating nonsense verbatim.

Joel Warady
Joel Warady

We believe that consumer research has changed forever. Asking questions, and getting responses will not provide you with the information that you need. You will be better served monitoring conversations between people, and listening to what they have to say in real life. Online social network sites can be extremely helpful in this area. And not only will the information be more truthful, but it will be available in a more expedient manner.

Surveys and focus groups really are yesterday’s tools; social networking will replace them.

Dennis Price

I am with the consensus view on this. My daughter works casually in a market research call center, and the stories are hilarious: how they get people to participate, and then how the juniors change the questions (that the researcher slaved over) AT WILL, just because they think it makes more sense and don’t want to repeat themselves.

John Lofstock

I agree with Mel Kleiman. These surveys tend to have shades of George Costanza–responses rooted in what people want to believe rather than the truth. Costanza took it to a whole new level:

JERRY: What do you do?

GEORGE: I’m an architect.

JERRY: You’re an architect?

GEORGE: I’m not?

James Tenser

Of course the very first response bias that unavoidably plagues every marketing research study regardless of subject or intent is the response, “Yes, I agree to participate.”

Since all research samples are composed of people who say “yes,” we must accept from the outset that all our research results ignore the major part of the study universe that said “no.” The “no”-sayers may be quite different from those who agreed to participate.

It may be argued, therefore, that socially desirable response bias is ensured by the survey sampling process itself. It’s not an artifact but an inherent feature of all survey research.

Now this may seem like a blanket indictment of sample-based research, but we recognize that well-designed studies do in fact bring useful insights–especially when many studies are done over time by different researchers who approach different samples using varying techniques. So long as we recognize that the findings are an approximation of the truth, not the actual truth, and make our decisions accordingly, we can live with some response bias.

The researchers or clients who believe their own results without limitations have fallen into the trap of “professionally desirable reporting bias.”

Marc Gordon

Asking different variations of the same question is a good way to filter out the “desirable responding” that many surveys suffer from.

From my own work developing surveys, the only question that seems to mean something of value while still getting a relatively honest response is “How likely are you to recommend us to your friends?”

Jonathan Marek

This is the second biggest problem with survey based research about consumer behavior. The biggest is that when confronted in a survey with something new, consumers often have no idea of what they will really do out in reality. Oh, they will tell you something. They just won’t do what they thought they would do. Unfortunately, consumer reaction to something new is generally exactly what marketers seek to understand.

BTW, many comments above seem to imply that good survey design can solve these problems. It can’t.

George Whalin

As we look at consumer studies and surveys, I am always skeptical of what people say they are going to do at some point in the future. A very recent study reports that 75% of participants say they have cut back on what they are buying. A majority of these same consumers also say they will maintain this approach to spending when the economy improves.

I don’t believe it. The history of consumer spending shows that people have always begun to spend freely once confidence returns and money becomes available after an economic downturn.

Ben Ball

Len Lewis should get the RetailWire Jerry Lewis award!

There are established techniques for getting around this bias in survey methodology. Anyone who has taken any sort of psychological evaluation has found themselves wondering, “wait a minute–didn’t I already answer this question?” The answer, of course, is yes, you have–but probably in slightly different wording. And you will see it again, and again, and again. The evaluator is looking for consistency across your answers, and I am told it works. (At least, that’s why they told me I didn’t get the job.)

Few if any consumer surveys go to this much trouble and expense to try to identify and correct for bias. It just isn’t worth it as long as the primary use of the survey is to measure one group’s answers against another–essentially building in a calibration mechanism. As David Bierbaum said, any marketer who is paying attention has learned how to manipulate survey questions, focus groups and darned near any sort of research to increase the chance of what we euphemistically call a “supporting result.” What matters is how you use what you get.

Judy Deems

I totally agree that people tend to “embellish” on surveys. We are in the midst of a very sluggish economy and about the only thing left is pride. I am presently working as a Demo Rep in a huge “Box” store. I have had my eyes opened. I see this store being busier than any Retail Establishment I have worked in for the past 12 years. I see every class of people looking for the best value. The flow of customers overwhelms me. People are receptive to the products I demo and are eager to learn the features and benefits. They are hungry for answers. We all want to be something we aren’t and surveys give people a chance to do exactly that. I am sure this is taken into consideration and the good and bad are thrown out and the middle ground is the accepted norm.

Robert Heiblim

This has been well known to researchers for some time. On the other hand, many users of research may not be as aware of this factor, and without realizing it, companies may draw quite wrong conclusions. Keep in mind that research is in and of itself a business, and as with all business, it is built on giving customers “what they want.” This factor is a key reason why actual patterns often are far different from what “research” may point to.

Other comments here make the good point that proper design of questions is important. So are filtering results and applying models for the emotional context of the panel. There are several good companies working in this space, and my direct experience is that they give much different and often more reliable results. At the least, users of research should be open to these alternative views. Users beware: all survey data draws its results from…statistics, and this too is a dismal science.
