
Survey silly season is upon us once again

Survey silly season is here again. Before we use survey results as the foundation of our own decisions, we all need to understand what questions were asked, how they were asked, and who was selected to answer them.
Written by Dan Kusnetzky, Contributor

It appears to be the silly season once again. I'm getting message after message offering me the results of companies' surveys. It is clear that the PR folks are hoping that I'll comment on these surveys and help the sponsor make its point. Unfortunately, most of these surveys were not constructed or executed very well. In the end, the sponsor looks silly and takes a big hit on its reputation.

Here are some recent examples:

  • A supplier touted that a recent survey supported the adoption of a feature of one of its products. When I asked about the survey sample, I was informed that well over 90 of that company's own customers took the time to complete the survey at its own customer conference. The results were presented as if they had worldwide significance rather than merely reflecting the preferences of a few.
  • Another survey made strong predictions about market purchasing decisions. When I probed about the respondent base, I was informed that the comments of over 400 "IT specialists" were considered. When I asked how many of them were actually decision makers within their own companies, I learned that only a very small percentage of the respondents fit that category. Will their organizations really pay attention to those opinions? It is not at all clear.
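To put those sample sizes in perspective, here is a minimal sketch (my own illustration, not from either survey) of the standard margin-of-error formula for a proportion. Note the assumption baked in: it only holds for a simple random sample of the target population, so it actually flatters convenience samples like a vendor's conference attendees, where selection bias dominates and no formula helps.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion, assuming a simple random sample.

    p=0.5 is the worst case (widest interval); z=1.96 is the 95% critical value.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Even under ideal random sampling, ~400 respondents give roughly a
# +/- 5 percentage-point margin:
print(round(margin_of_error(400) * 100, 1))  # 4.9

# A 90-person sample is noisier still, and a self-selected one at a
# vendor's own conference is biased in ways this number cannot capture:
print(round(margin_of_error(90) * 100, 1))   # 10.3
```

The point of the sketch is that sample size is the easy part; the hard part, and the one these surveys fail, is drawing the sample from the population the headline claims to describe.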

I could review a long list of recent vendor-sponsored surveys that have been presented to me, all of which were deeply flawed and clearly misleading. Most of these campaigns were somewhat successful anyway: the studies got mentioned somewhere in the media, even though they were neither reliable nor useful for understanding market opinions and behavior.

Rather than rushing to believe the results of these studies, it would be very wise to understand what questions were asked, how they were asked, and who was selected to answer them before we use the results of those surveys as the foundation of our own decisions.
