Survey silly season is upon us once again
Summary: Survey silly season is here again. Before we use survey results as the foundation of our own decisions, we all need to understand what questions were asked, how they were asked, and who was selected to answer them.

TOPICS: Cloud

It appears to be silly season once again. I'm getting message after message offering me the results of companies' surveys. It is clear that the PR folks are hoping I'll comment on these surveys and help the sponsor make its point. Unfortunately, most of these surveys were not constructed or executed very well. In the end, the sponsor looks silly and takes a big hit to its reputation.

Here are some recent examples:

  • A supplier touted a recent survey as supporting the adoption of a feature of one of its products. When I asked about the survey sample, I was informed that just over 90 attendees, all of them the company's own customers, took the survey at its own customer conference. The results were presented as if they had worldwide significance rather than merely showing the preferences of a few.
  • Another survey made strong predictions about market purchasing decisions. When I probed into the base of respondents, I was informed that the comments of over 400 "IT specialists" were considered. When I asked how many of them were actually decision makers within their own companies, I learned that only a very small percentage of the respondents fit that category. Will their organizations really pay attention to those opinions? It is not at all clear.

I could review a long list of recent vendor-sponsored surveys that have been presented to me, all of which were deeply flawed and clearly misleading. Most of these companies were somewhat successful anyway: their studies were mentioned somewhere in the media, even though the research was neither reliable nor useful for understanding market opinions and behavior.

Rather than rushing to believe the results of these studies, it would be very wise to better understand what questions were asked in the survey, how those questions were asked, and who was selected to answer them before we use the results as the foundation of our own decisions.


About

Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. In his spare time, he's also the managing partner of Lux Sonus LLC, an investment firm.


Talkback

  • Very good advice.

    You could also add analyst shops that make mistakes and then quietly remove them without a word of apology.

    Or surveys published in another language whose English summaries have nothing to do with the data actually presented in those surveys.

    Or surveys that take a "liberal" approach to defining the meaning of words (like OS popularity vs. downloaded apps ....).

    But there are also examples of the opposite: entities that were caught cheating (or simply "renaming") have a hard time proving they have learned to do the job properly the next time they publish something. So some surveys get no attention because of the publisher's bad track record.
    przemoli