Since posting Lies, Damn Lies and Statistics on May 5th, I've received an interesting collection of back-channel emails from suppliers, other analysts and, as is sometimes the case, some outraged readers.
Some have asked for suggestions on better ways to present what they've learned through demand-side primary research. My response was usually "make it clear who sponsored the study, who you've spoken to, what you've counted and how you've counted it, and don't present far-reaching conclusions based upon a very small or narrowly focused sample." That is, don't survey 100 attendees of a cloud computing event and then report that the responses represent what all potential and actual users of cloud computing are thinking or planning.
A few other analysts reached out to defend their work. After speaking with them, I've come to the conclusion that most of them executed studies based upon reasonable processes, survey instruments and analysis, and then presented a reasonable set of conclusions. Their client, usually a supplier of products or services, on the other hand, took that measured analysis and presented something both self-serving and somewhat extreme. I've had that experience myself and can commiserate with them.
Although I was rather surprised, I got a few messages from people who responded quite negatively to my post. I want to know how they knew so much about my heritage, eating habits and driving style. (Could Facebook be at fault?) I guess they took my attempt at shining a light on some rather interesting industry practices as poking at them personally. I'm reminded of the old saws "if the shoe fits, wear it" and "if you want to forget all your other troubles, wear too-tight shoes."