
Notes from Research and Relationships session at Supernova

Written by Mitch Ratcliffe, Contributor

The following is heavily paraphrased....

Max Kalehoff of Nielsen BuzzMetrics: 10 Major Disruptions in Measurement

1. Online and digital data changes the way we collect and manage market data.

2. Attention Erosion, Research Aversion: just don't bother me.

3. Speed of Measurement Increases, a competitive issue.

4. Data is commoditized and democratized.

5. Passive behavioral and attention measurement grows more pervasive.

6. Measurement and analysis of unstructured data is increasingly important and challenging.

7. Consumer-centric measurement and planning for marketers.

8. Quality is coming back.

9. Data integration comes of age.

10. Attention-data ownership is a big unanswered question.

Ellen Konar, of Google. Google's perspective is to let customers define products rather than define them for customers. Launch betas and watch feedback--both verbal and behavioral.

Amy Shenkan, McKinsey & Co., works with many different companies on market research. It is still early days in terms of selling and gathering information on the Internet.

Aaron Coldiron, Microsoft (Windows Group). Leads word of mouth marketing efforts, including blogs and outreach in communities around the Net. Working to improve metrics that can be used to improve Microsoft products.

Ellen Konar: (Remember, heavily paraphrasing) In the old days we spent a lot of time eliciting feedback, but today it is more a matter of processing the overwhelming inflow of feedback. It's harder because there is no framework for the data; a framework has to be imposed on the feedback. But it is a richer time, because there is so much real-world data to work with.

Amy Shenkan: Because our clients are now much more broadly engaged with the market, in "participatory marketing," we are trying to figure out how to use communities of users to do advertising, marketing and customer service. Consider how Netflix is using a $1M prize to solicit solutions for improving its recommendation algorithms. It lets them improve the product faster, in an organic way. And they get great press for doing it. So, how do you think about doing this kind of thing in a programmatic way?

Aaron Coldiron: At Microsoft, we're doing some of [what has been discussed], but we're just getting underway. The control of your brand is out of your hands, because people are talking about you [he is acknowledging that brand cannot be defined and managed in isolation, as it used to].

Amy Shenkan: Whatever your brand promise is, what you deliver has to be very close [to that promise]. Cites the Kryptonite lock crisis, where fact belied the brand promise of security for one's bike.

Question: So, how do you systematize this process and what do you lose if you do systematize?

Amy Shenkan: Obviously, you can't control what people say about you, which raises the bar with regard to keeping your promise. That's what scares the big branded companies. You can't get away with [bad things] anymore.

Aaron Coldiron: At Microsoft, we offered everyone a blog. It was scary, but we learned that when you connect engineers to customers, the products get a lot better. He says they also broke the barriers to human relationships, so that Microsoft didn't seem so impersonal and monolithic.

How does Microsoft justify these investments?

Qualitative and quantitative measures. Total volume, tone of discussion. Did we help move the discussion forward? How did we move those messages into the mainstream media? The qualitative measures were in how customers are talking about our products. He said something about people's feelings about Microsoft's progress on the Zune, but it was a muddled thought; it sounded like, in some meetings, execs quote what people say in blogs to support their arguments. Not clear.

Max Kalehoff: Some of my customers are very metrics focused, but where do these quantitative and qualitative insights come together?

Ellen Konar: Tracking sales or market share is watching what is occurring, but it doesn't give you any guidance about what to do next. The kinds of things that tie changes in the market to actual interactions with the customer, or to business decisions--those can be extracted from quantitative information to motivate the company to continue improvements [that it is learning are working].

The Pentium bug era was the first time the Net played a role in my research. Where it got interesting was learning from the Net how people were deciding to place blame, deciding to put up with the problem because it wouldn't affect their work, etc.

Amy Shenkan: Making sure silos within organizations are talking and sharing skills is more important than ever, and it has always been very important.

Aaron Coldiron: We're trying to elevate general community information, but we have focused community efforts, too, such as the MVP program in the developer group. We split apart our efforts and try to tie them back together. You don't get a lot of consistency across organizations, but it is working.

Ellen Konar: Google is really different. I had this communicate-up idea about this data when I joined, but I found that all this information is just publicly accessible. It occurred to me that Google is really run like a peer-to-peer network. My contacts in any one week are across lots of different groups, there is no structure [you'd recognize]. It sounds like structure grows out of opportunities inside the company--growing so fast that it is potentially too early to say this is going to be the enduring quality of the company.

Amy Shenkan: Does it work?

Ellen Konar: That's the question, does this work? Some days I think that it doesn't, then the next day it seems like it works incredibly. Stuff that looks like insights travels, data doesn't. Data doesn't travel fast and furious, but true insights do.

Some of it is due to editing, but we have dashboards everyplace. There are more dashboards than you can imagine and not a lot of structure. It does seem to me that people must have alerts on, about specific topics. It is a combination of alerts and this peer-to-peer network.

Aaron Coldiron: Microsoft is actually fairly flat, despite the number of people. In research, there is a lot of duplicative effort.

Me: What we're talking about is distributed leadership, rather than waiting to follow leaders on specific topics.

Shannon Clark (in the audience): The defaults in an organization have a lot to do with what level of information is shared. I assume the default in Google is public.

Aaron Coldiron: You might get alerts on 15 different things, but the manager will set the priorities.

Phil Hughes (in the audience): How much research are your companies now outsourcing by putting raw data outside the firewall?

Max Kalehoff: Coming from the agency/analysis side, marrying outside data with internal is key (but he is talking about bringing the data in, rather than exposing it for public riffing).

Ellen Konar: I think what you are asking is "can the world develop these insights?" Google is constantly testing the boundaries of that. For example, Google Trends--I am sure you'll see it evolve. You can see trends, not absolute values or the numerics, but... you can track trends over time, such as Barbie versus Bratz. There is enormous information in [that]. It's fairly clear that market research is becoming less important as a specialty. You can see that searches are coming from India, rather than the United States. You can do a lot of early analysis of trends based on search data. AdWords research is another example. That's valuable data.
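
(A rough illustration, not something shown in the session: the relative, trend-over-time comparison Konar describes can be sketched with the unofficial pytrends package, a third-party wrapper around Google Trends. The package, the timeframe and the country breakdown are my assumptions; the Barbie-versus-Bratz comparison is hers.)

    # Sketch only: pytrends is an unofficial, third-party Google Trends wrapper
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=360)

    # Relative search interest (scaled 0-100), not absolute query counts
    pytrends.build_payload(kw_list=["Barbie", "Bratz"], timeframe="today 5-y")
    interest = pytrends.interest_over_time()
    print(interest.tail())

    # The same interest broken down by country, e.g. to see where searches originate
    by_region = pytrends.interest_by_region(resolution="COUNTRY")
    print(by_region.sort_values("Barbie", ascending=False).head(10))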

Me: But this is still not relying on community research, it is enabling research that takes place and remains outside the company.

There's more, but I have to take a conference call now....
