Microsoft's Scoble questions ZDNet's agenda

Summary: In his last response to my response to his response to my attack on the credibility of his report on Technorati (did you follow that?), Microsoft evangelist Robert Scoble asks and implies some questions that deserve to be answered.

In his last response to my response to his response to my attack on the credibility of his report on Technorati (did you follow that?), Microsoft evangelist Robert Scoble asks and implies some questions that deserve to be answered.  Then, I'll mention the irony in all of this.

First, here's the list of questions (both asked and implied) and my answers, directly addressed to Scoble:

  • "To do a more 'pure' comparison would take a LOT more work."  The implied question is, "Are you expecting me to do a LOT more work?" (Dave Winer commented on the opportunity cost, saying that the truth can take two weeks to find, at the end of which most journalists still get it wrong.)  Answer: I'm glad you used the word pure, because there's something about purity of the truth.  There's something about wrapping up work at the end of the day when you know you've done your best to enlighten your audience with the truth without leaving it to others to clean up your mess.  So, yes, I am (expecting people to do that work).  Let me preface this with an admission that I sometimes get it wrong myself, as do all journalists.  But I invariably try to get it right with at least a few calls or emails, as opposed to leaving it to someone else to spot the error, much less correct it.  Perhaps this comes from my background as a former director of PC Week's Labs back in the '90s (and I've been testing products on and off ever since being promoted out of that job).  With a directive to produce comparative reviews that are fair and unbiased (and based on hands-on testing), most of the hard work that goes into such comparisons is on the front end, where you need to come up with a testing methodology that does everything possible to level the playing field.  This takes work. A lot of it.  Not only do you have to come up with a fair methodology, you also have to take into consideration what the vendor says the product is designed to do and who the vendor is targeting with that product (or service).  This is why millions of dollars have been sunk into testing labs: not only to come up with methodologies that are fair, but tests that are reproducible.  Getting as close to the truth as possible takes work, and in this case, it didn't take two weeks' worth of work.  It only took a single ten-minute phone call.
But based on what you're saying, there is no point at which the opportunity cost of truth vs. your time makes sense. 
  • "When you benchmark Microsoft software, do you only benchmark what we tell you to benchmark, or do you benchmark what you're interested in?"  Glad you asked.  Over the years, comparisons grew increasingly difficult to do.  For example, there were few points in time when Microsoft Office, WordPerfect Office, and Lotus SmartSuite all had the same feature set.  The same went for Microsoft's LAN Manager (and later NT) vs. Novell's NetWare and Banyan's VINES.  The same thing still goes for Lotus Notes/Domino vs. Microsoft Exchange Server.  Each of these products can be used to do some of the same things.  Some things, they do differently.  Each of them does things the others do not.  Even before any testing begins, you need to develop an understanding of these differences.  At the end of the day, based on what a vendor tells you, you may decide its product doesn't deserve to be in the comparison because it's simply not designed to do what you thought it did.  This happened countless times in our comparative reviews.  Once you decide to move forward, you factor that understanding into your methodology.  If you're comparing two products that are closely aligned but not exactly matched based on what the vendors say they do or based on their feature sets (as was the case here), then you must reconcile those differences in your write-up -- not wait for someone to speak up and say, "Excuse me, we never said we did that. In fact, we purposely don't do that."  Finally, just to add even more context: I can recall how your own company put a tremendous amount of faith in the testing we did at ZD Labs in Foster City in the 1990s.  An army of Microsofties would camp out in the Labs to discuss methodologies and test results (an opportunity we gave to all vendors), and I can assure you they were very concerned with the fairness of the tests (ask Mike Nash; he's still around in your security group). 
So, just  because some reviewers blow off fairness, or what the product is intended to do doesn't mean we all should. 
  • "You saying every blogger should call vendors before they criticize?"  I'm saying that it doesn't matter who you are or how you distribute the information you're distributing.  I'm not saying that you deliberately lied.  But the story you told turned out to be both untrue and unfair, and I would think that as a human being, you would have the decency to make some effort -- any effort -- to make sure that what you're about to say, whether it's to your wife, your children, or the 3,000 people who link to your blog (thereby profoundly magnifying the consequences of an untruth), is true.  A ten-minute phone call.  That's all it took for me to fact-check you.  By the way, there were some comments on your original blog that discussed the duplicates issue that you have discounted.  They went unacknowledged until my blog turned up.  In fact, the very first comment, by Randy Charles Morin, mentions the issue and didn't get acknowledged by you until almost three hours later in the same comments thread.  Later that day, at 11:02pm, James Brunskill picked up on the incorrect sites/links observation that you made.  You didn't admit to the error until 38 hours after Brunskill filed his comment (almost 24 hours after I blogged the error on ZDNet, and more than two days after you wrote the original offending blog).  So, the business about writing first and correcting it within minutes -- presumably within enough time before significant damage is done -- is folly.  Maybe Technorati didn't take on too much water because of this error.  But which of your next errors, the ones that take two days to correct, will destroy someone's life or someone's business to spare you ten minutes?  Is your ten minutes that important to you?
  • "If I had refused to link to your points, or refused to do more research, or refused to correct my post, then I could agree with that. But, you're simply being bombastic now. Why? What's your agenda?"  I'm glad you asked because, in my mind, this isn't about Technorati anymore.  It's about the value of truth, something that we here at ZDNet and at CNET Networks are passionate about.  The blogosphere is a fantastic new medium that I'm officially a part of. It's all I do.  And I'll be the first one to acknowledge that you were one of its pioneers.  But now that people are turning to it as a critical source of information, it'd be nice if it had a reputation for being a source of pretty credible information.  The more trustworthy the blogosphere is, the more newcomers to our blogs will assume they can be trusted.  That's my agenda: to raise the bar for everyone.  By the same token, one could also ask about your agenda.  In your earlier post, you say, "I agree that corrections don't always cut it. I too wish for a more accurate reporting system, but this system is pretty darn good at self correcting."  Based on the time it took to get your errors ironed out in this situation, I don't agree, and you don't have to wish.  The phone is sitting right next to you.  You go on to say, "So far I've been watching for factual mistakes where Microsoft is concerned and there hasn't been that many."  So, speaking of agendas, you said you watch for factual mistakes where Microsoft is concerned.  What about where Microsoft isn't concerned? Why don't the same rules apply there?  I noticed that Yahoo appears to be going into competition with Technorati.  And MSN Search (a part of your company) is in competition with Yahoo.  So, as your own blog from July 9, 2005 implies, Microsoft should be connected with the business of RSS search.  It seems like a convenient time to not be very concerned about the accuracy of the statements you make about potential competitors.  
Your job is to be an evangelist, right?  There's a phrase/acronym that this industry uses when a vendor's evangelists spread false information about other vendors.  It's called FUD: Fear, Uncertainty, and Doubt.  To be honest, I don't think those were your intentions.  So let's stop the insinuations (for example, the one where you imply that ZDNet's blogs don't accept comments when it couldn't be more obvious that they do just by looking... how do you make this stuff up, anyway?) and stick to the real question, which is: is it really OK to say whatever it is you want to say, regardless of whether it's true and regardless of the toll you take on your victims?  I don't think so.
Finally, the irony in all of this is that it was sparked by Robert Scoble's evaluation of an engine (Technorati) that tries to gauge the authority of bloggers (relative to other bloggers), and his blog happens to have a pretty high ranking: 33rd out of over 13,000,000 blogs.  This blog, Between the Lines, is ranked no. 2,421.  They say the truth will set you free.  Yeah.  Free to roam around the long tail.  Go figure.  Of course, that's a ranking of authority.  How does authority relate to credibility? You tell me.

Topics: Microsoft

About

David Berlind was formerly the executive editor of ZDNet. David holds a BBA in Computer Information Systems. Prior to becoming a tech journalist in 1991, David was an IT manager.

