
Danny Sullivan on Gonzales v. Google

Government's subpoena "worse than a fishing expedition," search engine expert says.
Written by ZDNet UK, Contributor

Danny Sullivan of Search Engine Watch takes an in-depth look at the week's big news: the case of Gonzales v. Google, in which the search giant has pushed back against a government subpoena and the Justice Dept. is asking the court to force its compliance. Some of Sullivan's points:

  • Let's not get all starry-eyed. Google pushed back in this case, but it may have complied with other governmental requests. Indeed, one of the best points in John Battelle's book "The Search" was the section focusing on the US Patriot Act and how Google (or other search engines) might not even be able to say whether it has given out information.
  • Let's also not get foolish. I personally think that search engines should be following laws and especially helping authorities, even if that means handing over private information. But that has to be done when the proper procedures have been followed, when the right safeguards are in place, and when a real, pressing need for the information is demonstrated.

The government is asking for a random list of URLs in Google's index and a random list of search queries. It emphasizes that no user data need be associated. And it claims that:

Reviewing URLs available through search engines will help us understand what sites users can find using search engines, to estimate the prevalence of harmful-to-minors (HTM) materials among such sites, to characterize those sites, and to measure the effectiveness of content filters in screening HTM materials from those sites.

Reviewing user queries to search engines will help us understand the search behavior of current web users, to estimate how often web users encounter HTM materials through searches, and to measure the effectiveness of filters in screening those materials.

This Sullivan finds "jaw-dropping."

[S]ure, they could hand over a list of 1 million URLs. But you have no idea from that list how often any of those URLs actually rank for anything or receive clicks. It is non-data, useless.

Secondly, the list of search queries HAS NO RANKING DATA associated with it. So let's say the DOJ sees a query for "lindsay lohan." They don't know from that data what exactly showed up on Google or another search engine for that query, not from what they've asked for. Since they don't know what was listed, they further can't detect any HTMs that might show up.

In short, gathering this data is worse than a fishing expedition. It's a futile exercise that will prove absolutely nothing about the presence of HTMs in search results. All it proves so far is that the DOJ lawyers (and apparently their experts) haven't a clue about how search engines work. An actual search expert will tear apart whatever "proof" they think they can concoct from the data gathered so far.

