Can Google's algorithm do subjective quality calls?
"When we try to address challenges like this (site quality) we try to do it in an algorithmic way. There may be one-off situations where for legal reasons or what have you we will intervene manually. But our fundamental approach is to take an algorithmic approach and try to solve it from a technology standpoint."
---Neal Mohan, vice president of product management at Google, speaking Feb. 28 at the Morgan Stanley Technology, Media and Telecommunications Conference.
It has been almost a week since Google flipped the switch on its algorithm change, which is designed to weed out low-quality, less useful Web sites. Sounds good in theory. As I noted before, however, the Google algorithm switch is a slippery slope.
How slippery?
- Mahalo cut 10 percent of its workforce after Google's algorithm change whacked the site.
- Yahoo's Associated Content took a hit according to an initial analysis by Sistrix.
- Sites like Cult of Mac saw traffic fall dramatically.
- Simply put, sites that lose their Google juice might as well go into the Web site protection program.
The big questions in all of this boil down to the following:
- What's the unassailable definition of quality?
- Is an algorithm capable of making a subjective decision (one man's spam is another man's good read)?
- And do we trust Google to be judge and jury via an algorithm we know nothing about?
I'd argue that algorithms won't be able to make subjective judgments well, and that means Google will increasingly need to make editorial calls. Ryan Singel at Wired noted that Cult of Mac's traffic bounced back after Google evidently did something to restore the site's juice. Is this the best way to go about this?
There will be more sites complaining about Google's algorithm change and the search giant will probably make a few "one-off" exceptions. The inflection point comes when Google has to make multiple "one-off" calls. Ultimately, we've outsourced the quality call to Google.