In a previous blog post, Matt Cutts, head of Google's Webspam team, wrote about the progress the team has made in reducing the amount of spam in search engine results. In that post, he hinted at some changes in the works to push spam levels lower, including one that affects sites that copy content from other sites, as well as those that have low levels of original content.
Clearly, there's a blurry line there - or a "slippery slope," as ZDNet's Larry Dignan called it in his own post, which raised red flags about how a site's quality would be judged.
Cutts has since posted an update to that earlier post on his own blog, announcing that one specific change to the algorithm was approved at the team's weekly meeting and launched earlier this week. In his post, Cutts explains:
This was a pretty targeted launch: slightly over 2 percent of queries change in some way, but less than half a percent of search results change enough that someone might really notice. The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site's content.
Read more of "Google launches algorithm change to tackle content copying. Will it help?" at ZDNet.