Matt Cutts, head of the Web spam team at Google, confirmed at the SMX Advanced conference this week that Google Panda will receive an update to version 2.2. This is good news for original content creators who are seeing their original works outranked by sites that blatantly copy (or "scrape") them.
In particular, the Panda algorithm is run against Google's entire index of pages only infrequently, in order to tag certain sites that should be dinged by it. That sets it apart from some of Google's automatic spam-detection tools, which run continuously.
For example, Google constantly scans for pages that might use hidden text, and if it spots them, it may assess a penalty.
Google is not constantly scanning for pages that might get hit by its Panda penalty. Instead, Google manually runs that algorithm, which then determines which websites should be affected.
This all goes to show that as Google continues to grow and deal with spam, the company is getting manually involved with its index more than ever -- at least, more than it has ever been willing to publicly admit.
Personally, I think it's an excellent move for them to apply manually run algorithms to their index to help flush out spam and give rightful preference to original content owners in the SERPs.
For those of you who are aware of Panda and its reach, have you experienced any drawbacks with your pages or noticed ongoing ranking issues? And for those of you who search Google frequently, have you noticed improved results?