How ActiveRain destroyed their search engine rankings with one text file

Let this short cautionary tale serve as a reminder not to take your rankings and overall site health for granted!

ActiveRain, the "largest and most active social network in the real estate space," has seen their search engine rankings begin to plummet, thanks to the unfortunate modification of one little text file: robots.txt.

For those who do not know, robots.txt is a file that allows you to -- amongst other things -- tell search engines where not to go on your Web site. So, what's the worst thing you could do to your site's SEO efforts with robots.txt? This:
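The offending configuration looked like the following (reconstructed here as text; this is the generic "block everything" form, not a verbatim copy of ActiveRain's actual file):

```
# Applies to every crawler
User-agent: *
# Disallow the root path -- i.e., the entire site
Disallow: /
```

Those two lines tell every well-behaved crawler, Googlebot included, that nothing on the site may be fetched or indexed.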

A robots.txt file configured as you see above tells search engines to stay away from your site completely, and that's exactly what's happening on ActiveRain right now. But why? Especially when Google currently shows a properly formatted robots.txt file residing on the site as recently as February 9, 2012:

This would never be an intentional decision by anyone who understands robots.txt... at least, not anyone who was up to any good. Sabotage? Malware? A plugin vulnerability? A disgruntled employee? These are all possibilities, though I can't say whether any of them is the cause. After all, you'd be surprised how easy it is to do something like this by accident -- especially when multiple people perform admin work on a site. Whatever the case may be, ActiveRain ended up with a whole heap of ranking problems, and people have begun to notice:

The resolution is simple: restore robots.txt. Unfortunately, search engines work at their own pace, so it may take a while before ActiveRain sees its rankings fully restored once the file is rolled back to its original state. Then again, it may only take a week or two.
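If you want to verify a fix like this yourself rather than eyeball the file, Python's standard library can parse a robots.txt body and tell you whether a given crawler may fetch a given path. This is a generic sketch -- the user agent and paths below are illustrative, not anything specific to ActiveRain:

```python
# Check whether a robots.txt body blocks a crawler from a path,
# using the standard library's robots.txt parser.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Return True if `user_agent` may fetch `path` under this robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

# A "Disallow: /" file turns every crawler away from every page:
blocking = "User-agent: *\nDisallow: /\n"
print(is_allowed(blocking, "Googlebot", "/blogs/"))    # False

# An empty Disallow permits everything:
permissive = "User-agent: *\nDisallow:\n"
print(is_allowed(permissive, "Googlebot", "/blogs/"))  # True
```

Running this against the live file (fetched with `urllib.request`, say) after a rollback gives you a quick yes/no answer instead of waiting to see whether rankings recover.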

Whatever the case may be, this incident is a good lesson for all of us: don't take rankings and site health for granted -- especially if multiple people oversee your site's operation and development, or if you don't regularly update your CMS or plug-ins. And if an occurrence like this ever turns out to be the result of sabotage (whether a disgruntled employee or an exploited vulnerability), do your due diligence and make sure nothing else was tampered with -- server configuration files, user accounts, and the like -- that could allow for repeat performances or additional headaches.
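One cheap safeguard along these lines is to keep a known-good copy of robots.txt and alert whenever the live file drifts from it. Here's a hypothetical sketch of the comparison step; in practice you'd fetch the live file with `urllib.request` and run this from cron -- the sample file contents below are made up for illustration:

```python
# Detect drift between a known-good robots.txt and the live copy.
import hashlib

def fingerprint(robots_txt: str) -> str:
    """Stable hash of a robots.txt body, ignoring line-ending differences."""
    normalized = "\n".join(line.rstrip() for line in robots_txt.splitlines())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def has_drifted(known_good: str, live: str) -> bool:
    """True when the live file no longer matches the known-good copy."""
    return fingerprint(known_good) != fingerprint(live)

known_good = "User-agent: *\nDisallow: /admin/\n"   # illustrative contents
live = "User-agent: *\nDisallow: /\n"               # what an attacker/accident left
if has_drifted(known_good, live):
    print("ALERT: robots.txt changed -- review it now!")
```

A check like this would have flagged ActiveRain's problem in hours instead of letting it surface as a rankings slide.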

Thanks to David Kyle for the heads-up about this!

Update: Robots.txt has now been fixed on the site.
