I would love to know how much Internet bandwidth is used by the swarms of spiderbots. Because if bandwidth costs are going to go up, as the telco and cable last-mile owners charge third parties for bandwidth, then the spiderbots might get banned.
Spiderbots eat up a huge amount of bandwidth, and if bandwidth gets more expensive, the spiderbots are going to suffer. For example, more than 18 spiderbots account for just 5 per cent of my traffic, yet they use up about one-third of my bandwidth.
That's a key reason why Google, Yahoo and others are pushing for net neutrality--equal access to bandwidth--at least for the last-mile pipe to the home, the most important pipe. If companies have to pay extra to the telcos or cable companies for bandwidth to reach their users, they won't be so pleased to be paying for the bandwidth of the spiderbots as well.
I'm fortunate that more than 92 per cent of my readers come directly through bookmarks or RSS, so they know where I live. Many other sites depend on the search engines for 30 to 70-plus per cent of their traffic.
And those sites spend a lot of money optimizing themselves to attract more spiderbots. But that is not all quality traffic; much of it is fly-by-night web surfers. Web sites should optimize for their readers, not for the spiderbots. Let the search engines optimize themselves--that's their job.
When audience numbers stabilize for a web site, and very few new readers come in from the search sites--yet the spiderbots still suck up one-third of the bandwidth--then things will change. More and more web sites will post a robots.txt file that tells the spiderbots to go away. They will change because the overall visitor experience is slowed down by the bandwidth-hungry packs of spiderbots.
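For readers unfamiliar with the mechanism: robots.txt is a plain text file placed at a site's root that well-behaved crawlers check before fetching pages. A minimal sketch of one that allows the major search engines but turns everyone else away might look like this (the specific crawler names shown, such as Googlebot and Slurp, are the commonly published ones; any site would adjust the list to its own needs):

```text
# robots.txt -- placed at http://example.com/robots.txt
# Allow Google's crawler to index everything
User-agent: Googlebot
Disallow:

# Allow Yahoo's crawler (Slurp) to index everything
User-agent: Slurp
Disallow:

# Tell all other spiderbots to go away entirely
User-agent: *
Disallow: /
```

Of course, compliance is voluntary--robots.txt only works on crawlers polite enough to honor it.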
We used to have estimates of how much bandwidth is used by email, by spam, and so on--how about spiderbots? I would love to know: how much Internet bandwidth is used up by the legions of spiderbots in their constant search to find and copy new content?