Should search engines pay tribute to content?

Written by Dana Blankenhorn

Tom Foremski, one of the good guys here at ZDNet, is out with a piece suggesting that Google fork over whatever Rupert Murdoch wants in order to keep indexing Fox News.

His argument is that losing access to regularly-updated content would be a big hole in Google's business model, which is based on making everything out there available.

I have two problems with that:

  1. If one publisher can force Google to pay for a link, so can every other publisher. Blackmail never ends.
  2. Publishers have tried this before and failed. Including Murdoch.

I think the first point is more important. Foremski argues that newspapers get little traffic from Google. This is true, mainly because, as he notes, most still haven't got a Clue when it comes to the Internet.

Google has already taken newspapers' wire-service traffic away by signing side deals with major wire services like the AP, posting those stories on Google's own pages, and paying back the ad revenue. The only newspaper content left is local or beat-specific. Most traffic to those stories comes from the local area.

So if the traffic flow is modest, why should Google be paying? Just to protect its reputation?

There is no need to argue the point. We can run an experiment.

Let Bing or Yahoo pay Murdoch, and let Murdoch put robots.txt files on all his properties, keeping them away from Google's crawlers. See what happens. If there's real market share to be gained here, Google's competitors will be happy to buy it.
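That experiment is cheap to set up. Assuming Murdoch wanted to block only Google while staying visible to everyone else, the relevant part of each site's robots.txt would look something like this (Googlebot is the name Google's own crawler announces; other spiders would be unaffected):

  # Shut Google's crawler out of the entire site.
  User-agent: Googlebot
  Disallow: /

Bing's and Yahoo's crawlers read the same file, find no rule addressed to them, and keep indexing as before.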

On to the second point. Both The Wall Street Journal and The New York Times have tried the paid model. The Journal maintains it, but offers links to the full text of its stories through Google. The Times gave up its TimesSelect program as a money-loser.

Yes, The Journal began its program before Murdoch bought it. But I also remember a time, about a decade ago, when Fox site managers were redirecting every deep link into their site to the home page. It was maddening. They stopped.

There's some basic math at work here. The smaller your circulation base and the more specialized it is, the better off you'll do with either a paid model or a registration model.

Lots and lots of journals allow access only to abstracts if you're not a registered user. The New England Journal of Medicine is an example. And there are many publications available only to paying customers, who are notified of updates via e-mail.

The problem is that if you want a mass audience on the Internet, you have to make yourself available to a mass audience. Newspapers are mass circulation publications. Throwing up registration windows or pay walls hasn't worked for them.

But, again, Murdoch (and every other publisher) is perfectly free to try. Just write a simple robots.txt file forbidding indexing. Poof, you're invisible to the search engines' spiders.
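The blanket version is about as short as a file can get — a sketch, not a recommendation:

  # Ask every compliant crawler to stay out of the whole site.
  User-agent: *
  Disallow: /

Strictly speaking, robots.txt only stops the crawling; a bare URL can still turn up in results if other sites link to it, but with nothing crawled there is nothing to excerpt or rank.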

There's no real controversy here, IMHO. Publishers are free to conduct whatever experiments they want with Google, either tweaking it to get a bigger audience or blocking it to go after a smaller, elite audience.

History says Murdoch is barking at the Moon. But he likes to write his own history. Let him try.
