Although the importance of monthly traffic statistics has deflated along with the dot-com bubble, the lists still command the attention of investors, competitors and employees--if only for bragging rights. Nevertheless, the rankings have long been criticized for producing vague and conflicting data.
Now complaints are focusing on a wave of online ads called "pop-ups" that automatically launch Web pages and artificially boost audience counts for the sites that use them.
The issue was highlighted this week when Jupiter Media Metrix published a monthly list that ranks a little-known camera maker's Web site as among the Internet's top five destinations. The camera maker, X10, has been quietly climbing the charts on the strength of an aggressive ad campaign, which accounted for a stunning 95 percent of its Web traffic for the month of May, according to Jupiter.
X10's placement among the top digital media properties jarred Internet advertising experts, who said ads need to be weeded out of numbers aimed at judging the popularity of Web sites.
Comparing traffic to X10's ads with the traffic generated by destination Web sites such as Yahoo is like comparing apples and oranges, said Rich LeFurgy, chairman of the Interactive Advertising Bureau, an online marketing advocacy and standards group.
X10's ad "clearly is not a Web site. It clearly has to be looked at differently," LeFurgy said. "It's not relevant vs. Web site traffic. It's only relevant for the absolute reach and presence of the...ad itself."
Jupiter defended its decision to include X10's traffic, even though it was almost entirely generated by a vast campaign that has included pop-under ads on such places as The New York Times' Web site. Pop-under ads open a new browser window underneath a Web page and become visible when a person closes the main window. Although Web surfers may not intentionally go to X10's Web site, visiting the site is part of their Web browsing experience, said Jupiter spokesman Max Kalehoff.
Jupiter's philosophy is to count any user experience as valid, regardless of whether it is intentional.
Kalehoff acknowledged that other companies have used similar techniques to improve their traffic rankings, but he said such techniques do not taint Jupiter's reports.
Even if a person did not intentionally visit a site, "we report what users experience because that's what the industry wants to know about," Kalehoff said. "It's our job to report where people are going, what people are doing."
X10 representatives could not immediately be reached for comment.
Underscoring ongoing conflicts in the Web statistics business, Jupiter Media Metrix's primary competitor, Nielsen/NetRatings, does not count impressions of X10's ads in its rankings of the most heavily visited Web sites. Nielsen/NetRatings viewed impressions of X10's ads as separate from traffic to Web sites, said Peggy O'Neill, an analyst at the research firm.
"It's not traffic; it's advertising," O'Neill said. "It's one thing for a person to go to a site voluntarily, whether they click their way there or if they went to a site on their own. If it was served up to them--if they were kidnapped and sent there--it's hardly the same thing."
X10 ranked No. 123 among U.S. properties on Nielsen/NetRatings' list. The research firm filtered out all traffic to X10's pop-up and pop-under ads unless the viewer clicked on an ad for more information.
Traffic measurement companies admit that their methods are inexact but defend their data as a useful yardstick for ad buyers and others who want a general snapshot of the state of the Web.
The top-10 lists are "high-profile, but they're not where actual buying decisions are made," O'Neill said.
Companies looking to advertise on the Net are much more interested in demographics of a particular Web site than its rank, she said. O'Neill further predicted that the controversy over the X10 ads would not further sully the image of online advertising or of the ranking systems.
A spokeswoman for Excite@Home, which saw its ranking drop from fifth to sixth place because of the X10 ad campaign, agreed with O'Neill.
"Overall, we're not concerned," said Tonie Hansen, director for the trade marketing group at Excite@Home. "We use (Jupiter) to track our competition, but we don't just look at any one month. We look at the trending...We know if they're supplanted one month by a company like X10."
Measurement systems have long been criticized for a lack of common standards and lax procedures that have left them open to manipulation.
Last year, for example, Jupiter revised its July ranking after it became aware of discrepancies affecting a company that it had placed among the top 20 Web sites for that month. eFront, a group of affiliate Web sites, was inadvertently given credit for traffic from sites with which it had no formal relationship.
After the revision, eFront fell below the top 500 sites for the month. According to Jupiter, the July revision is the only time the company has restated its rankings.
The lack of common standards and methods for tracking usage has also hurt the credibility of Web measurement. Jupiter and Nielsen/NetRatings, the two major Web measurement companies, take similar approaches, but differences in the details mean they rarely publish the same results.
For example, Nielsen/NetRatings breaks Microsoft-owned Web properties into two separate entities for ranking purposes, while Jupiter combines them into a single category.
Both work from samples of Web surfers and then extrapolate to determine total Web usage. But they use different sample sizes.
Jupiter gathers data from 100,000 people worldwide, including 60,000 Web users in the United States, who install tracking programs on their computers. The software records nearly all of their activity, from which programs they use to which Web sites they visit.
Nielsen/NetRatings has 225,000 sample Internet users, with 70,000 in the United States. The U.S. panel sample consists of 62,000 at-home users and 8,000 at-work users.
Critics say those sample sizes are too small to avoid big distortions for sites with fewer visitors. Those discrepancies are most apparent below the top 50 but can have an effect among the top 10 sites as well, according to a 1999 report by Yahoo.
By some counts, the samples would have to reach 1 million or more Internet users to escape this problem.
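The sampling problem the critics describe can be sketched with a back-of-the-envelope calculation. This is an illustration under a simplifying assumption (a simple random sample with binomially distributed visit counts), not either firm's actual methodology, and `reach_estimate_error` is a hypothetical helper:

```python
import math

def reach_estimate_error(panel_size: int, true_reach: float) -> float:
    """Relative standard error of a panel-based reach estimate.

    Assumes a simple random sample: the number of panelists who visit
    a site is binomial, so the standard error of the estimated reach p
    is sqrt(p * (1 - p) / n). Dividing by p gives the relative error.
    """
    se = math.sqrt(true_reach * (1 - true_reach) / panel_size)
    return se / true_reach

# A big destination reaching 50% of users vs. a niche site reaching 0.1%,
# measured by a 60,000-person panel (roughly Jupiter's U.S. sample) and
# a hypothetical 1,000,000-person panel.
for n in (60_000, 1_000_000):
    for p in (0.50, 0.001):
        err = reach_estimate_error(n, p)
        print(f"panel={n:>9,}  reach={p:.1%}  relative error={err:.1%}")
```

With 60,000 panelists the estimate for a site reaching half of all users is tight (under 1 percent relative error), while the estimate for a site reaching one user in a thousand is off by more than 10 percent either way, which is why rankings wobble most below the top sites and why some argue the panels would need a million or more users.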
Although Jupiter says it will continue to count pop-up-driven traffic, readers and the advertisers themselves could take the issue into their own hands.
"I'm sure that the novelty of these pop-under ads is going to wear off," said Excite@Home's Hansen. "It's not clear whether people are going to these sites or click on them accidentally or just don't know how to turn them off."