Imagine a busy intersection at rush hour. All the cars are moving slowly, but making progress. Everyone will eventually get home, though it might take a while. To relieve the problem, the city could build more or bigger roads, but that's expensive. Meanwhile, along comes an ambulance that needs to get through. The ambulance is not treated like every other vehicle. It turns on a siren and cars move out of the way to let it pass. The driver may even press his magic traffic light remote to make the light turn green for him. Nobody minds too much, because they can imagine themselves being in that ambulance, needing to get to the hospital right away. And besides, it doesn't happen very often, right?
Internet companies like Comcast are the city that builds the roads, and the vehicles are your data packets, mixed with the data from all other net users. The companies would rather not build more roads (increase the peak bandwidth), but they do want to let certain types of traffic pass through more easily. On the face of it, this might sound reasonable. But a recent incident with Comcast and P2P traffic points out some of the pitfalls.
Suppose half the vehicles on the road were taxis. If taxis were taken out of the picture, then traffic would flow much better for everybody else, and we could put off those road improvements, right? So how do we discourage taxis? Do we charge them a higher fee so that we can afford new roads? Do we regulate that only a certain number of taxis are allowed to be on the road at any given time? Or do we send agents out to the taxi companies to yank the spark plugs out of a bunch of them to keep them out of service?
Peer to peer file transfers are like the taxis of the internet. An investigation by the Associated Press discovered that Comcast was actively trying to delay or reduce this traffic to boost performance for their other customers. The method Comcast used was interesting. Using our taxi metaphor, they got on the frequency used by the taxi drivers, posed as controllers, and ordered them to return to base. Users would just see that P2P file transfers would terminate for no apparent reason.
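According to later reporting on the incident, the "return to base" order was a forged TCP reset (RST) packet: equipment in the middle of the connection spoofed a packet that appeared to come from the other peer, telling each end the other had hung up. As a rough illustration (not Comcast's actual implementation, and with ports, sequence number, and the `forged_rst` helper purely hypothetical), here is what a minimal 20-byte TCP header with only the RST flag set looks like:

```python
import struct

def forged_rst(src_port, dst_port, seq):
    """Build a bare 20-byte TCP header with only the RST flag set.

    The checksum is left at zero for illustration; a packet actually
    sent on the wire would need a valid checksum computed over a
    pseudo-header, and the sequence number would have to fall inside
    the victim connection's receive window to be accepted.
    """
    data_offset = 5 << 4   # header length: 5 x 32-bit words, no options
    flags = 0x04           # RST is bit 2 of the flags byte
    return struct.pack(
        "!HHIIBBHHH",
        src_port,          # source port, spoofed to look like the remote peer
        dst_port,          # destination port of the targeted connection
        seq,               # sequence number
        0,                 # acknowledgment number (unused; ACK flag not set)
        data_offset,
        flags,
        0,                 # window size
        0,                 # checksum (zeroed here for illustration)
        0,                 # urgent pointer
    )

# Hypothetical BitTorrent-ish port numbers, for illustration only.
header = forged_rst(6881, 51413, 123456)
```

Each endpoint that receives such a packet believes the other side reset the connection, which is why users simply saw their transfers die with no apparent cause.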
After several days of denying the practice, Mitch Bowling, a senior vice president of Comcast, admitted yesterday that the company was interfering with P2P transfers. But he sees nothing wrong with it:
During periods of heavy peer-to-peer congestion, which can degrade the experience for all customers, we use several network management technologies that, when necessary, enable us to delay — not block — some peer-to-peer traffic. However, the peer-to-peer transaction will eventually be completed as requested.
Is this the future of net "un-neutrality" that internet service providers have in mind? They've already redefined what "unlimited" means. What's next, detecting FTP transfers and causing some percentage of them to just fail? Redirecting you to a lower-resolution version of your video, or to pictures that don't look quite the same? Maybe they could remove some of the unimportant words in news stories, or heck, just leave out the ones that criticize their policies. Those are probably a waste of space anyway.
If ISPs really want to make a dent in unnecessary traffic, I say leave file transfers alone and tackle the real menace: spam. Pull the spammers' spark plugs and send all their messages back home. Nobody wants to pull over to let the "Spambulance" through.