Net neutrality — which, in crude terms, is the principle that the Internet ought to treat every packet equally and not privilege some at the expense of others — is one of those interesting cases where righteousness may be the enemy of rationality. At the root of it is a visceral belief that the end-to-end architecture of the Net is something very precious (and the key to understanding why the network has sparked such a tidal wave of innovation); those of us who share that belief tend to be paranoid about the lobbying of large corporations that would like to violate the principle for what we see as narrow commercial ends.
But the truth is that net neutrality is a very complicated issue — as real experts like Jon Crowcroft often point out. Righteous adherence to neutrality may blind us, for example, to the fact that in some circumstances it does not yield optimal results. Which is why I was interested to read this thoughtful piece in MIT’s Technology Review this morning.
At the end of February, the Federal Communications Commission (FCC) held a public hearing at Harvard University, investigating claims that the cable giant Comcast had been stifling traffic sent over its network using the popular peer-to-peer file-sharing protocol BitTorrent. Comcast argued that it acted only during periods of severe network congestion, slowing bandwidth-hogging traffic sent by computers that probably didn’t have anyone sitting at them, anyway. But critics countered that Comcast had violated the Internet’s prevailing principle of “Net neutrality,” the idea that network operators should treat all the data packets that travel over their networks the same way.
So far, the FCC has been reluctant to adopt hard-and-fast rules mandating Net neutrality; at the same time, it has shown itself willing to punish clear violations of the principle. But however it rules in this case, some Internet experts feel that Net neutrality is an idea that may have outlived its usefulness…
The article goes on to cite the views of Mung Chiang, a Princeton computer scientist, who specialises in nonlinear optimization of communication systems. He argues that,
in the name of Net neutrality, network operators and content distributors maintain a mutual ignorance that makes the Internet less efficient. Measures that one group takes to speed data transfers, he explains, may unintentionally impede measures taken by the other. In a peer-to-peer network, “the properties based on which peers are selected are influenced to a large degree by how the network does its traffic management,” Chiang says. But the peer selection process “will have impact in turn on the traffic management.” The result, he says, can be a feedback loop in which one counterproductive procedure spawns another.
Programs using BitTorrent, for instance, download files from a number of different peers at once. But if a particular peer isn’t sending data quickly enough, Chiang says, the others might drop it in favor of one that’s more reliable. Activity patterns among BitTorrent users can thus change very quickly. Network operators, too, try to maximize efficiency; if they notice a bandwidth bottleneck, they route around it. But according to Chiang, they operate on a much different timescale. A bottleneck caused by BitTorrent file transfers may have moved elsewhere by the time the network operator responds to it. Traffic could end up being rerouted around a vanished bottleneck and down a newly congested pipe.
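Chiang’s feedback loop is easier to see in a toy model. Here is a minimal simulation sketch (entirely my own illustration; the link names, traffic volumes and the five-round rerouting lag are invented assumptions, not figures from the article or from Chiang’s work): a “swarm” of BitTorrent traffic abandons an overloaded link within a single round, while the operator’s rerouting takes effect only several rounds after it measures a bottleneck.

```python
# Toy model of the two control loops Chiang describes. All names and
# numbers here are invented for illustration.

CAPACITY = 100          # load a link carries comfortably
SWARM, ROUTED = 80, 60  # BitTorrent traffic vs. operator-steered traffic
LAG = 5                 # rounds between the operator measuring and acting


def other(link):
    return "B" if link == "A" else "A"


def simulate(rounds=25):
    swarm_on, routed_on = "A", "A"  # everything starts on link A
    pending = []                    # (round_due, link_measured_as_congested)
    for t in range(rounds):
        load = {"A": 0, "B": 0}
        load[swarm_on] += SWARM
        load[routed_on] += ROUTED
        print(f"t={t:2d}  load A={load['A']:3d}  B={load['B']:3d}")

        # Slow loop: the operator spots a bottleneck now, but its
        # rerouting only takes effect LAG rounds later.
        busy = max(load, key=load.get)
        if load[busy] > CAPACITY:
            pending.append((t + LAG, busy))
        for _, measured in [p for p in pending if p[0] == t]:
            routed_on = other(measured)  # route around the *old* bottleneck
        pending = [p for p in pending if p[0] != t]

        # Fast loop: peers abandon an overloaded link within one round.
        if load[swarm_on] > CAPACITY:
            swarm_on = other(swarm_on)


simulate()
```

Run it and the load trace spikes back above capacity every few rounds: each time the stale rerouting decision finally executes, the swarm has already vacated the link the operator measured, so the operator’s traffic lands on the newly congested pipe. That is precisely the “vanished bottleneck” scenario described above, with neither loop able to see what the other is doing.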