P4P: rethinking file sharing

The thorniest problem in making decisions about internet policy is how to balance the public interest against the vested interests of companies and other incumbents of the status quo. The task is made more difficult by the fact that often there is nobody to speak for the public interest, whereas vested interests are organised, vocal and very rich. The result is usually evidence-free policymaking in which legislators give vested interests everything they ask for, and then some.

The copyright wars provide a case-study in this skewed universe. When P2P file-sharing appeared, the record and movie industries campaigned to have the entire technology banned. (Larry Lessig used to tell a wonderful story about how he arrived at his office in Stanford Law one day and found two of the university’s network police there. They were going to disconnect him from the network because he had P2P software running on his machine. The fact that Larry used P2P as a way of distributing his own written works had apparently never occurred to them. And Stanford is a pretty smart place.)

So the idea that P2P technology might have licit as well as illicit uses was ignored by nearly everyone. And yet P2P was — and remains — a really important strategic technology, for all kinds of reasons (see, for example, Clay Shirky’s great essay about PCs being the ‘dark matter’ of the Internet). In fact, one could argue — and I have — that it’s such an important strategic technology that the narrow business interests of the content industries ought never to be allowed to stifle it. Evidence-based policymaking would therefore attempt to strike a balance between the social benefits of P2P on the one hand and, on the other, those aspects of it that happen to be inconvenient (or profit-threatening) for a particular set of industries at a particular time in history.

All of which makes this report in Technology Review particularly interesting.

‘Peer-to-peer’ (P2P) is synonymous with piracy and bandwidth hogging on the Internet. But now, Internet service providers and content companies are taking advantage of technology designed to speed the delivery of content through P2P networks. Meanwhile, standards bodies are working to codify the technology into the Internet’s basic protocols.

Rather than sending files to users from a central server, P2P file-sharing networks distribute pieces of a file among thousands of computers and help users find and download this data directly from one another. This is a highly efficient way to distribute data, resistant to the bottlenecks that can plague centralized distribution systems, but it uses large amounts of bandwidth. Even as P2P traffic slowly declines as a percentage of overall Internet traffic, it is still growing in volume. In June, Cisco estimated that P2P file-sharing networks transferred 3.3 exabytes (3.3 billion gigabytes) of data per month.

While a PhD student at Yale University in 2006, Haiyong Xie came up with the idea of the ‘provider portal for peer-to-peer’, or P4P, as a way to ease the strain placed on networking companies by P2P. This system reduces file-trading traffic by having ISPs share specially encoded information about their networks with peer-to-peer ‘trackers’, the servers used to locate files for downloading. Trackers can then make file sharing more efficient by preferentially connecting computers that are closer together and reducing the amount of data exchanged between different ISPs.

During its meetings last week in Japan, the Internet Engineering Task Force, which develops Internet standards, continued work on building P4P into standard Internet protocols…

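To make the idea concrete, here is a minimal sketch of the kind of locality-aware peer selection the report describes. It is written in Python, and everything in it (the partition names, the cost numbers, the function names) is invented for illustration; a real tracker would obtain its cost information from an ISP's P4P or ALTO-style portal rather than from a hard-coded table.

# Toy illustration of locality-aware peer selection, loosely inspired by
# the P4P idea described above. The cost map and peers are invented;
# a real tracker would fetch this data from an ISP-run portal.

# Hypothetical ISP-supplied "cost" between network partitions (PIDs):
# lower means traffic stays closer to home (same region or same ISP).
COST_MAP = {
    ("isp-a.east", "isp-a.east"): 1,
    ("isp-a.east", "isp-a.west"): 5,
    ("isp-a.east", "isp-b.east"): 20,
    ("isp-a.west", "isp-a.west"): 1,
    ("isp-a.west", "isp-b.east"): 25,
    ("isp-b.east", "isp-b.east"): 1,
}

def cost(pid_a, pid_b, default=50):
    """Look up the ISP-supplied cost between two partitions (symmetric)."""
    return COST_MAP.get((pid_a, pid_b), COST_MAP.get((pid_b, pid_a), default))

def select_peers(requester_pid, candidates, limit=3):
    """Return up to `limit` candidate peers, cheapest (closest) first.

    `candidates` is a list of (peer_address, pid) tuples that the tracker
    knows are sharing the requested file.
    """
    ranked = sorted(candidates, key=lambda peer: cost(requester_pid, peer[1]))
    return ranked[:limit]

if __name__ == "__main__":
    swarm = [
        ("10.0.0.7:6881", "isp-b.east"),
        ("10.0.1.2:6881", "isp-a.east"),
        ("10.0.2.9:6881", "isp-a.west"),
        ("10.0.3.4:6881", "isp-a.east"),
    ]
    # A client in isp-a.east gets pointed at its near neighbours first.
    print(select_peers("isp-a.east", swarm))

The point of the exercise is simply that once a tracker knows roughly where peers sit in the network, it can hand a downloader its nearest neighbours first, which keeps most of the traffic inside a single ISP instead of spilling across expensive inter-provider links.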