Fantasy land: or the myth of the Digger’s omniscience

Jeff Jarvis has an interesting post about the

“swine flu of stupidity spreading about the Murdoch meme of blocking Google from indexing a site’s content (to which Google always replies that you’ve always been able to do that with robots.txt – so go ahead if you want). I love that The Reach Group (TRG), a German consulting company, has quantified just how damaging that would be to Google: hardly at all.”
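For the record, the mechanism Google keeps pointing to is trivial. A robots.txt file at the root of a site tells well-behaved crawlers what to leave alone, and a publisher who genuinely wanted out would need nothing more than something like this (a minimal sketch using the standard Robots Exclusion Protocol; Googlebot is Google’s documented crawler name):

    User-agent: Googlebot
    Disallow: /

Put that at the site’s /robots.txt and Googlebot stops crawling the site; no negotiation with Mountain View required.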

The analysis is interesting — see Jeff’s post for the details. He’s also done a useful job of summarising some of the dafter ideas sparked by commentators’ awe of Rupert Murdoch. Sample:

Jason Calacanis fantasizes about Microsoft paying The New York Times to leave Google’s index for Bing. Let me explain why that would never happen. 1. The Times is not stupid. 2. Times subsidiary About.com – the only bright spot these days in the NYTimesCo’s P&L – gets 80% of its traffic and 50% of its revenue from Google. 3. See rule No. 1.

Michael Arrington then joined in the fantasy saying that News Corp. could change the balance by shifting to Bing, but ends his post with his own reality check: MySpace – increasingly a disaster in News Corp’s P&L – is attempting to negotiate its $300 million deal with Google.

Microsoft can suck up to European publishers all it wants – even adopting their ACAP “standard,” which no one in the search industry is saluting because, as Google often points out, it addresses the desires only of a small proportion of sites and it would end up aiding spammers – but it won’t make a damned bit of difference.

As Erick Schonfeld reports, also on TechCrunch, if WSJ.com turned off Google it would lose 25% of its web traffic. He quotes Hitwise, which says 15% comes from Google search, 12% from Google News – and 7% from Drudge (aggregator), and 2% from Real Clear Politics (aggregator).

It’s like I said:

The prevailing sentiment, however, can be summed up as a paradox: nobody thinks that a “screw-you-Google” strategy makes sense, but they assume that Murdoch knows something they don’t, and that the strategy will make sense when all is revealed. In that way, the Digger is rather like Warren Buffett: his past investment record is so good that people are wary of questioning his judgment.

That was the Media Week That Was

Talk about a sign of the times! Haymarket has just announced that the current issue of Media Week will be the last to appear in print. Excerpt:

Following the restructure, MediaWeek will no longer be published in print but will continue online under the control of a full-time editor and drawing on the resources of an enlarged Brand Republic news team. MediaWeek.co.uk has monthly traffic in excess of 80,000 unique users along with two daily email bulletins that are read by more than 25,000 commercial media professionals.

The successful MediaWeek Awards, annual Media 360 conference and other marketing communications events run under the MediaWeek brand continue unaffected.

Revolution will now be distributed as a quarterly supplement with Marketing magazine, and will be backed by a new blogging initiative in 2010…

Planning to endure

From David Isenberg’s classic essay — “The Rise of the Stupid Network”

Former Shell Group Planning Head, Arie de Geus, in his masterwork, The Living Company (Harvard, Boston, 1997), examined thousands of companies to try to discover what it takes to adapt to changing conditions. He found that the life expectancy of the average company was only 40 years – this means that telephone company culture is in advanced old age. De Geus also studied 27 companies that had been able to survive over 100 years. He concluded that managing for longevity – to maximize the chances that a company will adapt to changes in the business climate – is very different than managing for profit. For example, in the former, employees are part of a larger, cohesive whole, a work community. In the latter, employees are ‘resources’ to be deployed or downsized as business dictates.

This is interesting in the context of the Google Book Agreement, the responsibilities of academic libraries in the area of digital preservation and curation, and the Arcadia Project. When people say to me (about digitisation) “Why not let Google [rather than, say, the University Library] do it?”, I ask them to name commercial companies that have been around for 800 years.

P4P: rethinking file sharing

The thorniest problem in making decisions about internet policy is how to balance the public interest against the vested interests of companies and other incumbents of the status quo. The task is made more difficult by the fact that often there is nobody to speak for the public interest, whereas vested interests are organised, vocal and very rich. The result is usually evidence-free policymaking in which legislators give vested interests everything they ask for, and then some.

The copyright wars provide a case-study in this skewed universe. When P2P file-sharing appeared, the record and movie industries campaigned to have the entire technology banned. (Larry Lessig used to tell a wonderful story about how he arrived at his office in Stanford Law one day and found two of the university’s network police there. They were going to disconnect him from the network because he had P2P software running on his machine. The fact that Larry used P2P as a way of distributing his own written works had apparently never occurred to them. And Stanford is a pretty smart place.)

So the idea that P2P technology might have licit as well as illicit uses was ignored by nearly everyone. And yet P2P was — and remains — a really important strategic technology, for all kinds of reasons (see, for example, Clay Shirky’s great essay about PCs being the ‘dark matter’ of the Internet). In fact, one could argue — and I have — that it’s such an important strategic technology that the narrow business interests of the content industries ought never to be allowed to stifle it. Evidence-based policymaking would therefore attempt to strike a balance between the social benefits of P2P on the one hand, and those aspects of it that happen to be inconvenient (or profit-threatening) for a particular set of industries at a particular time in history.

All of which makes this report in Technology Review particularly interesting.

‘Peer-to-peer’ (P2P) is synonymous with piracy and bandwidth hogging on the Internet. But now, Internet service providers and content companies are taking advantage of technology designed to speed the delivery of content through P2P networks. Meanwhile, standards bodies are working to codify the technology into the Internet’s basic protocols.

Rather than sending files to users from a central server, P2P file-sharing networks distribute pieces of a file among thousands of computers and help users find and download this data directly from one another. This is a highly efficient way to distribute data, resistant to the bottlenecks that can plague centralized distribution systems, but it uses large amounts of bandwidth. Even as P2P traffic slowly declines as a percentage of overall Internet traffic, it is still growing in volume. In June, Cisco estimated that P2P file-sharing networks transferred 3.3 exabytes (3.3 billion billion bytes) of data per month.

While a PhD student at Yale University in 2006, Haiyong Xie came up with the idea of ‘provider portal for peer-to-peer’, or P4P, as a way to ease the strain placed on networking companies by P2P. This system reduces file-trading traffic by having ISPs share specially encoded information about their networks with peer-to-peer ‘trackers’ – servers that are used to locate files for downloading. Trackers can then make file sharing more efficient by preferentially connecting computers that are closer and reducing the amount of data shared between different ISPs.

During its meetings last week in Japan, the Internet Engineering Task Force, which develops Internet standards, continued work on building P4P into standard Internet protocols…
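To make the locality idea concrete, here is a toy sketch (in Python, with entirely hypothetical names) of the sort of preference a P4P-aware tracker applies when it answers a peer’s request. It is an illustration of the principle, not the actual P4P or IETF protocol, which exchanges much richer information about network topology and costs between ISPs and trackers:

    # Toy illustration of locality-aware peer selection in the P4P spirit.
    # All names are hypothetical; real deployments use richer cost maps.
    import random

    def select_peers(requester, candidates, isp_of, max_peers=50):
        """Prefer peers inside the requester's ISP, then top up from elsewhere."""
        local = [p for p in candidates if isp_of[p] == isp_of[requester]]
        remote = [p for p in candidates if isp_of[p] != isp_of[requester]]
        random.shuffle(local)
        random.shuffle(remote)
        chosen = local[:max_peers]
        # Keep the swarm connected across networks by filling any
        # remaining slots with peers from other providers.
        chosen += remote[:max_peers - len(chosen)]
        return chosen

    # Example: the tracker records which ISP announced each peer.
    isp_of = {"alice": "ISP-A", "bob": "ISP-A", "carol": "ISP-B", "dave": "ISP-C"}
    print(select_peers("alice", ["bob", "carol", "dave"], isp_of, max_peers=2))

The economics follow directly: if most of a swarm’s traffic stays inside one provider’s network, P2P stops being quite such an expensive nuisance for the ISPs, which is why they have an incentive to share the topology information in the first place.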
