Fantasy land: or the myth of the Digger’s omniscience

Jeff Jarvis has an interesting post about the

“swine flu of stupidity spreading about the Murdoch meme of blocking Google from indexing a site’s content (to which Google always replies that you’ve always been able to do that with robots.txt – so go ahead if you want). I love that The Reach Group (TRG), a German consulting company, has quantified just how damaging that would be to Google: hardly at all.”

The analysis is interesting — see Jeff’s post for the details. He’s also done a useful job of summarising some of the dafter ideas sparked by commentators’ awe of Rupert Murdoch. Sample:

Jason Calacanis fantasizes about Microsoft paying The New York Times to leave Google’s index for Bing. Let me explain why that would never happen. 1. The Times is not stupid. 2. Times subsidiary About.com – the only bright spot these days in the NYTimesCo’s P&L – gets 80% of its traffic and 50% of its revenue from Google. 3. See rule No. 1.

Michael Arrington then joined in the fantasy saying that News Corp. could change the balance by shifting to Bing, but ends his post with his own reality check: MySpace – increasingly a disaster in News Corp’s P&L – is attempting to renegotiate its $300 million deal with Google.

Microsoft can suck up to European publishers all it wants – even adopting their ACAP “standard,” which no one in the search industry is saluting because, as Google often points out, it addresses the desires only of a small proportion of sites and it would end up aiding spammers – but it won’t make a damned bit of difference.

As Erick Schonfeld reports, also on TechCrunch, if WSJ.com turned off Google it would lose 25% of its web traffic. He quotes Hitwise, which says 15% comes from Google search, 12% from Google News – and 7% from Drudge (aggregator), and 2% from Real Clear Politics (aggregator).

It’s like I said:

The prevailing sentiment, however, can be summed up as a paradox: nobody thinks that a “screw-you-Google” strategy makes sense, but they assume that Murdoch knows something they don’t, and that the strategy will make sense when all is revealed. In that way, the Digger is rather like Warren Buffett: his past investment record is so good that people are wary of questioning his judgment.

That was the Media Week That Was

Talk about a sign of the times! Haymarket has just announced that the current issue of Media Week will be the last to appear in print. Excerpt:

Following the restructure, MediaWeek will no longer be published in print but will continue online under the control of a full-time editor and drawing on the resources of an enlarged Brand Republic news team. MediaWeek.co.uk has monthly traffic in excess of 80,000 unique users along with two daily email bulletins that are read by more than 25,000 commercial media professionals.

The successful MediaWeek Awards, annual Media 360 conference and other marketing communications events run under the MediaWeek brand continue unaffected.

Revolution will now be distributed as a quarterly supplement with Marketing magazine, and will be backed by a new blogging initiative in 2010…

Planning to endure

From David Isenberg’s classic essay — “The Rise of the Stupid Network”

Former Shell Group Planning Head, Arie deGeus, in his master work, The Living Company (Harvard, Boston, 1997), examined thousands of companies to try to discover what it takes to adapt to changing conditions. He found that the life expectancy of the average company was only 40 years – this means that telephone company culture is in advanced old age. De Geus also studied 27 companies that had been able to survive over 100 years. He concluded that managing for longevity – to maximize the chances that a company will adapt to changes in the business climate – is very different than managing for profit. For example, in the former, employees are part of a larger, cohesive whole, a work community. In the latter, employees are ‘resources’ to be deployed or downsized as business dictates.

This is interesting in the context of the Google Book Agreement, the responsibilities of academic libraries in the area of digital preservation and curation, and the Arcadia Project. When people say to me (about digitisation) “Why not let Google [rather than, say, the University Library] do it?” I ask them to name commercial companies that have been around for 800 years.

P4P: rethinking file sharing

The thorniest problem in making decisions about internet policy is how to balance the public interest against the vested interests of companies and other incumbents of the status quo. The task is made more difficult by the fact that often there is nobody to speak for the public interest, whereas vested interests are organised, vocal and very rich. The result is usually evidence-free policymaking in which legislators give to vested interests everything they ask for, and then some.

The copyright wars provide a case-study in this skewed universe. When P2P file-sharing appeared, the record and movie industries campaigned to have the entire technology banned. (Larry Lessig used to tell a wonderful story about how he arrived at his office in Stanford Law one day and found two of the university’s network police there. They were going to disconnect him from the network because he had P2P software running on his machine. The fact that Larry used P2P as a way of distributing his own written works had apparently never occurred to them. And Stanford is a pretty smart place.)

So the idea that P2P technology might have licit as well as illicit uses was ignored by nearly everyone. And yet P2P was — and remains — a really important strategic technology, for all kinds of reasons (see, for example, Clay Shirky’s great essay about PCs being the ‘dark matter’ of the Internet). In fact, one could argue — and I have — that it’s such an important strategic technology that the narrow business interests of the content industries ought never to be allowed to stifle it. Evidence-based policy-making would therefore attempt to strike a balance between the social benefits of P2P on the one hand, and those aspects of it that happen to be inconvenient (or profit-threatening) for a particular set of industries at a particular time in history.

All of which makes this report in Technology Review particularly interesting.

‘Peer-to-peer’ (P2P) is synonymous with piracy and bandwidth hogging on the Internet. But now, Internet service providers and content companies are taking advantage of technology designed to speed the delivery of content through P2P networks. Meanwhile, standards bodies are working to codify the technology into the Internet’s basic protocols.

Rather than sending files to users from a central server, P2P file-sharing networks distribute pieces of a file among thousands of computers and help users find and download this data directly from one another. This is a highly efficient way to distribute data, resistant to the bottlenecks that can plague centralized distribution systems, but it uses large amounts of bandwidth. Even as P2P traffic slowly declines as a percentage of overall Internet traffic, it is still growing in volume. In June, Cisco estimated that P2P file-sharing networks transferred 3.3 exabytes (or 3.3 billion billion bytes) of data per month.

While a PhD student at Yale University in 2006, Haiyong Xie came up with the idea of ‘provider portal for peer-to-peer’ or P4P, as a way to ease the strain placed on networking companies by P2P. This system reduces file-trading traffic by having ISPs share specially encoded information about their networks with peer-to-peer ‘trackers’–servers that are used to locate files for downloading. Trackers can then make file sharing more efficient by preferentially connecting computers that are closer and reducing the amount of data shared between different ISPs.

During its meetings last week in Japan, the Internet Engineering Task Force, which develops Internet standards, continued work on building P4P into standard Internet protocols…
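The locality idea at the heart of P4P is easy to sketch. Here is a toy illustration of my own — not the actual P4P or ALTO protocol, and all the AS numbers, costs and function names are made up — showing how a tracker might rank candidate peers using an ISP-supplied cost map, so that a downloader is steered towards topologically nearby peers first:

```python
# Hypothetical sketch of a locality-aware tracker, in the spirit of P4P.
# The ISP publishes a cost map between network regions (here, AS numbers);
# the tracker prefers peers whose path cost from the requester is lowest.
# Everything here is illustrative, not the real protocol.

from typing import Dict, List, Tuple

# ISP-provided cost map: (from_AS, to_AS) -> relative routing cost.
# Lower cost = topologically closer (same ISP, peering link, etc.).
COST_MAP: Dict[Tuple[str, str], int] = {
    ("AS100", "AS100"): 1,   # intra-ISP traffic: cheapest
    ("AS100", "AS200"): 5,   # settlement-free peering
    ("AS100", "AS300"): 10,  # paid transit: most expensive
}

def path_cost(src_as: str, dst_as: str) -> int:
    """Look up the ISP-declared cost, defaulting to 'expensive'."""
    if src_as == dst_as:
        return 1
    return COST_MAP.get((src_as, dst_as),
                        COST_MAP.get((dst_as, src_as), 100))

def select_peers(requester_as: str,
                 candidates: List[Tuple[str, str]],
                 n: int) -> List[str]:
    """Return up to n peer addresses, cheapest network paths first."""
    ranked = sorted(candidates,
                    key=lambda peer: path_cost(requester_as, peer[1]))
    return [addr for addr, _ in ranked[:n]]

peers = [("10.0.0.2", "AS300"), ("10.0.0.3", "AS100"), ("10.0.0.4", "AS200")]
print(select_peers("AS100", peers, 2))  # → ['10.0.0.3', '10.0.0.4']
```

The point of the exercise: nothing about the swarm changes except the order in which peers are offered, yet the expensive inter-ISP traffic the article describes drops, because same-network peers are tried first.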


Freakonomics, horseshit and bullshit

If, like me, you are puzzled about why apparently sensible people are seduced by the glib half-truths peddled by Levitt and Dubner in Freakonomics and, now, Superfreakonomics, then a quick read of Elizabeth Kolbert’s New Yorker review will serve as a useful antidote.

In their chapter on climate change, the two Chicago chancers make great play with Victorian predictions about how our major cities would be buried in horseshit. You know the stuff: New York had 150,000 horses in 1880, each of them producing 22 lbs of ordure a day; people predicted that by 1930 horseshit in the city would be three stories high. Same story for London, etc. etc. But technology, in the form of electric power and the internal combustion engine, came to our rescue. So — they cheerily maintain — the same thing will happen with climate change.

Levitt and Dubner maintain, in their breezy knowall style, that the global warming threat has been exaggerated and that there is uncertainty about how exactly the earth will respond to rising levels of carbon dioxide. And, just as with horse manure, solutions are bound to present themselves. “Technological fixes are often far simpler, and therefore cheaper, than the doomsayers could have imagined”.

Although they clearly know little about technology, the two lads are keen advocates of it. Well, certain kinds of technology anyway. They have no time for boring old stuff like wind turbines, solar cells and biofuels, which are all, in their view, more trouble than they’re worth because they’re aimed at reducing CO2 emissions, which is “the wrong goal”. Cutting back is difficult and annoying. Who really wants to use less oil? What we really need, they think, are ways of “re-engineering” the planet.

Er, how, exactly? Well, how about a huge fleet of fibreglass ships equipped with machines that would increase cloud cover over the oceans? Or a vast network of tubes for sucking cold water from the depths of the ocean? (I am not making this up.) Best of all, they say, why not mimic the climatic effect of volcanic eruptions? All that is needed is a way of pumping vast quantities of sulphur dioxide into the stratosphere. This could be done by sending up an 18-mile-long hose. “For anyone who loves cheap and simple solutions, things don’t get much better”.

Eh? In her review, Elizabeth Kolbert refers to Raymond Pierrehumbert’s wonderful ‘open letter’ to Levitt that was published in the RealClimate blog. This says, in part:

By now there have been many detailed dissections of everything that is wrong with the treatment of climate in Superfreakonomics, but what has been lost amidst all that extensive discussion is how really simple it would have been to get this stuff right. The problem wasn’t necessarily that you talked to the wrong experts or talked to too few of them. The problem was that you failed to do the most elementary thinking needed to see if what they were saying (or what you thought they were saying) in fact made any sense. If you were stupid, it wouldn’t be so bad to have messed up such elementary reasoning, but I don’t by any means think you are stupid. That makes the failure to do the thinking all the more disappointing. I will take Nathan Myhrvold’s claim about solar cells, which you quoted prominently in your book, as an example.

Pierrehumbert then does a scarifying dissection of Myhrvold’s nutty arithmetic, which is interesting not just because it shows how a supposedly-clever ex-Microsoft guru can make a complete fool of himself, but also because it shows how Levitt — who, after all, makes the claim that his statistical ingenuity makes him more insightful than the rest of us — can’t do arithmetic either.

Pierrehumbert, like Levitt, holds a prestigious Chair at the University of Chicago, so connoisseurs of academic dialogue will enjoy this paragraph in the prefatory section of his ‘open letter’:

I am addressing this to you rather than your journalist-coauthor because one has become all too accustomed to tendentious screeds from media personalities (think Glenn Beck) with a reckless disregard for the truth. However, if it has come to pass that we can’t expect the William B. Ogden Distinguished Service Professor (and Clark Medalist to boot) at a top-rated department of a respected university to think clearly and honestly with numbers, we are indeed in a sad way.

Amen to that. There is really only one good term for describing much of the Levitt/Dubner oeuvre: bullshit. What’s amazing — and depressing — is how many people seem to fall for it (at least if the sales figures for their books are anything to go by). What they remind me of most is those pop psychologists who make a living from giving glib keynote presentations about optical illusions to business conferences.

Turning Fleet Street into Quality Street

This morning’s Observer column.

If you want to return to the past, it makes sense to understand it, and here we run into some puzzles. Take the notion that, in the good ol’ days of print, customers paid for content.

Shortly before writing that sentence I was handed a copy of the London Evening Standard, which contained lots of ‘content’ but was, er, free. And although this is the most conspicuous example in the UK of printed content being given away, free newspapers have been thriving for decades. The only thing that marks out the Standard from a provincial freesheet is that its content is of a higher class. So even in the newspaper world, lots of content has been free for ages…

Ye Olde Gunne Shoppe

Look what’s just appeared in Cambridge — ye olde sweete shoppe, complete with glass jars full of bullseyes etc. They serve sweets in brown paper bags, just as in William Brown’s day. Before that it was a cigar shop, and before that a gun dealer. Wonder what it’ll be next.

Did you know…?

… that a litre of air contains 100,000 billion billion protons?

Neither did I. Just thought you’d like to know this interesting but useless fact.
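For the sceptical, a back-of-the-envelope check (my own figures, not the source’s) suggests the claim is the right order of magnitude. A litre of gas at 0 °C and 1 atm holds the Loschmidt number of molecules, and dry air is roughly 78% N2, 21% O2 and 1% Ar by volume:

```python
# Rough check of the "protons in a litre of air" claim.
LOSCHMIDT_PER_LITRE = 2.687e22   # molecules per litre at 0 °C, 1 atm

# Protons per molecule: N2 = 2*7 = 14, O2 = 2*8 = 16, Ar = 18
mean_protons = 0.78 * 14 + 0.21 * 16 + 0.01 * 18   # ≈ 14.5

protons_per_litre = LOSCHMIDT_PER_LITRE * mean_protons
print(f"{protons_per_litre:.1e}")  # → 3.9e+23
```

That works out at a few hundred thousand billion billion — a little above the quoted 100,000 billion billion, but comfortably in the same ballpark.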