Planning to endure

From David Isenberg’s classic essay — “The Rise of the Stupid Network”

Former Shell Group Planning Head, Arie de Geus, in his master work, The Living Company (Harvard, Boston, 1997), examined thousands of companies to try to discover what it takes to adapt to changing conditions. He found that the life expectancy of the average company was only 40 years – this means that telephone company culture is in advanced old age. De Geus also studied 27 companies that had been able to survive over 100 years. He concluded that managing for longevity – to maximize the chances that a company will adapt to changes in the business climate – is very different than managing for profit. For example, in the former, employees are part of a larger, cohesive whole, a work community. In the latter, employees are ‘resources’ to be deployed or downsized as business dictates.

This is interesting in the context of the Google Book Agreement, the responsibilities of academic libraries in the area of digital preservation and curation, and the Arcadia Project. When people say to me (about digitisation), “Why not let Google [rather than, say, the University Library] do it?”, I ask them to name commercial companies that have been around for 800 years.

P4P: rethinking file sharing

The thorniest problem in making decisions about internet policy is how to balance the public interest against the vested interests of companies and other incumbents of the status quo. The task is made more difficult by the fact that there is often nobody to speak for the public interest, whereas vested interests are organised, vocal and very rich. The result is usually evidence-free policymaking in which legislators give vested interests everything they ask for, and then some.

The copyright wars provide a case-study in this skewed universe. When P2P file-sharing appeared, the record and movie industries campaigned to have the entire technology banned. (Larry Lessig used to tell a wonderful story about how he arrived at his office in Stanford Law one day and found two of the university’s network police there. They were going to disconnect him from the network because he had P2P software running on his machine. The fact that Larry used P2P as a way of distributing his own written works had apparently never occurred to them. And Stanford is a pretty smart place.)

So the idea that P2P technology might have licit as well as illicit uses was ignored by nearly everyone. And yet P2P was — and remains — a really important strategic technology, for all kinds of reasons (see, for example, Clay Shirky’s great essay about PCs being the ‘dark matter’ of the Internet). In fact, one could argue — and I have — that it’s such an important strategic technology that the narrow business interests of the content industries ought never to be allowed to stifle it. Evidence-based policymaking would therefore attempt to strike a balance between the social benefits of P2P on the one hand, and, on the other, those aspects of it that happen to be inconvenient (or profit-threatening) for a particular set of industries at a particular time in history.

All of which makes this report in Technology Review particularly interesting.

‘Peer-to-peer’ (P2P) is synonymous with piracy and bandwidth hogging on the Internet. But now, Internet service providers and content companies are taking advantage of technology designed to speed the delivery of content through P2P networks. Meanwhile, standards bodies are working to codify the technology into the Internet’s basic protocols.

Rather than sending files to users from a central server, P2P file-sharing networks distribute pieces of a file among thousands of computers and help users find and download this data directly from one another. This is a highly efficient way to distribute data, resistant to the bottlenecks that can plague centralized distribution systems, but it uses large amounts of bandwidth. Even as P2P traffic slowly declines as a percentage of overall Internet traffic, it is still growing in volume. In June, Cisco estimated that P2P file-sharing networks transferred 3.3 exabytes (or 3.3 billion billion bytes) of data per month.

While a PhD student at Yale University in 2006, Haiyong Xie came up with the idea of ‘provider portal for peer-to-peer’, or P4P, as a way to ease the strain placed on networking companies by P2P. This system reduces file-trading traffic by having ISPs share specially encoded information about their networks with peer-to-peer ‘trackers’ – servers that are used to locate files for downloading. Trackers can then make file sharing more efficient by preferentially connecting computers that are closer and reducing the amount of data shared between different ISPs.

During its meetings last week in Japan, the Internet Engineering Task Force, which develops Internet standards, continued work on building P4P into standard Internet protocols…
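
The locality idea at the heart of that report is simple enough to sketch in a few lines of Python. The fragment below is only an illustration of the general technique (a tracker preferring peers that an ISP has declared cheap to reach), not the real P4P interface; the cost map, the region labels and the function names are all invented for this example.

```python
# Illustrative sketch only: the data structures below are invented for this
# example; real P4P/ALTO deployments define their own formats.

from typing import Dict, List, Tuple

# Hypothetical "network map" an ISP might share with a tracker: the cost of
# moving traffic between pairs of network regions (lower is better).
COST_MAP: Dict[Tuple[str, str], int] = {
    ("isp-a", "isp-a"): 1,   # traffic that stays inside the ISP is cheapest
    ("isp-a", "isp-b"): 10,  # crossing into another ISP is expensive
    ("isp-b", "isp-b"): 1,
    ("isp-b", "isp-a"): 10,
}

def rank_peers(requester_region: str,
               candidates: List[Tuple[str, str]],
               cost_map: Dict[Tuple[str, str], int]) -> List[str]:
    """Order candidate peers by the ISP-declared cost of reaching them.

    candidates is a list of (peer_address, peer_region) pairs known to the
    tracker; region pairs missing from the map get a high default cost.
    """
    def cost(peer: Tuple[str, str]) -> int:
        return cost_map.get((requester_region, peer[1]), 100)

    return [address for address, _ in sorted(candidates, key=cost)]

if __name__ == "__main__":
    peers = [("198.51.100.7", "isp-b"),
             ("192.0.2.12", "isp-a"),
             ("203.0.113.5", "isp-b")]
    # A downloader inside isp-a is steered towards the isp-a peer first.
    print(rank_peers("isp-a", peers, COST_MAP))
```

The IETF work the report refers to is generally associated with the ALTO (Application-Layer Traffic Optimization) effort, which is essentially about standardising how an ISP publishes that kind of cost information; the ranking step above is where a tracker would consume it.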


Freakonomics, horseshit and bullshit

If, like me, you are puzzled about why apparently sensible people are seduced by the glib half-truths peddled by Levitt and Dubner in Freakonomics and, now, Superfreakonomics, then a quick read of Elizabeth Kolbert’s New Yorker review will serve as a useful antidote.

In their chapter on climate change, the two Chicago chancers make great play with Victorian predictions about how our major cities would be buried in horseshit. You know the stuff: New York had 150,000 horses in 1880, each of them producing 22 lbs of ordure a day; people predicted that by 1930 horseshit in the city would be three stories high. Same story for London, etc. etc. But technology, in the form of electric power and the internal combustion engine, came to our rescue. So – they cheerily maintain – the same thing will happen with climate change.

Levitt and Dubner maintain, in their breezy know-all style, that the global warming threat has been exaggerated and that there is uncertainty about how exactly the earth will respond to rising levels of carbon dioxide. And, just as with horse manure, solutions are bound to present themselves. “Technological fixes are often far simpler, and therefore cheaper, than the doomsayers could have imagined”.

Although they clearly know little about technology, the two lads are keen advocates of it. Well, certain kinds of technology, anyway. They have no time for boring old stuff like wind turbines, solar cells and biofuels, which are all, in their view, more trouble than they’re worth because they’re aimed at reducing CO2 emissions, which is “the wrong goal”. Cutting back is difficult and annoying. Who really wants to use less oil? What we really need, they think, is ways of “re-engineering” the planet.

Er, how, exactly? Well, how about a huge fleet of fibreglass ships equipped with machines that would increase cloud cover over the oceans? Or a vast network of tubes for sucking cold water from the depths of the ocean? (I am not making this up.) Best of all, they say, why not mimic the climatic effect of volcanic eruptions? All that is needed is a way of pumping vast quantities of sulphur dioxide into the stratosphere. This could be done by sending up an 18-mile-long hose. “For anyone who loves cheap and simple solutions, things don’t get much better”.

Eh? In her review, Elizabeth Kolbert refers to Raymond Pierrehumbert’s wonderful ‘open letter’ to Levitt, published on the RealClimate blog. It says, in part:

By now there have been many detailed dissections of everything that is wrong with the treatment of climate in Superfreakonomics, but what has been lost amidst all that extensive discussion is how really simple it would have been to get this stuff right. The problem wasn’t necessarily that you talked to the wrong experts or talked to too few of them. The problem was that you failed to do the most elementary thinking needed to see if what they were saying (or what you thought they were saying) in fact made any sense. If you were stupid, it wouldn’t be so bad to have messed up such elementary reasoning, but I don’t by any means think you are stupid. That makes the failure to do the thinking all the more disappointing. I will take Nathan Myhrvold’s claim about solar cells, which you quoted prominently in your book, as an example.

Pierrehumbert then does a scarifying dissection of Myhrvold’s nutty arithmetic, which is interesting not just because it shows how a supposedly clever ex-Microsoft guru can make a complete fool of himself, but also because it shows how Levitt — who, after all, claims that his statistical ingenuity makes him more insightful than the rest of us — can’t do arithmetic either.

Pierrehumbert, like Levitt, holds a prestigious chair at the University of Chicago, so connoisseurs of academic dialogue will enjoy this paragraph from the prefatory section of his ‘open letter’:

I am addressing this to you rather than your journalist-coauthor because one has become all too accustomed to tendentious screeds from media personalities (think Glenn Beck) with a reckless disregard for the truth. However, if it has come to pass that we can’t expect the William B. Ogden Distinguished Service Professor (and Clark Medalist to boot) at a top-rated department of a respected university to think clearly and honestly with numbers, we are indeed in a sad way.

Amen to that. There is really only one good term for describing much of the Levitt/Dubner oeuvre: bullshit. What’s amazing — and depressing — is how many people seem to fall for it (at least if the sales figures for their books are anything to go by). What they remind me of most is those pop psychologists who make a living from giving glib keynote presentations about optical illusions to business conferences.

Turning Fleet Street into Quality Street

This morning’s Observer column.

If you want to return to the past, it makes sense to understand it, and here we run into some puzzles. Take the notion that, in the good ol’ days of print, customers paid for content.

Shortly before writing that sentence I was handed a copy of the London Evening Standard, which contained lots of ‘content’ but was, er, free. And although this is the most conspicuous example in the UK of printed content being given away, free newspapers have been thriving for decades. The only thing that marks out the Standard from a provincial freesheet is that its content is of a higher class. So even in the newspaper world, lots of content has been free for ages…

Ye Olde Gunne Shoppe

Look what’s just appeared in Cambridge — ye olde sweet shoppe, complete with glass jars full of bullseyes, etc. They serve sweets in brown paper bags, just as in William Brown’s day. Before that it was a cigar shop, and before that a gun dealer. Wonder what it’ll be next.

Did you know…?

… that a litre of air contains 100,000 billion billion protons?

Neither did I. Just thought you’d like to know this interesting but useless fact.

Microsoft to fund ACAP development?

The search by newspaper publishers for DRM-for-papers continues. ACAP (Automated Content Access Protocol) is currently their Great White Hope. This report from TechCrunch suggests that Microsoft might be getting in on the act.

Our sources say Microsoft has pledged to help fund research and engineering into ACAP to the tune of about £100,000. This is the more granular version of the robots.txt protocol which has been proposed by publishers to enable them to have a more sophisticated response to search engine crawlers. However, we understand that Microsoft won’t be involved in developing the protocol, just in providing the funding.

For years, Google has characterised the debate about search engines as “you are either in our index or not in it; there is no half-way house”. But the Automated Content Access Protocol (ACAP) proposes a far more layered response, allowing full access or access to just some of a site’s content. Unsurprisingly, it’s been developed by a consortium of the World Association of Newspapers, the European Publishers Council and the International Publishers Association. Proposed in 2006, it has been criticised as being biased towards publishers rather than search engines, specifically Google, and few non-ACAP members have adopted the protocol. Some call it the “DRM of newspaper websites”. That said, some 1,600 traditional publishers have signed up to use ACAP.

But if Bing starts to play ball with ACAP, this could change the game. Suddenly newspapers will have a stick, and a heavyweight enforcer in the shape of Bing, with which to beat Google. Google would have a choice – either recognise the ACAP protocol in order to get some level of access to newspaper sites, or just ignore it…
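
To make that ‘layered’ idea concrete, here is a toy sketch in Python of the difference between a robots.txt-style binary decision and the kind of graded permissions ACAP is meant to express. The policy fields and function names are invented for this illustration; they are not the real ACAP directives.

```python
# Toy illustration of "binary" vs "layered" crawler permissions.
# The policy fields below are invented for this sketch; the real ACAP
# specification defines its own directives as extensions to robots.txt.

from dataclasses import dataclass

@dataclass
class LayeredPolicy:
    may_crawl: bool = True      # may the engine fetch the page at all?
    may_index: bool = True      # may it appear in search results?
    may_snippet: bool = False   # may a text extract be shown?
    max_cache_days: int = 0     # how long may a cached copy be kept?

# robots.txt-style answer: in the index or not, nothing in between.
def robots_style(allow: bool) -> str:
    return "full access" if allow else "no access"

# ACAP-style answer: a publisher can allow crawling and indexing while
# withholding snippets and limiting how long a copy may be cached.
def acap_style(policy: LayeredPolicy) -> str:
    parts = [
        "crawl" if policy.may_crawl else "no crawl",
        "index" if policy.may_index else "no index",
        "snippets" if policy.may_snippet else "no snippets",
        f"cache {policy.max_cache_days} days",
    ]
    return ", ".join(parts)

if __name__ == "__main__":
    print(robots_style(True))                                # full access
    print(acap_style(LayeredPolicy(max_cache_days=7)))       # crawl, index, no snippets, cache 7 days
```

The point of the sketch is simply that a publisher could, say, allow a paid-for article to be crawled and indexed while withholding snippets and caching, which is exactly the sort of distinction a binary allow/disallow protocol cannot make.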