Davos, 1472

Just caught up with this lovely dispatch from Davos by Jeff Jarvis.

I began this trip to Europe with my pilgrimage to the Gutenberg Museum in Mainz (blogged earlier). I recall John Naughton’s Observer column in which he asked us to imagine that we are pollsters in Mainz in 1472 asking whether we thought this invention of Gutenberg’s would disrupt the Catholic church, fuel the Reformation, spark the Scientific Revolution, change our view of education and thus childhood, and change our view of societies and nations and cultures. Pshaw, they must have said.

Ask those questions today. How likely do you think it is that every major institution of society–every industry, all of education, all of government–will be disrupted; that we will rethink our idea of nations and cultures; that we will reimagine education; that we will again alter even economics? Pshaw?

Welcome to Davos 1472.

John McCarthy RIP

John McCarthy has died. Good obit by Jack Schofield in the Guardian tonight.

In 1955, the computer scientist John McCarthy, who has died aged 84, coined the term “artificial intelligence”, or AI. His pioneering work in AI – which he defined as "the science and engineering of making intelligent machines" – and robotics included the development of the programming language Lisp in 1958. This was the second such high-level language, after Fortran, and was based on the idea of computing using symbolic expressions rather than numbers.
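(To see what “computing using symbolic expressions rather than numbers” means in practice, here is a minimal illustrative sketch – in Python rather than Lisp, and not McCarthy’s own code – in which an expression is held as nested lists, i.e. as data that a tiny evaluator can walk. The operator names and the example environment are invented for the illustration.)

```python
# Illustrative only: a Lisp-style symbolic expression represented as nested
# Python lists, plus a tiny evaluator that walks it.

def evaluate(expr, env):
    """Evaluate an expression held as nested lists, e.g. ['*', 'x', ['+', 'x', 3]]."""
    if isinstance(expr, str):           # a symbol: look up its current value
        return env[expr]
    if not isinstance(expr, list):      # a literal number
        return expr
    op, *args = expr                    # an application, e.g. ['+', 'x', 3]
    values = [evaluate(a, env) for a in args]
    if op == '+':
        return sum(values)
    if op == '*':
        product = 1
        for v in values:
            product *= v
        return product
    raise ValueError(f"unknown operator: {op}")

expression = ['*', 'x', ['+', 'x', 3]]   # the expression (* x (+ x 3)) as data
print(evaluate(expression, {'x': 2}))    # prints 10
```

The point, which Lisp made first, is that programs and the data they operate on share the same representation, so programs can read, build and transform other programs.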

McCarthy was also the first to propose a “time-sharing” model of computing. In 1961, he suggested that if his approach were adopted, “computing may some day be organised as a public utility, just as the telephone system is a public utility,” and that that utility could become the basis of a significant new industry. This is the way that ‘cloud computing’ is being sold today.

However, when obliged to choose between the time-sharing work at the Massachusetts Institute of Technology (MIT) and AI, he chose AI. He said: “The ultimate effort is to make computer programs that can solve problems and achieve goals in the world as well as humans. However, many people involved in particular research areas are much less ambitious.”

Steve Jobs: commented

The Observer asked me to read Steve Jobs’s 2005 Stanford commencement address and add my comments to the text.

The commencement address is one of the more venerable – and respectable – traditions of American academia, especially at elite universities such as Stanford and Harvard. Because Steve Jobs died at such a relatively young age (56), this is destined to be regarded as a classic. But it faces stiff competition – as the list maintained by humanity.org testifies. Jobs’s address is up against Barack Obama’s lecture to Wesleyan University in 2008, Elie Wiesel’s talk at DePaul University in 1997, Václav Havel’s lecture on “Civilisation’s Thin Veneer” at Harvard in 1995 and George Marshall’s address to the same university in 1947 – to list just four. But Jobs’s address has an unbearable poignancy just now, especially for those who knew him well. John Gruber, the blogger and technology commentator, saw him fairly recently and observed: “He looked old. Not old in a way that could be measured in years or even decades, but impossibly old. Not tired, but weary; not ill or unwell, but rather, somehow, ancient. But not his eyes. His eyes were young and bright, their weapons-grade intensity intact.” The address also reveals something of Jobs’s humanity, something that tended to get lost in the afterglow of Apple’s astonishing corporate resurgence.

LATER: In my comments I related one of my favourite stories about Jobs — the one where he drops the first iPod prototype in a fish-tank to demonstrate that it’s too big. Frank Stajano emailed to say that it may be apocryphal — he’d heard it many years ago about Akio Morita and Sony’s Walkman. In trying to check I found this nice piece by D.B. Grady, who also tells the story but cautions “I have no way of knowing if it is true, so take it for what it’s worth. I think it nicely captures the man who changed the world four times over.”

Agreed. As the Italians say, if it ain’t true then it ought to be. (Hmmm… on reflection, I can’t find a source for that adage either. Apologies if I’ve been rude to the citizens of that lovely country.)

Remembering Maurice Wilkes

Today, the Cambridge Computer Lab will be honouring Maurice Wilkes with an afternoon of talks and reminiscences. I’m looking forward to it. He was such an amazing, practical man.

Here’s the programme:

Andy Hopper: Introduction
Martin Campbell-Kelly: Beginnings
David Barron: Pioneering
David Hartley: Service
Andrew Herbert: Research
Don Gaubatz: America
Andy Harter: Industry
Andy Hopper: Back to the Lab
Discussion

Journal of the cyber-plague years

My piece in today’s Observer.

In 1971, Bob Thomas, an engineer working for Bolt, Beranek and Newman, the Boston company that had the contract to build the Arpanet, the precursor of the internet, released a virus called the "creeper" on to the network. It was an experimental, self-replicating program that infected DEC PDP-10 minicomputers. It did no actual harm and merely displayed a cheeky message: "I'm the creeper, catch me if you can!" Someone else wrote a program to detect and delete it, called – inevitably – the "reaper".

Although nobody could have known it 40 years ago, it was the start of something big, something that would one day threaten to undermine, if not overwhelm, the networked world…

Freedom from the Cloud?

This morning’s Observer column.

“The novelties of one generation,” said George Bernard Shaw, “are only the resuscitated fashions of the generation before last.” An excellent illustration is provided by the computing industry, which – despite its high-tech exterior – is as prone to fashion swings as the next business. Witness the current excitement about the news that, on 2 March, Apple is due to announce details of the new iPad, the latest incarnation of what the Register disrespectfully calls an “uber-popular fondleslab”. Yves Saint Laurent would have killed for that kind of excitement about a forthcoming collection.

To put the hysteria into some kind of context, however, consider how we got into this mess…

Keeping a record

This morning’s Observer column.

A few months ago, I went to an intriguing talk given by Lorcan Dempsey, who is a leading authority on the role of libraries in the digital world. One of the slides in his presentation really made me sit up. The context was an account of how different academic libraries are going about the archiving of digital material. The slide in question focused on Emory University, a wealthy, private research university in Atlanta, Georgia. Like many such institutions, it has been buying up the papers of well-known writers and already has a fine collection of Irish scribblers in its archives. But it also has the papers of Salman Rushdie and this was the subject of the slide that startled me.

Why? Because it showed that Emory’s Rushdie archive included not only the writer’s papers, but also his old computers and hard drives. And there, on the slide, was the symbol for an old Apple Macintosh computer and in its directory listing was a folder entitled, simply, “My Money”. And at that moment, if you will forgive the pun, the penny dropped…

There are also some good (critical) comments by readers.

How are the mighty fallen

This morning’s Observer column.

You have to feel sorry for Sony sometimes. I mean to say, there it was on Wednesday in Berlin, at the IFA consumer electronics show, launching a new music and video download service called Qriocity (it’s like “curiosity”, only it couldn’t get the domain name, I suppose) – and what happens? Steve Jobs goes on stage in San Francisco and announces that Apple is having another go at the TV download business.

And guess who gets all the media coverage?

How are the mighty fallen. I remember a time when Sony dominated the gadgetry business, when it was a synonym for elegant design and advanced functionality, when Walkmans ruled the world. It had shops in upmarket malls where young males came to drool. And now? Ask a teenager about Sony and s/he will reply: “Aren’t they the outfit that makes flat-screen TVs and DVD players and other stuff for adults?”

Why the YouTube-Viacom ruling is good news

From The Atlantic Wire.

For three years, media and legal observers have been anticipating the outcome of Viacom's $1 billion lawsuit against Google's video site, YouTube. Viacom, which owns MTV, Paramount Pictures and programs such as South Park and The Daily Show, alleged that YouTube willingly exploited its copyrighted content. Google, on the other hand, maintained that the Digital Millennium Copyright Act relieves it from checking user-generated material before it's posted.

On Wednesday, U.S. District Judge Louis Stanton ruled in favor of Google, saying that when YouTube received "specific notice that a particular item infringed a copyright, they swiftly removed it." While Viacom promises to appeal the ruling, its prospects don't look promising. Web enthusiasts and legal experts, meanwhile, are musing about what this means for the Web at large.

At the moment, the main views are these:

  • The judgment “reinforces the pro-sharing ethos of the Web”
  • It “ensures YouTube’s long-term survival” by easing Google’s caution about where it places ads on the service
  • It “loosens the rules on content-hosting sites”. (Er, except in Italy, perhaps)
  • It represents a major setback for media companies

All true. The big story is that while Viacom may be big, Google is bigger. There’s a new 800-lb gorilla on the block.

The Facebook question

This morning’s Observer column:

Is Facebook now “too big to fail”? I don’t mean in the sense that the taxpayer would have to pick up the pieces if it went under, but in the sense that the social networking service has achieved a position of such dominance in the online ecosystem that its eclipse is unthinkable. Is Facebook, in other words, the next Microsoft or Google?

The question is prompted by a couple of milestones recently passed by Facebook. The first is that it now has more than 400 million members. The second is industry gossip predicting that its revenues for 2010 will exceed a billion dollars. Other straws in the wind are estimates of the size of the “Facebook economy” – ie the ecosystem of applications, services and products that has evolved around the service; and the moral panics it now triggers in the mainstream media – a sure sign that they fear a competitor…