Lifetime achievement

My friend Quentin has — deservedly — been given a Lifetime Achievement Award (called a Lovie after Ada Lovelace) for inventing the webcam. Here’s the presentation speech by Sophie Wilson (who designed the instruction set for the ARM processor and so also helped to shape our networked world):

And here is Quentin’s acceptance speech. He must have been moved by the award, because he briefly blanks as he’s getting into his stride. Normally, he’s the most fluent speaker I know. But note his graceful and witty recovery, once he’s found his notes.

This is IMHO long-overdue recognition for a technology pioneer.

The education of Mark Zuckerberg

This morning’s Observer column:

One of my favourite books is The Education of Henry Adams (published in 1918). It’s an extended meditation, written in old age by a scion of one of Boston’s elite families, on how the world had changed in his lifetime, and how his formal education had not prepared him for the events through which he had lived. This education had been grounded in the classics, history and literature, and had rendered him incapable, he said, of dealing with the impact of science and technology.

Re-reading Adams recently left me with the thought that there is now an opening for a similar book, The Education of Mark Zuckerberg. It would have an analogous theme, namely how the hero’s education rendered him incapable of understanding the world into which he was born. For although he was supposed to be majoring in psychology at Harvard, the young Zuckerberg mostly took computer science classes until he started Facebook and dropped out. And it turns out that this half-baked education has left him bewildered and rudderless in a culturally complex and politically polarised world…

Read on

Sixty years on

Today is the 60th anniversary of the day that the Soviet Union announced that it had launched a satellite — Sputnik — into earth orbit. The conventional historical narrative (as recounted, for example, in my book and in Katie Hafner and Matthew Lyon’s history) is that this event really alarmed the American public, not least because it suggested that the Soviet Union might be superior to the US in important fields like rocketry and ballistic missiles. The narrative goes on to recount that the shock resulted in a major shake-up in the US government which — among other things — led to the setting up of ARPA — the Advanced Research Projects Agency — in the Pentagon. This was the organisation which funded the development of ARPANET, the packet-switched network that was the precursor of the Internet.

The narrative is accurate in that Sputnik clearly provided the impetus for a drive to produce a massive increase in US capability in science, aerospace technology and computing. But the declassification of a trove of hitherto-secret CIA documents (for example, this one) to mark the anniversary suggests that the CIA was pretty well-informed about Soviet capabilities and intentions and that the launch of a satellite was expected, though nobody could guess at the timing. So President Eisenhower and the US government were not as shocked as the public, and they clearly worked on the principle that one should never waste a good crisis.

Lessons of history?

This morning’s Observer column:

The abiding problem with writing about digital technology is how to avoid what the sociologist Michael Mann calls “the sociology of the last five minutes”. There’s something about the technology that reduces our collective attention span to that of newts. This is how we wind up obsessing over the next iPhone, the travails of Uber, Facebook being weaponised by Russia, Samsung’s new non-combustible smartphone and so on. It’s mostly a breathless search for what Michael Lewis once called “the new new thing”.

We have become mesmerised by digital technology and by the companies that control and exploit it. Accordingly, we find it genuinely difficult to judge whether a particular development is really something new and unprecedented or just a contemporary variant on something that is much older…

Read on

Citizen Jane

Last night we went to see Matt Tyrnauer’s documentary about the legendary battle between the journalist, author and activist Jane Jacobs and the famous (or infamous, depending on your point of view) New York City planner, Robert Moses.

In the scale of his ambition to impose modernist order on the chaos of the city, Moses’s only historical peer is Baron Haussmann, the planner appointed by Napoléon III to transform Paris from the warren of narrow cobbled streets in which radicals could foment revolution, into the city of wide boulevards we know today. (Among the advantages of said boulevards was that they afforded military units a clear line of fire in the event of trouble.)

Behind the film lie two great books, only one of which is mentioned in the script. The first — and obvious one — is Jacobs’s The Death and Life of Great American Cities, a scarifying attack on the modernist ideology of Le Corbusier and his disciples and a brilliant defence of the organic complexity of urban living, with all its chaos, untidiness and diversity. The other book is Robert Caro’s magnificent biography of Moses — The Power Broker: Robert Moses and the Fall of New York. This is a warts-and-all portrayal of a bureaucratic monster, but it also explains how Moses, like many a monster before him, started out as an earnest, idealistic, principled reformer — very much a Progressive.

He was educated at Yale, Oxford (Wadham) and Columbia, where he did a PhD. At Oxford he acquired an admiration for the British civil service, with its political neutrality and meritocratic ethos — but also a rigid distinction between an ‘Administrative’ class (those engaged in policy-making) and the bureaucratic drones (whose role was merely to implement policy). When he returned to the US, Moses found that the political machines of American cities had zero interest in meritocracy, and eventually realised that his path to success lay in hitching his wagon to a machine politician — New York governor Al Smith, a brilliant political operator with an eighth-grade education. Moses started by building public parks, but quickly acquired such a wide range of powers and authority that he became, effectively, the ‘Master Builder’ who remodelled the city of New York.

Because Tyrnauer’s film focusses more on Jacobs, Moses’s Progressive origins are understandably ignored and what comes over is only his monstrous, bullying side — exemplified in his contemptuous aside that “Those who can, build; those who can’t, criticize”. In that sense, Moses is very like that other monster, Lyndon Johnson, to whose biography Caro has devoted most of his career.

My interest in Moses was first sparked by something written by a friend, Larry Lessig. In an essay that was a prelude to Code: And Other Laws of Cyberspace, his first major book, Lessig discussed the role of architecture in regulating behaviour and wrote:

“Robert Moses built highway bridges along the roads to the beaches in Long Island so that busses could not pass under the bridges, thereby assuring that only those with cars (mainly white people) would use certain public beaches, and that those without cars (largely African Americans) would be driven to use other beaches, so that social relations would be properly regulated.”

In later years, this assertion — about the effectiveness of the low underpasses as racial filters — has been challenged, but Lessig’s central proposition (that architecture constrains and determines behaviour) is amply demonstrated in the film. The squalid, slum-like conditions that Moses sought to demolish did indeed enable and determine behaviour: it was the rich, organic, chaotic, vibrant life that Jacobs observed and celebrated. And when those slums were replaced by the ‘projects’ beloved of Moses, Le Corbusier et al — high-rise apartment blocks set in rational configurations — they also determined behaviour, mostly by eliminating vibrancy and life and creating urban wastelands of crime, social exclusion and desperation.

A second reflection sparked by the film is its evocation of the way the automobile destroyed cities. Moses believed that what was good for General Motors was good for America and he was determined to make New York automobile-friendly by building expressways that gouged their way through neighbourhoods and rendered them uninhabitable. His first defeat at Jacobs’s hands came when his plan to extend Fifth Avenue by running a road through Washington Square was overturned by inspired campaigning by women he derided as mere “housewives”. But in the end the defeat that really broke him was the failure of his plan to run a motorway through Manhattan.

A third reflection is, in a way, the inverse of that. The film provides a vivid illustration of the extent to which the automobile changed the shape not just of New York but of all American cities (and also positions in intercourse, as some wag once put it; American moralists in the 1920s used to fulminate that cars were “brothels on wheels”). But those vehicles were all driven by humans. If autonomous cars become the norm in urban areas, then the changes they will bring in due course could be equally revolutionary. After all, if mobility is what we really require, then car ownership will decline — as will demand for on- and off-street parking. Pavements (or sidewalks, as they call them in the US) can be made wider, enabling more of the street life that Jacobs so prized.

The final reflection on the film is gloomier. Towards the end, the camera began to pan, zoom and linger on the monstrous cities that are now being built in China, Asia and India — the areas of the world that are going to be more dominant in coming centuries. And as one looks at the resulting forests of high-rise, soulless towers it looks as though the ghosts of Le Corbusier and Moses have come back to earth and are gleefully setting about creating the dysfunctional slums of the future. Which reminds me of a passage in a book I’m currently reading — Edward Luce’s The Retreat of Western Liberalism. “History does not end,” Luce writes. “It is a timeless repetition of human folly and correction.”

Amen to that.

Bob Taylor RIP

Bob Taylor, the man who funded the Arpanet (the military precursor of the Internet), has died at the age of 85. He also funded much of Doug Engelbart’s ‘augmentation’ research at SRI. After Arpanet was up and running, Bob left to found the Computer Science Lab at Xerox PARC. His ambition for CSL, he said, was to hire the 50 best computer scientists and engineers in the US and let them do their stuff. He didn’t get 50, but he did get some real stars — including Bob Metcalfe, Chuck Thacker, David Boggs, Butler Lampson, Alan Kay and Charles Simonyi — who, in three magical years, invented much of the technology we use today: bitmapped windowing interfaces, Ethernet and the laser printer, networked workstations, collaborative working, to name just a few. They were, in the words of one chronicler, “dealers of lightning”. Bob’s management style was inspired. His philosophy was to hire the best and give them their heads. His job, he told his geeks, was to protect them from The Management. And he was as good as his word.

Xerox, needless to say, fumbled the future the company could have owned. Steve Jobs saw what Bob’s team were doing and realised its significance. He went back to Apple and started the Macintosh project to bring it to the masses.

Bob and I had a friend in common — Roger Needham, the great computer scientist, who worked with Bob after he had left PARC to run the DEC Systems Research Center in California. When Roger was diagnosed with terminal cancer his Cambridge colleagues organised a symposium and a festschrift in his honour. Bob and I co-wrote one of the essays in that collection. Its title — “Zen and the Art of Research Management” — captured both Bob’s and Roger’s management style.

The NYT obit is properly respectful of Bob’s great contribution to our world. One of the comments below it comes from Alan Kay who was one of the CSL stars. He writes:

Bob fully embraced the deeply romantic “ARPA Dream” of personal computing and pervasive networking. His true genius was in being able to “lead by getting others to lead and cooperate” via total commitment, enormous confidence in his highly selected researchers expressed in all directions, impish humor, and tenacious protection of the research. He was certainly the greatest “research manager” in his field, and because of this had the largest influence in a time of the greatest funding for computing research. It is impossible to overpraise his impact and to describe just how he used his considerable personality to catalyze actions.

The key idea was to have a great vision yet not try to drive it from the funders on down, but instead “fund people not projects” by getting the best scientists in the world to “find the problems to solve” that they thought would help realize the vision. An important part of how this funding was carried out was not just to find the best scientists, but to create them. Many of the most important researchers at Xerox PARC were young researchers in ARPA funded projects. Bob was one of the creators of this process and carried it out at ARPA, Xerox PARC, and DEC. He was one of those unique people who was a central factor in a deep revolution of ideas.

Yep: unique is the word. May he rest in peace.

Image courtesy of Palo Alto Research Center

On the sociology of the last five minutes…

… or what Adam Gopnik calls “presentism” in his review article on books by Pankaj Mishra, Joel Mokyr and Yuval Noah Harari.

Of all the prejudices of pundits, presentism is the strongest. It is the assumption that what is happening now is going to keep on happening, without anything happening to stop it. If the West has broken down the Berlin Wall and McDonald’s opens in St. Petersburg, then history is over and Thomas Friedman is content. If, by a margin so small that in a voice vote you would have no idea who won, Brexit happens; or if, by a trick of an antique electoral system designed to give country people more power than city people, a Donald Trump is elected, then pluralist constitutional democracy is finished. The liberal millennium was upon us as the year 2000 dawned; fifteen years later, the autocratic apocalypse is at hand. Thomas Friedman is concerned.

You would think that people who think for a living would pause and reflect that whatever is happening usually does stop happening, and something else happens in its place; a baby who is crying now will stop crying sooner or later. Exhaustion, or a change of mood, or a passing sound, or a bright light, something, always happens next. But for the parents the wait can feel the same as forever, and for many pundits, too, now is the only time worth knowing, for now is when the baby is crying and now is when they’re selling your books.

And so the death-of-liberalism tomes and eulogies are having their day, with the publishers who bet on apocalypse rubbing their hands with pleasure and the ones who gambled on more of the same weeping like, well, babies.

Characteristically good piece by one of the New Yorker‘s best writers. He’s not overly impressed by Mishra or Harari, and prefers Mokyr’s less grandiose interest in people who make and do things as the real movers of history.

The real word-processor


Matthew Kirschenbaum has written a fascinating book — Track Changes: a Literary History of Word Processing — and in following a link to interviews with him I came on this lovely image, which made me laugh out loud.

Also: word processor seemed such a strange term for a tool designed (presumably) to aid composition. I always thought of it alongside the food processor that became a staple in so many modern kitchens (though never in ours), the whole point of which was to reduce everything to an undifferentiated pulp. (Or so I thought, anyway, never having used one.)

Kirschenbaum clears up the mystery: it seems that the ‘processor’ term came from IBM, who were marketing an office document-processing system which envisaged a process which took the document from initial outline to finished printed version to filed-away copy.

Remember, remember the eleventh of September


For personal reasons I have vivid memories of 9/11, so today is always a sombre day in my calendar. But I was suddenly reminded this morning of how some of my Internet buddies rose magnificently to the challenge of the day. This is Dave Winer’s blog, for example. And here are Jeff Jarvis’s audio reports, as unforgettable now as they were then.

And then this memoir by the WSJ‘s John Bussey.

The long history of ‘cyber’

My Observer piece on Thomas Rid’s alternative history of computing, The Rise of the Machines: the Lost History of Cybernetics:

Where did the “cyber” in “cyberspace” come from? Most people, when asked, will probably credit William Gibson, who famously introduced the term in his celebrated 1984 novel, Neuromancer. It came to him while watching some kids play early video games. Searching for a name for the virtual space in which they seemed immersed, he wrote “cyberspace” in his notepad. “As I stared at it in red Sharpie on a yellow legal pad,” he later recalled, “my whole delight was that it meant absolutely nothing.”

How wrong can you be? Cyberspace turned out to be the space that somehow morphed into the networked world we now inhabit, and which might ultimately prove our undoing by making us totally dependent on a system that is both unfathomably complex and fundamentally insecure. But the cyber- prefix actually goes back a long way before Gibson – to the late 1940s and Norbert Wiener’s book, Cybernetics, Or Control and Communication in the Animal and the Machine, which was published in 1948.

Cybernetics was the term Wiener, an MIT mathematician and polymath, coined for the scientific study of feedback control and communication in animals and machines. As a “transdiscipline” that cuts across traditional fields such as physics, chemistry and biology, cybernetics had a brief and largely unsuccessful existence: few of the world’s universities now have departments of cybernetics. But as Thomas Rid’s absorbing new book, The Rise of the Machines: The Lost History of Cybernetics shows, it has had a long afterglow as a source of mythic inspiration that endures to the present day…

Read on