Archive for the 'History' Category

Bletchley Park and the erosion of the freedoms it was set up to defend

Sunday, June 22nd, 2014

This morning’s Observer column.

It’s terrific that Bletchley Park has not only been rescued from the decay into which the site had fallen, but brilliantly restored, thanks to funding from the National Lottery (£5m), Google (which donated £500,000) and the internet security firm McAfee. I’ve been to the Park many times and for years going there was a melancholy experience, as one saw the depredations of time and weather inexorably outpacing the valiant efforts of the squads of volunteers who were trying to keep the place going.

Even at its lowest ebb, Bletchley had a magical aura. One felt something akin to what Abraham Lincoln tried to express when he visited Gettysburg: that something awe-inspiring had transpired here and that it should never be forgotten. The code-breaking that Bletchley Park achieved was an astonishing demonstration of the power of collective intelligence and determination in a quest to defeat the gravest threat that this country had ever faced.

When I was last there, the restoration was almost complete, and I was given a tour on non-disclosure terms, so I had already seen what the duchess saw on Wednesday. The most striking bit is the restoration of Hut 6 exactly as it was, complete with all the accoutrements of the tweedy, pipe-smoking geniuses who worked in it, right down to the ancient typewriters, bound notebooks and the Yard-O-Led mechanical pencil that one of them possessed.

Hut 6 is significant because that was where Gordon Welchman worked…

Read on

Coase and the Penguin

Sunday, September 8th, 2013

This morning’s Observer column remembering Ronald Coase.

When the news broke last week that Ronald Coase, the economist and Nobel laureate, had died at the age of 102, what came immediately to mind was Keynes’s observation that “practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist”. Most of the people running the great internet companies of today have probably never heard of Coase, but, in a way, they are all his slaves, because way back in 1932 he cracked the problem of explaining how firms are structured, and how and why they change as circumstances change. Coase might have been ancient, but he was certainly not defunct…

How Microsoft spent a decade asleep at the wheel

Sunday, July 21st, 2013

This morning’s Observer column.

Coincidentally, in that same year, Gates stepped down from his position as CEO and began the slow process of disengaging from the company. What he failed to notice was that the folks he left in charge, chief among them one Steve Ballmer, were prone to sleeping at the wheel.

How else can one explain the way they failed to notice the importance of (successively) internet search, online advertising, smartphones and tablets until the threat was upon them? Or were they just lulled into somnolence by the sound of the till ringing up continuing sales from the old staples of Windows and Office?

But suddenly, that soothing tinkle has become less comforting. PC sales are starting to decline sharply, which means that Microsoft’s comfort zone is likewise set to shrink. Last week, we had the first indication that Ballmer & Co have woken up. In a 2,700-word internal memo rich in management-speak drivel, Ballmer announced a “far-reaching realignment of the company that will enable us to innovate with greater speed, efficiency and capability in a fast-changing world”.

Sic transit gloria mundi

Sunday, January 27th, 2013

This morning’s Observer column.

Nothing lasts forever: if history has any lesson for us, it is this. It’s a thought that comes from rereading Paul Kennedy’s magisterial tome, The Rise and Fall of the Great Powers, in which he shows that none of the great nation-states or empires of history – Rome; imperial Spain in 1600; France in either its Bourbon or Bonapartist manifestations; the Dutch republic in 1700; Britain in its imperial glory – succeeded in maintaining its global ascendancy for long.

What has this got to do with technology? Well, it provides us with a useful way of thinking about two of the tech world’s great powers.

The Idea Factory

Sunday, February 26th, 2012

Nobody who writes about the history of computing can ignore Bell Labs, that astonishing institution in New Jersey that created so much of the technology we nowadays take for granted. An interesting essay in the NYT has brought it back into focus for me because I’m fascinated by the problem of how to manage creative people in such a way that their creativity is liberated, not stifled, by the organisation that funds them. (Many years ago I co-authored a paper on the subject with Bob Taylor — the guy who funded the ARPAnet and later ran the Computer Science Laboratory at Xerox PARC during the time when its researchers invented most of the computing technology we use today. The title of our essay was “Zen and the Art of Research Management” and it was published in December 2003 in a volume of essays dedicated to Roger Needham.)

The NYT article is by Jon Gertner, who is the author of a forthcoming book on Bell Labs entitled The Idea Factory: Bell Labs and the Great Age of American Innovation. It’s on my wish list.

At Bell Labs, the man most responsible for the culture of creativity was Mervin Kelly. Probably Mr. Kelly’s name does not ring a bell. Born in rural Missouri to a working-class family and then educated as a physicist at the University of Chicago, he went on to join the research corps at AT&T. Between 1925 and 1959, Mr. Kelly was employed at Bell Labs, rising from researcher to chairman of the board. In 1950, he traveled around Europe, delivering a presentation that explained to audiences how his laboratory worked.

His fundamental belief was that an “institute of creative technology” like his own needed a “critical mass” of talented people to foster a busy exchange of ideas. But innovation required much more than that. Mr. Kelly was convinced that physical proximity was everything; phone calls alone wouldn’t do. Quite intentionally, Bell Labs housed thinkers and doers under one roof. Purposefully mixed together on the transistor project were physicists, metallurgists and electrical engineers; side by side were specialists in theory, experimentation and manufacturing. Like an able concert hall conductor, he sought a harmony, and sometimes a tension, between scientific disciplines; between researchers and developers; and between soloists and groups.

One element of his approach was architectural. He personally helped design a building in Murray Hill, N.J., opened in 1941, where everyone would interact with one another. Some of the hallways in the building were designed to be so long that to look down their length was to see the end disappear at a vanishing point. Traveling the hall’s length without encountering a number of acquaintances, problems, diversions and ideas was almost impossible. A physicist on his way to lunch in the cafeteria was like a magnet rolling past iron filings…

Present at the Creation

Sunday, February 26th, 2012

George Dyson has written a fascinating book about the building of the first stored-program computer by John von Neumann and his colleagues at the Institute for Advanced Study in Princeton. After I’d finished the book I had an email exchange with him, an edited version of which appears in this morning’s Observer.

Once upon a time, a “computer” was a human being, usually female, who did calculations set for her by men in suits. Then, in the 1940s, something happened: computers became machines based on electronics. The switch had awesome implications; in the end, it spawned a technology that became inextricably woven into the fabric of late 20th- and early 21st-century life and is now indispensable. If the billions of (mostly unseen) computers that now run our industrialised support systems were suddenly to stop working, then our societies would very rapidly grind to a halt.

So the question of where this Promethean force sprang from is an intriguing one, as interesting in its way as the origins of the industrial revolution…

Photograph shows the book on sale in Heffers bookshop in Cambridge yesterday.

Davos, 1472

Thursday, February 9th, 2012

Just caught up with this lovely dispatch from Davos by Jeff Jarvis.

I began this trip to Europe with my pilgrimage to the Gutenberg Museum in Mainz (blogged earlier). I recall John Naughton’s Observer column in which he asked us to imagine that we are pollsters in Mainz in 1472 asking whether we thought this invention of Gutenberg’s would disrupt the Catholic church, fuel the Reformation, spark the Scientific Revolution, change our view of education and thus childhood, and change our view of societies and nations and cultures. Pshaw, they must have said.

Ask those questions today. How likely do you think it is that every major institution of society – every industry, all of education, all of government – will be disrupted; that we will rethink our idea of nations and cultures; that we will reimagine education; that we will again alter even economics? Pshaw?

Welcome to Davos 1472.

John McCarthy RIP

Tuesday, October 25th, 2011

John McCarthy has died. Good obit by Jack Schofield in the Guardian tonight.

In 1955, the computer scientist John McCarthy, who has died aged 84, coined the term “artificial intelligence”, or AI. His pioneering work in AI – which he defined as “the science and engineering of making intelligent machines” – and robotics included the development of the programming language Lisp in 1958. This was the second such high-level language, after Fortran, and was based on the idea of computing using symbolic expressions rather than numbers.
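That idea, programs that compute over expressions rather than numbers, is easy to demonstrate. Here is a minimal sketch in Python (not Lisp itself; the tuple representation and the deriv function are my own illustration, not McCarthy’s code) that models s-expressions as nested tuples and differentiates them symbolically, a classic early Lisp application:

```python
# Illustrative only: s-expressions modelled as nested Python tuples,
# e.g. ("+", "x", ("*", 2, "x")) stands for x + 2*x.

def deriv(expr, var):
    """Symbolically differentiate expr with respect to var."""
    if isinstance(expr, (int, float)):   # constant: d(c)/dx = 0
        return 0
    if isinstance(expr, str):            # symbol: dx/dx = 1, dy/dx = 0
        return 1 if expr == var else 0
    op, a, b = expr                      # compound form: (operator, arg, arg)
    if op == "+":                        # sum rule
        return ("+", deriv(a, var), deriv(b, var))
    if op == "*":                        # product rule
        return ("+", ("*", deriv(a, var), b), ("*", a, deriv(b, var)))
    raise ValueError(f"unknown operator: {op}")

# d/dx of x + 2*x yields ('+', 1, ('+', ('*', 0, 'x'), ('*', 2, 1)))
print(deriv(("+", "x", ("*", 2, "x")), "x"))
```

The point is not the arithmetic but the representation: the input and the output are both expressions, the property that made Lisp so well suited to AI work.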

McCarthy was also the first to propose a “time-sharing” model of computing. In 1961, he suggested that if his approach were adopted, “computing may some day be organised as a public utility, just as the telephone system is a public utility,” and that such a utility could become the basis of a significant new industry. This is the way that “cloud computing” is being sold today.

However, when obliged to choose between the time-sharing work at the Massachusetts Institute of Technology (MIT) and AI, he chose AI. He said: “The ultimate effort is to make computer programs that can solve problems and achieve goals in the world as well as humans. However, many people involved in particular research areas are much less ambitious.”

Steve Jobs: commented

Sunday, October 9th, 2011

The Observer asked me to read Steve Jobs’s 2005 Stanford commencement address and add my comments to the text.

The commencement address is one of the more venerable – and respectable – traditions of American academia, especially at elite universities such as Stanford and Harvard. Because Steve Jobs died at such a relatively young age (56), this is destined to be regarded as a classic. But it faces stiff competition – as the list maintained by humanity.org testifies. Jobs’s address is up against Barack Obama’s lecture to Wesleyan University in 2008, Elie Wiesel’s talk at DePaul University in 1997, Václav Havel’s lecture on “Civilisation’s Thin Veneer” at Harvard in 1995 and George Marshall’s address to the same university in 1947 – to list just four. But Jobs’s address has an unbearable poignancy just now, especially for those who knew him well. John Gruber, the blogger and technology commentator, saw him fairly recently and observed: “He looked old. Not old in a way that could be measured in years or even decades, but impossibly old. Not tired, but weary; not ill or unwell, but rather, somehow, ancient. But not his eyes. His eyes were young and bright, their weapons-grade intensity intact.” The address also reveals something of Jobs’s humanity, something that tended to get lost in the afterglow of Apple’s astonishing corporate resurgence.

LATER: In my comments I related one of my favourite stories about Jobs — the one where he drops the first iPod prototype in a fish-tank to demonstrate that it’s too big. Frank Stajano emailed to say that it may be apocryphal — he’d heard it many years ago about Akio Morita and Sony’s Walkman. In trying to check I found this nice piece by D.B. Grady, who also tells the story but cautions “I have no way of knowing if it is true, so take it for what it’s worth. I think it nicely captures the man who changed the world four times over.”

Agreed. As the Italians say, if it ain’t true then it ought to be. (Hmmm… on reflection, I can’t find a source for that adage either. Apologies if I’ve been rude to the citizens of that lovely country.)

Remembering Maurice Wilkes

Monday, June 27th, 2011

Today, the Cambridge Computer Lab will be honouring Maurice Wilkes with an afternoon of talks and reminiscences. I’m looking forward to it. He was such an amazing, practical man.

Here’s the programme:

Andy Hopper: Introduction
Martin Campbell-Kelly: Beginnings
David Barron: Pioneering
David Hartley: Service
Andrew Herbert: Research
Don Gaubatz: America
Andy Harter: Industry
Andy Hopper: Back to the Lab
Discussion