My ADAPT Centre lecture at Trinity College, Dublin on April 29, 2015. Text here.
YouTube turns ten this year. Ars Technica has a nice post that reflects on its history and its significance.
The site has become so indispensable that it feels like a basic part of the Internet itself rather than a service that lives on top of it. YouTube is just the place to put videos, and it’s used by everyone from individuals to billion-dollar companies. It’s obvious to say, but YouTube revolutionized Web video. It made video uploading and playback almost as easy as uploading a picture, handled all the bandwidth costs, and it allowed anyone to embed those videos onto other sites.
The scale of YouTube gets more breathtaking every year. It has a billion users in 61 languages, and 12 days of video are uploaded to the site every minute—that’s almost 50 years of video every day. The site just continues growing. The number of hours watched on YouTube is up 50 percent from last year.
It’s easy to forget YouTube almost didn’t make it. Survival for the site was a near-constant battle in the early days. The company not only fought the bandwidth monster, but it faced an army of lawyers from various media companies that all wanted to shut the video service down. But thanks to cash backing from Google, the site was able to fend off the lawyers. And by staying at the forefront of Web and server technology, YouTube managed to serve videos to the entire Internet without being bankrupted by bandwidth bills…
Great read. Recommended.
This morning’s Observer column.
It’s terrific that Bletchley Park has not only been rescued from the decay into which the site had fallen, but brilliantly restored, thanks to funding from the National Lottery (£5m), Google (which donated £500,000) and the internet security firm McAfee. I’ve been to the Park many times and for years going there was a melancholy experience, as one saw the depredations of time and weather inexorably outpacing the valiant efforts of the squads of volunteers who were trying to keep the place going.
Even at its lowest ebb, Bletchley had a magical aura. One felt something akin to what Abraham Lincoln tried to express when he visited Gettysburg: that something awe-inspiring had transpired here and that it should never be forgotten. The code-breaking that Bletchley Park achieved was an astonishing demonstration of the power of collective intelligence and determination in a quest to defeat the gravest threat that this country had ever faced.
When I was last there, the restoration was almost complete, and I was given a tour on non-disclosure terms, so I had seen what the duchess saw on Wednesday. The most striking bit is the restoration of Hut 6 exactly as it was, complete with all the accoutrements of the tweedy, pipe-smoking geniuses who worked in it, right down to the ancient typewriters, bound notebooks and the Yard-O-Led mechanical pencil that one of them possessed.
Hut 6 is significant because that was where Gordon Welchman worked…
This morning’s Observer column remembering Ronald Coase.
When the news broke last week that Ronald Coase, the economist and Nobel laureate, had died at the age of 102, what came immediately to mind was Keynes’s observation that “practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist”. Most of the people running the great internet companies of today have probably never heard of Coase, but, in a way, they are all his slaves, because way back in 1932 he cracked the problem of explaining how firms are structured, and how and why they change as circumstances change. Coase might have been ancient, but he was certainly not defunct…
This morning’s Observer column.
Coincidentally, in that same year, Gates stepped down from his position as CEO and began the slow process of disengaging from the company. What he failed to notice was that the folks he left in charge, chief among them one Steve Ballmer, were prone to falling asleep at the wheel.
How else can one explain the way they failed to notice the importance of (successively) internet search, online advertising, smartphones and tablets until the threat was upon them? Or were they just lulled into somnolence by the sound of the till ringing up continuing sales from the old staples of Windows and Office?
But suddenly, that soothing tinkle has become less comforting. PC sales are starting to decline sharply, which means that Microsoft’s comfort zone is likewise set to shrink. Last week, we had the first indication that Ballmer & Co have woken up. In a 2,700-word internal memo rich in management-speak drivel, Ballmer announced a “far-reaching realignment of the company that will enable us to innovate with greater speed, efficiency and capability in a fast-changing world”.
This morning’s Observer column.
Nothing lasts forever: if history has any lesson for us, it is this. It’s a thought that comes from rereading Paul Kennedy’s magisterial tome, The Rise and Fall of the Great Powers, in which he shows that none of the great nation-states or empires of history – Rome; imperial Spain in 1600; France in either its Bourbon or Bonapartist manifestations; the Dutch republic in 1700; Britain in its imperial glory – succeeded in maintaining its global ascendancy for long.
What has this got to do with technology? Well, it provides us with a useful way of thinking about two of the tech world’s great powers.
Nobody who writes about the history of computing can ignore Bell Labs, that astonishing institution in New Jersey that created so much of the technology we nowadays take for granted. An interesting essay in the NYT has brought it back into focus for me because I’m fascinated by the problem of how to manage creative people in such a way that their creativity is liberated, not stifled, by the organisation that funds them. (Many years ago I co-authored a paper on the subject with Bob Taylor — the guy who funded the ARPAnet and later ran the Computer Systems Lab at Xerox PARC during the time when its researchers invented most of the computing technology we use today. The title of our essay was “Zen and the Art of Research Management” and it was published in December 2003 in a volume of essays dedicated to Roger Needham.)
The NYT article is by Jon Gertner, who is the author of a forthcoming book on Bell Labs entitled The Idea Factory: Bell Labs and the Great Age of American Innovation. It’s on my wish list.
At Bell Labs, the man most responsible for the culture of creativity was Mervin Kelly. Probably Mr. Kelly’s name does not ring a bell. Born in rural Missouri to a working-class family and then educated as a physicist at the University of Chicago, he went on to join the research corps at AT&T. Between 1925 and 1959, Mr. Kelly was employed at Bell Labs, rising from researcher to chairman of the board. In 1950, he traveled around Europe, delivering a presentation that explained to audiences how his laboratory worked.
His fundamental belief was that an “institute of creative technology” like his own needed a “critical mass” of talented people to foster a busy exchange of ideas. But innovation required much more than that. Mr. Kelly was convinced that physical proximity was everything; phone calls alone wouldn’t do. Quite intentionally, Bell Labs housed thinkers and doers under one roof. Purposefully mixed together on the transistor project were physicists, metallurgists and electrical engineers; side by side were specialists in theory, experimentation and manufacturing. Like an able concert hall conductor, he sought a harmony, and sometimes a tension, between scientific disciplines; between researchers and developers; and between soloists and groups.
One element of his approach was architectural. He personally helped design a building in Murray Hill, N.J., opened in 1941, where everyone would interact with one another. Some of the hallways in the building were designed to be so long that to look down their length was to see the end disappear at a vanishing point. Traveling the hall’s length without encountering a number of acquaintances, problems, diversions and ideas was almost impossible. A physicist on his way to lunch in the cafeteria was like a magnet rolling past iron filings…
George Dyson has written a fascinating book about the building of the first stored-program computer by John von Neumann and his colleagues at the Institute for Advanced Study at Princeton. After I’d finished the book I had an email exchange with him, an edited version of which appears in this morning’s Observer.
Once upon a time, a “computer” was a human being, usually female, who did calculations set for her by men in suits. Then, in the 1940s, something happened: computers became machines based on electronics. The switch had awesome implications; in the end, it spawned a technology that became inextricably woven into the fabric of late-20th- and early-21st-century life and is now indispensable. If the billions of (mostly unseen) computers that now run our industrialised support systems were suddenly to stop working, then our societies would very rapidly grind to a halt.
So the question of where this Promethean force sprang from is an intriguing one, as interesting in its way as the origins of the industrial revolution…
Photograph shows the book on sale in Heffers bookshop in Cambridge yesterday.
Just caught up with this lovely dispatch from Davos by Jeff Jarvis.
I began this trip to Europe with my pilgrimage to the Gutenberg Museum in Mainz (blogged earlier). I recall John Naughton’s Observer column in which he asked us to imagine that we are pollsters in Mainz in 1472 asking whether we thought this invention of Gutenberg’s would disrupt the Catholic church, fuel the Reformation, spark the Scientific Revolution, change our view of education and thus childhood, and change our view of societies and nations and cultures. Pshaw, they must have said.
Ask those questions today. How likely do you think it is that every major institution of society–every industry, all of education, all of government–will be disrupted; that we will rethink our idea of nations and cultures; that we will reimagine education; that we will again alter even economics? Pshaw?
Welcome to Davos 1472.
John McCarthy has died. Good obit by Jack Schofield in the Guardian tonight.
In 1955, the computer scientist John McCarthy, who has died aged 84, coined the term “artificial intelligence”, or AI. His pioneering work in AI – which he defined as “the science and engineering of making intelligent machines” – and robotics included the development of the programming language Lisp in 1958. This was the second such high-level language, after Fortran, and was based on the idea of computing using symbolic expressions rather than numbers.
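McCarthy’s idea of computing with symbolic expressions – nested lists in which programs and data share one representation – can be sketched in a few lines. This is an illustrative toy in Python, not Lisp itself; the `evaluate` function and the tiny set of operators are my own invention, meant only to convey the flavour:

```python
# A toy evaluator in the spirit of Lisp's symbolic expressions:
# an expression is a symbol (string), a literal, or a nested list
# whose first element names an operator.
def evaluate(expr, env):
    if isinstance(expr, str):        # a symbol: look up its binding
        return env[expr]
    if not isinstance(expr, list):   # a literal, e.g. a number
        return expr
    op, *args = expr
    if op == "quote":                # return the expression itself, unevaluated:
        return args[0]               # data and program are the same structure
    if op == "+":
        return sum(evaluate(a, env) for a in args)
    if op == "*":
        result = 1
        for a in args:
            result *= evaluate(a, env)
        return result
    raise ValueError(f"unknown operator: {op}")

# The Lisp expression (+ x (* 2 3)), with x bound to 4:
print(evaluate(["+", "x", ["*", 2, 3]], {"x": 4}))  # prints 10
```

The point McCarthy saw is visible even in this sketch: because an expression like `["+", "x", ["*", 2, 3]]` is ordinary data, a program can build, inspect and transform other programs – the property that made Lisp a natural vehicle for AI research.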
McCarthy was also the first to propose a “time-sharing” model of computing. In 1961, he suggested that if his approach were adopted, “computing may some day be organised as a public utility, just as the telephone system is a public utility,” and that that utility could become the basis of a significant new industry. This is the way that ‘cloud computing’ is being sold today.
However, when obliged to choose between the time-sharing work at the Massachusetts Institute of Technology (MIT) and AI, he chose AI. He said: “The ultimate effort is to make computer programs that can solve problems and achieve goals in the world as well as humans. However, many people involved in particular research areas are much less ambitious.”