Nostalgia isn’t a business model

On a whim, we went to see The Artist last night, and enjoyed it a lot. It’s a nicely made, cleverly written, amusing movie about technological transition in a media industry — the switch from silent movies to “talkies”. It stars Jean Dujardin as George Valentin, a matinee idol of the silent era who cannot come to terms with the fact that the technology has moved on, and the stunningly beautiful Bérénice Bejo as a young actress who adapts easily to talkies and becomes as big a star as Valentin once was. There are three other memorable performances: John Goodman as Al Zimmer, the boss of Kinograph Studios — an archetypal Hollywood mogul; James Cromwell as Clifton, Valentin’s faithful chauffeur; and Valentin’s entrancing dog, played by a canine named Uggie. Personally, I would have given the dog an Oscar.

Two things struck me about the film. Firstly, it took me back to Tim Wu’s book, The Master Switch: The Rise and Fall of Information Empires (Borzoi Books), in which he surveys the history of the great US communications industries of the 20th century. Al Zimmer is clearly a version of Adolph Zukor, the mogul who created the modern studio system by industrialising movie production and vertically integrating the business so that he controlled everything from the actors’ contracts and production budgets down to the cinemas in which the products were screened. Wu’s contention is that the history of the telephone, movie, radio and TV industries displays a common feature: they all go through a cycle, in which they start out chaotic and gloriously creative (but anarchic), until eventually a charismatic entrepreneur arrives who brings order to the chaos by offering consumers a more dependable and technologically superior product, the popularity of which enables him to capture the industry. Zukor did that for the movie business, just as — in our time — Steve Jobs did it for the music business. And in The Artist Zimmer immediately grasps the significance of audio and goes for it like an ostrich at a brass doorknob (as PG Wodehouse would have said).

The storyline of the film is centred on Valentin’s obstinate refusal to accept the reality created by the new technology. “It’ll never catch on,” he sneers. “People don’t want to hear actors talking.” And here I was immediately reminded of the way print journalists reacted to the Internet — and the way academics reacted to Wikipedia. The only difference is that Valentin is rescued from his own obdurate torpor at the last minute, and learns to turn a new trick, whereas some journalists and academics would sooner die than change their minds.

But then, that’s life.

Google’s glass eye

From The Register:

Google has been showing off the expected capabilities of the augmented reality spectacles that it is calling Project Glass.

The early concept designs show wire-framed glasses with a display above the right eye which shows off personal schedules and location-based information. Also included is a camera, a microphone for calls and voice recognition, a GPS, and (presumably) a wireless connector to make the whole thing work. The Chocolate Factory has set up a page on Google+ to get feedback and released a video about a ukulele-playing hipster to show how the glasses would likely work.

On reading (and not understanding?) Heidegger

This morning’s Observer column.

If you write about technology, then sooner or later you’re going to meet a smartarse who asks whether you’ve read Heidegger’s The Question Concerning Technology. Having encountered a number of such smartarses in recent years, I finally decided to do something about it, and obtained a copy of the English translation, published in 1977 by Harper & Row. Having done so, I settled down with a glass of sustaining liquor and embarked upon the pursuit of enlightenment.

Big mistake. “To read Heidegger,” writes his translator, William Lovitt, “is to set out on an adventure.” It is. Actually, it’s like embarking on one of those nightmares in which you’re wading through quicksand and every time you grasp a rope or a rock it comes apart in your hand. And it turns out that Heidegger’s fiendish technique is actually to lure you into said quicksand.

What we learned from the BBC Micro

This morning’s Observer column.

The BBC Micro is 30 this year. It got its name from a BBC project to enhance the nation’s computer literacy. The broadcasters wanted a machine around which they could base a major factual series, The Computer Programme, showing how computers could be used, not just for programming but also for graphics, sound and vision, artificial intelligence and controlling peripheral devices. So a technical specification was drawn up by the BBC’s engineers and put to a number of smallish companies then operating in the embryonic market for “micro” computers.

Two of these companies were based in Cambridge. One was Sinclair Research, the eponymous vehicle of Clive Sinclair, a self-made man who worshipped his creator. The other was Acorn, a company co-founded by an ex-Sinclair employee, Chris Curry, and Hermann Hauser, an aristocratic-looking Austrian physicist. The story of the rivalry between these picturesque outfits has been memorably told in Micro Men, a TV film that combined a riveting technological tale with brilliantly comical dialogue (and which is still available on YouTube).

Acorn got the BBC contract, for reasons that baffled Sinclair but nobody else.

In the world of Big Data the man with only Excel is blind

This morning’s Observer column.

One of the most famous quotes in the history of the computing industry is the assertion that “640KB ought to be enough for anybody”, allegedly made by Bill Gates at a computer trade show in 1981 just after the launch of the IBM PC. The context was that the Intel 8088 processor that powered the original PC could only handle 640 kilobytes of Random Access Memory (RAM) and people were questioning whether that limit wasn’t a mite restrictive.

Gates has always denied making the statement and I believe him; he’s much too smart to make a mistake like that. He would have known that just as you can never be too rich or too thin, you can also never have too much RAM. The computer on which I’m writing this has four gigabytes (GB) of it, which is roughly 6,500 times the working memory of the original PC, but even then it sometimes struggles with the software it has to run.
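For the sceptical reader, the ratio is easy to check — a back-of-the-envelope calculation, nothing more:

```python
# Back-of-the-envelope check: how many times the original IBM PC's
# 640 KB memory ceiling fits into a 4 GB machine.
KB = 1024
MB = 1024 * KB
GB = 1024 * MB

original_pc_ram = 640 * KB   # the 8088-era conventional-memory limit
modern_ram = 4 * GB

ratio = modern_ram / original_pc_ram
print(f"4 GB is about {ratio:,.0f} times 640 KB")  # ~6,554 times
```

So “roughly 6,500 times” is the right order of magnitude.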

But even Gates could not have foreseen the amount of data computers would be called upon to handle within three decades…

Raspberry Pi today. Jam tomorrow

This morning’s Observer column.

The Raspberry Pi project – a philanthropic effort to create the contemporary equivalent of the BBC Micro of yesteryear – has graduated from idealistic vapourware dreamed up in Cambridge to a finished, deliverable product manufactured in China. (In a nice touch, the Pi device comes in two flavours, Model A and Model B, just like the BBC machine, which was also designed in Cambridge.) Over the next few months, we’ll see container-loads of the little computer boards delivered to these shores. The time has come, therefore, to start thinking about how this astonishing breakthrough can be exploited in our schools.

Here are a few suggestions.

First, we need to jettison some baggage from the past. In particular, we have to accept that ICT has become a toxic brand in the context of British secondary schools…

The Idea Factory

Nobody who writes about the history of computing can ignore Bell Labs, that astonishing institution in New Jersey that created so much of the technology we nowadays take for granted. An interesting essay in the NYT has brought it back into focus for me because I’m fascinated by the problem of how to manage creative people in such a way that their creativity is liberated, not stifled, by the organisation that funds them. (Many years ago I co-authored a paper on the subject with Bob Taylor — the guy who funded the ARPAnet and later ran the Computer Systems Lab at Xerox PARC during the time when its researchers invented most of the computing technology we use today. The title of our essay was “Zen and the Art of Research Management” and it was published in December 2003 in a volume of essays dedicated to Roger Needham.)

The NYT article is by Jon Gertner, who is the author of a forthcoming book on Bell Labs entitled The Idea Factory: Bell Labs and the Great Age of American Innovation. It’s on my wish list.

At Bell Labs, the man most responsible for the culture of creativity was Mervin Kelly. Probably Mr. Kelly’s name does not ring a bell. Born in rural Missouri to a working-class family and then educated as a physicist at the University of Chicago, he went on to join the research corps at AT&T. Between 1925 and 1959, Mr. Kelly was employed at Bell Labs, rising from researcher to chairman of the board. In 1950, he traveled around Europe, delivering a presentation that explained to audiences how his laboratory worked.

His fundamental belief was that an “institute of creative technology” like his own needed a “critical mass” of talented people to foster a busy exchange of ideas. But innovation required much more than that. Mr. Kelly was convinced that physical proximity was everything; phone calls alone wouldn’t do. Quite intentionally, Bell Labs housed thinkers and doers under one roof. Purposefully mixed together on the transistor project were physicists, metallurgists and electrical engineers; side by side were specialists in theory, experimentation and manufacturing. Like an able concert hall conductor, he sought a harmony, and sometimes a tension, between scientific disciplines; between researchers and developers; and between soloists and groups.

One element of his approach was architectural. He personally helped design a building in Murray Hill, N.J., opened in 1941, where everyone would interact with one another. Some of the hallways in the building were designed to be so long that to look down their length was to see the end disappear at a vanishing point. Traveling the hall’s length without encountering a number of acquaintances, problems, diversions and ideas was almost impossible. A physicist on his way to lunch in the cafeteria was like a magnet rolling past iron filings…

How to handle 15 billion page views a month

Ye Gods! Just looked at the stats for Tumblr.

500 million page views a day

15B+ page views a month

~20 engineers

Peak rate of ~40k requests per second

1+ TB/day into Hadoop cluster

Many TB/day into MySQL/HBase/Redis/Memcache

Growing at 30% a month

~1000 hardware nodes in production

~750 million page views a month per engineer

Posts are about 50GB a day. Follower list updates are about 2.7TB a day.

Dashboard runs at a million writes a second, 50K reads a second, and it is growing.

And all this with about twenty engineers!
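The headline numbers hang together, too — a rough sanity check using only the figures quoted above (bearing in mind that a page view can generate several HTTP requests, so the average below only bounds the request rate from below):

```python
# Rough consistency check on the quoted Tumblr stats.
page_views_per_day = 500_000_000
seconds_per_day = 86_400
engineers = 20

# Average request rate implied by the daily page-view figure.
avg_rps = page_views_per_day / seconds_per_day
print(f"Average: ~{avg_rps:,.0f} page views/second")  # ~5,787 — so a 40k peak is ~7x the average

# Monthly page views per engineer, from the 15B/month figure.
per_engineer = (page_views_per_day * 30) / engineers
print(f"~{per_engineer / 1e6:,.0f} million page views a month per engineer")  # ~750 million
```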