Focussing after the fact

This morning’s Observer column.

“From today painting is dead” is an aphorism often attributed to Paul Delaroche, a 19th-century French painter, upon seeing the first daguerreotypes (though Wikipedia maintains there is no compelling evidence that he actually said it). In a way, it was a misjudgment on the same epic scale as Thomas Watson’s celebrated observation that the total world market for computers was five machines. What Delaroche was presumably getting at was that painting as a naturalistic representation of reality was terminally threatened by the arrival of the new technology of “painting with light”. If that is indeed what he meant, then he was only partly right.

What brought Delaroche to mind was the announcement of the Lytro light field camera, which goes on sale next year. Based on research by a Stanford graduate student, Ren Ng, the camera turns the normal process of compose-focus-shoot on its head. Instead, you just point the Lytro at whatever you want to photograph, and then retrospectively focus on any part of the image. As the New York Times explained: “With Lytro’s camera, you can focus on any point in an image taken with a Lytro after you’ve shot the picture. When viewing a Lytro photograph on your computer, you can simply click your mouse on any point in the image and that area will come into focus. Change the focal point from the flower to the child holding the flower. Make the background blurry and the foreground clear. Do the opposite – you can change the focal point as many times as you like.”
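The Lytro’s own processing pipeline isn’t spelled out here, but the principle behind after-the-fact focusing is well understood: a light field camera records the scene from many slightly different viewpoints across the lens aperture, and if you shift those views against each other and average them, whatever lies at the depth you aligned on snaps into focus while everything else blurs. The sketch below is a minimal illustration of that shift-and-add idea, not Lytro’s actual software; the refocus function, the alpha parameter and the assumption that the capture has already been decoded into a grid of sub-aperture views are all mine.

```python
# Minimal shift-and-add refocusing sketch (illustrative only; not Lytro's
# pipeline). Assumes the raw capture has already been decoded into a 4D
# array of sub-aperture views.

import numpy as np

def refocus(light_field: np.ndarray, alpha: float) -> np.ndarray:
    """Synthesise a photograph focused at a chosen depth.

    light_field -- array of shape (U, V, H, W): one H x W view for each
                   (u, v) position on the lens aperture.
    alpha       -- refocusing parameter; each value corresponds to a
                   different focal plane in the scene.
    """
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Views far from the centre of the aperture get shifted further;
            # objects at the chosen depth then line up and add sharply,
            # while everything else is averaged into blur.
            dy = int(round(alpha * (u - cu)))
            dx = int(round(alpha * (v - cv)))
            out += np.roll(light_field[u, v], shift=(dy, dx), axis=(0, 1))
    return out / (U * V)

# "Click to refocus" then amounts to choosing the alpha that maximises
# local sharpness around the pixel the viewer clicked on.
```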

Digital Darwinism

This morning’s Observer column.

This is a story about digital Darwinism. Once upon a time, the abiding nightmare of authors and students who used their PCs and laptops to compose books, dissertations and essays was that a random accident – theft of a laptop, perhaps, or a hard-disk crash – would be enough to vaporise years of irreplaceable work. So we all resorted to primitive schemes to protect against that terrible eventuality. In the early days, these took the form of piles of floppy disks stored at other locations; after that, we “burned” the precious files on to blank CDs; later still, we copied them on to USB sticks and flash drives that went in our pockets or on our keyrings; finally, we were even driven to emailing the damned things to ourselves.

Then along came an idea that made all these stratagems look, well, clumsy. It was called Dropbox. You logged on to the Dropbox site, registered (for free) and downloaded a small program (called a ‘client’) on to your computer. Once installed, this program created a special folder – helpfully labelled “Dropbox” – which appeared on your desktop. From then on, you saved any file that you wanted to back up in your Dropbox folder.

So far, so mundane. But even as you continued with your writing, the Dropbox client was busy in the background…
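As a rough picture of what that background work amounts to, here is a toy sketch of a folder-watching backup loop: hash every file in a watched folder, and copy any file whose contents have changed to a second location standing in for the remote copy. This is emphatically not Dropbox’s client, which is proprietary and far more sophisticated (block-level syncing, a real remote service, conflict handling); the folder names, the polling interval and the whole approach are illustrative assumptions.

```python
# Toy "sync client": poll a folder, detect changed files by content hash,
# and copy them to a backup directory. Purely illustrative.

import hashlib
import shutil
import time
from pathlib import Path

WATCHED = Path.home() / "Dropbox-toy"         # hypothetical local folder
BACKUP = Path.home() / "Dropbox-toy-backup"   # stands in for the remote copy

def digest(path: Path) -> str:
    """Fingerprint a file so it is only copied when its contents change."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def sync_forever(interval: float = 5.0) -> None:
    BACKUP.mkdir(parents=True, exist_ok=True)
    seen: dict[Path, str] = {}
    while True:
        for path in WATCHED.rglob("*"):
            if not path.is_file():
                continue
            h = digest(path)
            if seen.get(path) != h:
                target = BACKUP / path.relative_to(WATCHED)
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(path, target)    # the toy equivalent of an upload
                seen[path] = h
        time.sleep(interval)

if __name__ == "__main__":
    WATCHED.mkdir(parents=True, exist_ok=True)
    sync_forever()
```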

Pleasant surprises #362

Just as the advance publicity for my new book begins to gather momentum (we’ve just learned, for example, that a major US university is planning to use it as a class text), my earlier book seems to be enjoying a new lease of life. This is today’s window display in Heffers, the big Cambridge academic bookshop. (My Brief History is the title on the far right.)

Thanks to Brian for the pic.

Why the Web might be a transient

As I observed the other day, one of the things that drove me to write From Gutenberg to Zuckerberg was exasperation at the number of people who think the Web is the Internet. In lecturing about this I developed a provocative trope in which I said that, although the Web is huge, in 50 years’ time we may see it as just a blip in the evolution of the Net. This generally produced an incredulous reaction.

So it’s interesting to see Joe Hewitt arguing along parallel lines. Unlike me, he suggests a process by which the Web might be sidelined. “The arrogance of Web evangelists is staggering”, he writes.

They take for granted that the Web will always be popular regardless of whether it is technologically competitive with other platforms. They place ideology above relevance. Haven’t they noticed that the world of software is ablaze with new ideas and a growing number of those ideas are flat out impossible to build on the Web? I can easily see a world in which Web usage falls to insignificant levels compared to Android, iOS, and Windows, and becomes a footnote in history. That thing we used to use in the early days of the Internet.

My prediction is that, unless the leadership vacuum is filled, the Web is going to retreat back to its origins as a network of hyperlinked documents. The Web will be just another app that you use when you want to find some information, like Wikipedia, but it will no longer be your primary window. The Web will no longer be the place for social networks, games, forums, photo sharing, music players, video players, word processors, calendaring, or anything interactive. Newspapers and blogs will be replaced by Facebook and Twitter and you will access them only through native apps. HTTP will live on as the data backbone used by native applications, but it will no longer serve those applications through HTML. Freedom of information may be restricted to whatever our information overlords see fit to feature on their App Market Stores.

I hope he’s wrong and, given that he’s a serious and successful apps developer, perhaps he has an axe to grind. But his blog post makes one think…

If you want to create jobs at home, don’t rely on startups

This morning’s Observer column.

[Tom] Friedman is a significant figure because his pulpit on the NYT enables him subliminally to insert ideas into the collective unconscious of America’s ruling elite. Which is why something he wrote recently needs to be challenged. “If we want to bring down unemployment in a sustainable way”, he writes, “neither rescuing General Motors nor funding more road construction will do it. We need to create a big bushel of new companies – fast. We’ve got to get more Americans working again for their own dignity… Good-paying jobs don’t come from bailouts. They come from startups.”

When Samuel Johnson was asked how he would refute Bishop Berkeley’s philosophical proposition about the non-existence of matter, he famously kicked a stone and said: “I refute it thus!” Not having a convenient stone, I pick up the nearest object that lies to hand. It’s an iPhone. “Designed by Apple in California”, it says on the back. “Assembled in China”.

Now of course it’s a long time since Apple was a startup, but the iPhone still refutes Friedman’s hypothesis…

Steve Jobs and Napoleon: an exchange

My Observer piece about Steve Jobs’s place in history prompted some interesting responses, in particular an email from my friend Gerard de Vries, who is an eminent philosopher of science. “With all the articles about the genius of Apple’s Jobs around”, he wrote:

Tolstoy’s War and Peace came to my mind. This is how historians used to write about Napoleon: as the genius, the inspirer, the man who saw everything coming far ahead and who designed sophisticated strategies to win his battles.

That image was destroyed by Tolstoy.

Was Napoleon in command? Well, he may have given commands but – as Tolstoy’s novel stresses – a courier had to deliver them, and maybe the courier got lost in the fog, or was shot halfway; and even if he arrived at the right spot and succeeded in finding the officers of the regiment, the command to attack may have been completely irrelevant, because just half an hour before the courier arrived the enemy had decided to launch a full attack, and all Napoleon’s troops can now do is pray and hide, or flee. Tolstoy’s point is that Napoleon’s power is projected onto him – first by his admiring staff and troops and later by historians. Napoleon plays at being “Napoleon” – at being in command, at knowing what he is doing. But in fact he too was a little cog in a big machine. When the machine got stuck, the genius of Napoleon disappeared. But in our historical narratives we tend to mix up cause and effect. So the story is that the machine got stuck because Napoleon’s genius ran out.

Isn’t this also the case with Jobs? He played his role as the genius CEO and was lucky. Is there really more to say?

The best advice to generals, Tolstoy remarked somewhere, is to publish your strategy after the battle. That’s the only way to ensure that your strategy relates to what has happened.

I was intrigued by this ingenious, left-field approach. It reminded me of something that Gerard had said to me when we first worked together way back in 1978 – that War and Peace was quite a good text for students embarking on the history and philosophy of science, where one of the most important obligations is to resist the Whig interpretation of history — which is particularly seductive in the case of science.

I replied:

I don’t think Tolstoy’s analysis fits the Jobs case exactly, for two reasons: we have corroborated accounts by eyewitnesses/subordinates of Jobs’s decision-making at crucial junctures of the story (when the likely outcomes were not at all certain); and there’s the fact that Jobs’s strategy was consistent in an interesting way, namely that his determination to keep the Mac a closed system was a short-term disaster (because it left the field wide open for Microsoft and Wintel) but a long-term masterstroke (because it’s now what enables Apple to produce such impressively functional mobile devices).

To which Gerard responded:

I’m less convinced by the eyewitness reports: Napoleon’s staff also thought highly of his judgement and determination (until the French were defeated and had to retreat from Russia, of course). What IS a good point, however, is that Jobs’s name appears on a large number of patent applications (there was an interesting report about that in the IHT/NYT last week, which also pointed out that this could not have been motivated only by the wish to boost Jobs’s internal and external stature in the company, as patent offices are keen to check whether the people who appear on patent applications have really contributed to the innovation). Jobs seems to have been active not only at the level of “strategy” but also at the level of detailed engineering and design work in Apple (and that would be a difference with Napoleon: I don’t think Napoleon ever did any shooting himself. As I remember, he kept a safe distance from the actual fighting).

The more one thinks about this stuff, the more one realises how important it is to try to see technological stories in a wider context. For example, I vividly remember how Jobs was castigated in the 1980s for his determination to maintain absolute control over both hardware and software — in contrast to Microsoft, which prospered because anyone could make DOS and Windows boxes. Now the wheel has come full circle, with Google realising that it will have to buy a handset manufacturer if it is to be able to guarantee “outstanding user experiences” (i.e. iPhone-like performance) for Android phones.

And that, in turn, brings to mind Umberto Eco’s lovely essay explaining why the Mac is a Catholic system and the PC is a Protestant one.

Later another friend, Jon Crowcroft, commented:

“Well Jobs is a Buddhist and Gates is agnostic – that certainly tells you something. People I know that talked to Jobs on various projects support the idea he had a major hand in project successes. I think his early failure was a common one of being too early to market; once he got re-calibrated after Apple bought NeXT to get him back, then he had it all sussed.”

Still later: the comparison between Apple and Microsoft is also interesting, as David Nicholls pointed out in an email. In terms of market cap, Apple is now worth considerably more, but:

While it is true that Apple is doing amazingly well at the moment, and ‘gaining ground’ over Microsoft, when it comes to the total amount of money made over the years, Microsoft is still well ahead.

I did a quick bit of digging and found that Apple’s total Net Income from 2001 to 2010 (the only figures I could find) is around $35.5 billion. In the same period Microsoft’s equivalent is $119 billion. These figures aren’t corrected for inflation but that obviously won’t affect the relative amounts.

Microsoft’s figures are available back to 1991, and the 1991-2010 total is around $151 billion.

Steve Jobs’s place in history

I’ve written a long piece about Steve Jobs for today’s Observer. Extract:

When the time comes to sum up Jobs’s achievements, most will portray him as a seminal figure in the computing industry. But Jobs is bigger than that.

To understand why, you have to look at the major communications industries of the 20th century – the telephone, radio and movies. As Tim Wu chronicles it in his remarkable book, The Master Switch, each of these industries started out as an open, irrationally exuberant, chaotic muddle of incompatible standards, crummy technology and chancers. The pivotal moment in the evolution of each industry came when a charismatic entrepreneur arrived to offer consumers better quality, higher production values and greater ease of use.

With the telephone it was Theodore Vail of AT&T, offering a unified nationwide network and a guarantee that when you picked up the phone you always got a dial tone. With radio it was David Sarnoff, who built RCA. With movies it was Adolph Zukor, who created the Hollywood studio system.

Jobs is from the same mould. He believes that using a computer should be delightful, not painful; that it should be easy to seamlessly transfer music from a CD on to a hard drive and thence to an elegant portable player; that mobile phones should be powerful handheld computers that happen to make voice calls; and that a tablet computer is the device that is ushering us into a post-PC world. He has offered consumers a better proposition than the rest of the industry could – and they jumped at it. That’s how he built Apple into the world’s most valuable company. And it’s why he is really the last of the media moguls.

David Pogue had an insightful piece about Jobs in the New York Times. This passage caught my eye:

In Silicon Valley, success begets success. And at this point, few companies have as high a concentration of geniuses — in technology, design and marketing — as Apple. Leaders like the design god Jonathan Ive and the operations mastermind Tim Cook won’t let the company go astray.

So it’s pretty clear that for the next few years, at least, Apple will still be Apple without Mr. Jobs as involved as he’s been for years.

But despite these positive signs, there’s one heck of a huge elephant in the room — one unavoidable reason why it’s hard to imagine Apple without Mr. Jobs steering the ship: personality.

His personality made Apple Apple. That’s why no other company has ever been able to duplicate Apple’s success. Even when Microsoft or Google or Hewlett-Packard tried to mimic Apple’s every move, run its designs through the corporate copying machine, they never succeeded. And that’s because they never had such a single, razor-focused, deeply opinionated, micromanaging, uncompromising, charismatic, persuasive, mind-blowingly visionary leader.

By maintaining so much control over even the smallest design decisions, by anticipating what we all wanted even before we did, by spotting the promise in new technologies when they were still prototypes, Steve Jobs ran Apple with the nimbleness of a start-up company, even as he built it into one of the world’s biggest enterprises.

“I believe Apple’s brightest and most innovative days are ahead of it,” Mr. Jobs wrote in his resignation letter.

That’s a wonderful endorsement. But really? Can he really mean that Apple’s days will be brighter and more innovative without him in the driver’s seat?

And Charles Arthur argues that Jobs’s greatest legacy lies in being the man who persuaded the world to pay for online content:

Jobs pried open many content companies’ thinking, because his focus was always on getting something great to the customer with as few obstacles as possible. In that sense, he was like a corporate embodiment of the internet; except he thought people should pay for what they got. He always, always insisted you should pay for value, and that extended to content too. The App and Music Store remains one of the biggest generators of purely digital revenue in the world, and certainly the most diverse; while Google’s Android might be the fastest-selling smartphone mobile OS, its Market generates pitiful revenues, and I haven’t heard of anyone proclaiming their successes from selling music, films or books through Google’s offerings.

Jobs’s resignation might look like the end of an era, and for certain parts of the technology industry it is. For the content industries, it’s also a loss: Jobs was a champion of getting customers who would pay you for your stuff. The fact that magazine apps like The Daily haven’t set the world alight (yet?) isn’t a failure of the iPad (which is selling 9m a quarter while still only 15 months old; at the same point in the iPod’s life, just 219,000 were sold in the financial quarter, compared with the 22m – 100 times more – of its peak). It’s more like a reflection of our times.

So if you’re wondering how Jobs’s departure affects the media world, consider that it’s the loss of one of the biggest boosters of paid-for content the business ever had. Who’s going to replace that?

Digital Rot

Sobering blog post by Ken Rockwell.

One day it dawned on me, after I heard about more than one friend buying an old Nikon D1 or D1X for $75, that these old digital cameras are worth far less precisely because they are clogged with worthless digital guts, instead of just having a hole for film.

The D1 and D1X were a Nikon F5 with a sensor and some computer junk thrown in, just as the long-forgotten Nikon D2Xs is the current F6 with digital guts. People paid Nikon four times as much for the cameras with the digital guts.

My friends paid $5,500 for the D1X new, and I paid $4,500 for my new D1H back in their day, but the D1X is worthless today because it’s only got the resolution of a Nikon D50 and runs more slowly than a D90.

While a used D1X today is hardly worth the cost of packing and shipping, a used F5 still sells for hundreds of dollars because it takes film.

An old D2H is only worth about $500 on eBay, while a used F6 still goes for four figures. The F6 is still the world’s best 35mm film camera.

Even though the digital cameras cost about four times the price of their film equivalents when new, the digital cameras are worth far less after a couple of years.

It’s true. My Leica M4 film camera is worth more now than when I bought it years ago. But my (digital) M8 has depreciated out of sight. Why? Because its sensor and image processor are, well, effectively stone-age devices already.