The Queen and I

HMQ and I have one thing in common: we both use Leica cameras. This famous photograph is of her with an M3. My first Leica was an M2, which I bought from an antiquarian bookseller towards the end of my time as a student. It came with a 50mm Summicron lens. He accepted a facsimile edition of Newton’s Principia in part exchange.

The difference between HMQ and me is therefore simple: she got her cameras as a gift from Leitz, whereas I have always paid (through the nose) for them. It’s said that she requested that they should come without serial numbers, so they engraved her initials on them instead.

Since 1925, Leica has had a tradition of giving cameras with special serial numbers to a select group of prominent people. As well as the Queen they include photographers Alfred Eisenstaedt and Henri Cartier-Bresson, the co-inventors of Kodachrome film (Leopold Mannes and Leo Godowsky Jr.), US president Dwight Eisenhower and Woody Allen (serial number 3,555,555).

Why Allen? Well, it turns out that he’s an avid Leica enthusiast. After being presented with the camera (an M8.2, for those who are interested in such things) he explained that when he was preparing for the film Vicky Cristina Barcelona and had to choose a camera for actress Scarlett Johansson to use (her character is a photographer), he chose a Leica.

Now there’s product placement for you.

What Facebook is really up to (maybe)

I’m not what you’d describe as a natural Facebooker. Sure I have a FB page and a bunch of ‘friends’ but I visit the site only rarely, and that’s mainly to find out what my kids or my friends’ kids are up to. Some Facebookers make the mistake of thinking that I’m an active user, but that’s because I’ve arranged for Twitter (of which I am an active user) to feed my tweets to FB where they appear as Updates.

All of which is by way of saying that my views on social networking are not based on deep personal experience and so should be taken with a pinch of salt. But, hey! This is a blog, and blogs are places for unfinished thoughts, work-in-progress and the like. So here goes.

At the moment Facebook has two distinctive features. The first is a huge subscriber base — 900 million and counting. The second is a higher level of user engagement than any other service on the Web. Something like half of the users log in every day, and when logged in they spend more time on the site than users of any other site. So the key question for anyone wanting to understand the Facebook phenomenon is: what’s driving them to do this?

I think I know the answer: it’s photographs — photographs that they or their friends have uploaded. Two reasons for thinking this: (1) the staggering statistics of how many photographs FB now hosts (100 billion according to some estimates), and the rate at which Facebookers upload them every day; and (2) personal observation of friends and family who are users of the site. Of course people also log into Facebook to read their friends’ status updates, newsfeeds/timelines, messages, wall-posts etc. But more than anything else they want to see who’s posted pics from last night, who’s been tagged in these images, and where (i.e. at which social event) the pictures were taken.

Hold that thought for a moment while we move to consider the speculation (now rampant) that Facebook is developing a smartphone. Henry Blodget thinks that this would be a crazy idea for seven different reasons, and I’m inclined to agree with him. (But then I would have said — probably did say — the same about Steve Jobs’s decision to enter the mobile phone market.)

Zuck & Co aren’t crazy — not in that way anyway — so let’s assume that they share Blodget’s view — that building a phone qua phone would be a daft idea. And yet everyone’s convinced that they are building something. So what is it?

Dave Winer, whom I revere, thinks it’s a camera. And not just any old camera, either, but what Dave calls a social camera. Here’s how he described it:

Here’s an idea that came to me while waiting for a train to Genova. I was standing on a platform, across a pair of tracks a man was taking a picture of something in my direction. I was in the picture, the camera seemed to be pointed at me.

I thought to yell my email address across the tracks asking him to send me a copy of the picture. (Assuming he spoke English and I could be heard over the din of the station.)

Then I thought my cell phone or camera could do that for me. It could be beaming my contact info. Then I had a better idea. What if his camera, as it was taking the picture, also broadcast the bits to every other camera in range. My camera, sitting in my knapsack would detect a picture being broadcast, and would capture it. (Or my cell phone, or iPod.)

Wouldn’t this change tourism in a nice way? Now the pictures we bring home would include pictures of ourselves. Instead of bringing home just pictures that radiate from me, I’d bring home all pictures taken around me while I was traveling.

Of course if you don’t want to broadcast pictures you could turn the feature off. Same if you don’t want to receive them.

A standard is needed, but the first mover would set it, and there is an incentive to go first because it would be a viral feature. Once you had a Social Camera, you’d want other people to have one. And you’d tell them about it.
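Winer’s scenario is, at bottom, a small protocol: capture locally, broadcast to consenting receivers in range, and honour per-device opt-outs in both directions. Here’s a toy, in-memory sketch of that logic. Everything in it — the `SocialCamera` class, the 50-metre range, the names — is invented for illustration; a real device would need an actual radio layer and an agreed wire format, which is exactly the standard Winer says the first mover would set.

```python
from dataclasses import dataclass, field

RANGE_METRES = 50.0  # stand-in for whatever radius the radio layer gives you


@dataclass
class SocialCamera:
    owner: str
    position: tuple          # (x, y) in metres; proximity stands in for radio range
    broadcasting: bool = True  # the opt-out switches Winer mentions
    receiving: bool = True
    roll: list = field(default_factory=list)  # pictures this camera holds


def within_range(a: SocialCamera, b: SocialCamera) -> bool:
    dx = a.position[0] - b.position[0]
    dy = a.position[1] - b.position[1]
    return (dx * dx + dy * dy) ** 0.5 <= RANGE_METRES


def take_picture(photographer: SocialCamera, image: bytes, nearby: list) -> None:
    """Capture locally, then broadcast the bits to consenting cameras in range."""
    photographer.roll.append(image)
    if not photographer.broadcasting:
        return
    for other in nearby:
        if other is not photographer and other.receiving and within_range(photographer, other):
            other.roll.append(image)


# The station-platform scene: a stranger across the tracks, Dave's camera
# in his knapsack, and a third camera well out of range.
platform = [
    SocialCamera("stranger", (0.0, 0.0)),
    SocialCamera("dave", (10.0, 0.0)),
    SocialCamera("far_away", (500.0, 0.0)),
]
take_picture(platform[0], b"station-photo", platform)
```

After the call, Dave’s roll holds a copy of the stranger’s picture while the distant camera’s roll stays empty — the pictures you bring home now include the ones taken around you.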

Dave wrote that in 2007, which is 35 Internet-years ago, and I remember thinking that it was a bit wacky when I first read it. But then I’m a serious (or at any rate an inveterate) photographer, and for me photographs are essentially private things — artefacts I create for my own satisfaction. Of course I am pleased if other people like them, but that’s just icing on the cake.

Over the years, though, my views changed. I bought an Eye-Fi card, which is basically just an SD card with onboard Wi-Fi. Stick it into your camera and it wirelessly transmits the images you take to a computer on the same Wi-Fi network. It’s fun (though a bit slow unless you keep the image size down) and can be useful at times — e.g. in event photography.

And then came the iPhone. The first two versions had crappy cameras, just like most other mobile phones. But from the 3GS model onwards, the iPhone cameras improved to the point where they’re almost as good as the better point-and-shoot digital compacts. Given the First Law of Photography, which is that the best camera is the one you have with you, and given that most people always carry a phone whereas only hardcore snappers like me always carry a separate camera, it was only a matter of time before the market for compact cameras began to feel Christensen-type disruption. And so it has proved, to the point where the most popular ‘camera’ amongst Flickr users is now the iPhone 4.

Now start joining up these dots — as Dave Winer did in this blog post — and you can see the glimmer of an intriguing possibility. Consider: if we accept that (i) the Facebook geeks are smart, (ii) social photography is Facebook’s addictive glue, (iii) cameras have morphed into cameraphones and (iv) Facebook recently paid an apparently insane amount of money for Instagram, then maybe the device that Zuck & Co are incubating is actually a camera which has photo-sharing built in. And if it also happens to make phone calls and send texts, well, that would be a bonus.

Neat, eh?

Dave Winer: still programming after all those years

Dave Winer (one of my heroes) has been programming for 37 years. He writes about that in reflective mood:

Some conclusions may be in order.

First, most people don’t program that long. The conventional wisdom is that you “move up” into management long before you’ve been coding for 37 years. Only thing is I don’t see programming as a job, I see it as a creative act. I drew a big circle shortly after I started, and said I was going to fill the circle. So until the circle is full, I still have more to do.

“Legacy”: the use and abuse of a term

The word “legacy” crops up a lot in discussions about innovation in cyberspace, so it was good to find a thoughtful essay about it by Stephen Page, current CEO of Faber & Faber, the eminent publishing house of which TS Eliot was famously a Director.

In any revolution, language matters. One powerful word in the digital revolution is “legacy”. There is a conscious attempt to employ the word pejoratively, to suggest that existing media businesses – publishers, in the case of books – are going to fail to make the leap to a new world. In common usage, the first meaning of legacy is an inheritance, or something handed down from the past. A second meaning, more specific and recent, denotes technological obsolescence, or dramatic business-model shift. These two meanings have been fused to imply inevitable irrelevance for those with history, especially in media. This is a sleight of hand that would be sloppy if it wasn’t so considered.

Let’s deal with technological obsolescence. Media businesses are not technology businesses, but they can be particularly affected by technology shifts. I run a so-called legacy publishing house, Faber & Faber. Most of our business is based on licensing copyrights from writers and pursuing every avenue to find readers and create value for those writers. We are agnostic about how we do this. For our first 80 years, we could only do it through print formats (books); now we can do it through books, ebooks, online learning (through our Academy courses), digital publishing (such as the Waste Land app) and the web. Technology shifts have tended to result in greater opportunity, not less.

Implicit, I suppose, in the pejorative use of the term legacy is that we at Faber, like other publishers, don’t get it – “it” being the new economy, the new rules. There is something in this, of course. It’s harder to transform an existing business into one with a new culture and cost structure than to start afresh. Any existing business, no matter what old-world strength it has, will fail if it is not bold enough to attack its own DNA where necessary. The ailing photography firm Eastman Kodak is widely cited as a recent example of this phenomenon. But this is business failure due to cultural stasis. There is nothing inevitable in failure for existing businesses, but they have particular issues to figure out: simply adhering to old business practices will lead to failure. Failure will not be because of technology, but through failure to react to technology. In fact, it could be regarded as squandering the opportunity of a beneficial legacy.

He’s right about the two meanings. A legacy can be a source of mindless complacency — the kind of mindset one finds in the trust-fund Sloanes who hang out in Belgravia and Chelsea. But it can also be a source of strength — as in the case of Faber, who seem to me to be approaching the challenges of digital technology with imagination and vision. For example, the wonderful Waste Land App produced by my friend Max Whitby and his colleagues at TouchPress required access to the Eliot papers and rights held by Faber. So they used their ‘legacy’ to add value to a digital product in a distinctive and valuable way.

But other legatees in publishing (and other content industries) have viewed their inheritances in different and less imaginative ways. Think, for example, of the way Stephen Joyce has relentlessly used his control of the Joyce estate to prevent imaginative uses of his grandfather’s works. (Mercifully, Ulysses is now finally out of copyright and therefore beyond Stephen’s baleful reach, which is what has enabled TouchPress to embark on an imaginative App based around a new edition that will come out later this year.) Or of the way some legatees have viewed their inheritances as guarantees that the digital revolution will never threaten their hold on a market.

Still, Mr Page is right: “legacy” is too often used as a term of patronising abuse by tech evangelists who think that they have “the future in their bones” (as C.P. Snow put it in his famous Rede Lecture all those years ago.)

Hypocrisy rules OK

What’s surprising about JPMorgan Chase’s admission that it ‘lost’ $2 billion in the trading of complex financial derivatives is not that it happened but what it demonstrates about how our democracies have been captured by a banking system which continues to thumb its nose at legislators. For Jamie Dimon, the chief executive of JPMorgan Chase who had to reveal the cock-up in a call to analysts, was — and no doubt remains — a fierce opponent of tighter regulation of the banking system, and especially of any rule that might constrain banks from unduly risky behaviour. “What Mr. Dimon did not say”, observed the New York Times,

is that the loss also occurred because of a continued lack, nearly four years after the crisis, of rules and regulators up to the task of protecting taxpayers and the economy from the excesses of too big to fail banks; and, yes, of protecting the banks from their executives’ and traders’ destructive risk-taking.

The fact that JPMorgan’s loss — which Mr. Dimon has warned could “easily get worse” — is not enough to topple the bank, is not the point. What matters is that JPMorgan, like the nation’s other big banks, is still engaged in activities that can provoke catastrophic losses. If policy makers do not strengthen reform, then luck is the only thing preventing another meltdown.

All of which brings to mind Mr Dimon’s ferocious opposition to the Dodd-Frank Act, which was the Congressional response to the banking catastrophe, compliance with which — he claimed — would cost his bank $400m-$600m annually. The Dodd-Frank Act was also the object of sustained ridicule by the Economist magazine in a long piece last February (which quoted Dimon’s estimate of the cost of compliance). The piece attacked what it portrayed as flaws in “the confused, bloated law passed in the aftermath of America’s financial crisis”, but failed to observe that one reason for the complexity of Dodd-Frank was the unconscionable complexity of the financial system that it was attempting to regulate.

The other thing that, strangely, escaped the notice of the Economist is the need to make a cost-benefit assessment of initiatives like Dodd-Frank. Sure, the costs it imposes on banks are no doubt heavy; sure, it will give employment to lawyers and form-fillers. But what about the costs that the banking meltdown has inflicted on our societies (as some readers of the Economist pointed out in letters to the Editor the following week)?

But perhaps the most acute angle on the irony of JPMorgan Chase’s screw-up is provided by a chart by Derek Thompson of The Atlantic.

The NYT points out that the Dodd-Frank Act also calls for new rules on derivatives — including transparent trading and requirements for banks to back their trades with collateral and capital. If such rules were in place, JPMorgan’s trades could not have escaped notice by regulators and market participants. In the face of heavy lobbying, the derivatives’ rules have also been delayed or watered down. But guess what?

There are now several bills in the House, with bipartisan support, to weaken the Dodd-Frank law on derivatives. One of those would let the banks avoid Dodd-Frank regulation by conducting derivatives deals through foreign subsidiaries. The JPMorgan loss was incurred in its London office, which doesn’t lessen the effect here.

Finally — and needless to say — Mitt Romney has called for repealing the Dodd-Frank Act. Sometimes, one wonders if there is any intelligent life left on earth.

The Disruptive Innovator

It’s not often that one comes across books that change the way one thinks. Examples that have had that kind of impact on me are Donald Schon’s The Reflective Practitioner: How Professionals Think in Action, Neil Postman’s The Disappearance of Childhood, Thomas Kuhn’s The Structure of Scientific Revolutions and Howard Gardner’s Frames of Mind: The Theory of Multiple Intelligences.

And of course Clayton Christensen’s The Innovator’s Dilemma, which has shaped my thinking about innovation ever since I read it many years ago. But although I knew his work, I knew very little about the man himself, which is why I found Larissa MacFarquhar’s New Yorker profile of him such riveting reading.

It appears in the May 14 issue and is, alas, behind the paywall, but the online summary gives a flavour of it.

In industry after industry, Christensen discovered, the new technologies that had brought the big, established companies to their knees weren’t better or more advanced—they were actually worse. The new products were low-end, dumb, shoddy, and in almost every way inferior. But the new products were usually cheaper and easier to use, and so people or companies who were not rich or sophisticated enough for the old ones started buying the new ones, and there were so many more of the regular people than there were of the rich, sophisticated people that the companies making the new products prospered. Christensen called these low-end products “disruptive technologies,” because, rather than sustaining technological progress toward better performance, they disrupted it.

It’s eerie how Christensen’s analysis still resonates. A few weeks ago, Kamal Munir from the Judge Business School gave a terrific talk in my Arcadia Seminar series at Cambridge University Library about his investigation of how Kodak fumbled the digital future. By any definition, Kodak was a great company which not only dominated its market, but had effectively created that market. And yet when the early digital cameras (like the Sony Mavica) arrived, the crappy technical quality of the images they produced was one of the factors that led Kodak to underestimate the threat that they would represent to its future. (Another factor was that the margins on digital photography were minuscule compared with the 70 per cent margins that Kodak was squeezing out of analog photography.)

And the innovation story goes on. We’re seeing it currently in the Higher Education business. Traditional universities are expensive and inefficient as teaching institutions, but most of them persist in believing that their USPs are such that scrappy online alternatives will never pose a serious threat. And it’s true that at the moment most online offerings are still pretty chaotic, variable and uncoordinated. But if Christensen’s analysis is correct, the challengers will eventually prove “good enough” for many customers (especially as the costs of traditional university courses continue to escalate) — with the result that he observed all those decades ago in industries like disk storage and steel-making. Caveat vendor.

Interestingly, MacFarquhar says that one of the people who first spotted Christensen’s work was Andy Grove:

One of the first C.E.O.s to understand the significance of Christensen’s idea was Andy Grove, the C.E.O. of Intel. Grove heard about it even before Christensen published his book, “The Innovator’s Dilemma,” in 1997. Intel brought out the Celeron chip, a cheap product that was ideal for the new low-end PCs, and within a year it had captured thirty-five per cent of the market. Soon afterward, Andy Grove stood up at the COMDEX trade show, in Las Vegas, holding a copy of “The Innovator’s Dilemma,” and told the audience that it was the most important book he’d read in ten years.

I’ve always thought that Grove was one of the most insightful CEOs of all time. He also understood the real significance of the Internet long before most people got it — as when he declared in 1999 that “in five years’ time all companies will be Internet companies or they won’t be companies at all”. What he meant was that the Net would become like the telephone or mains electricity: a utility that would transform the world in which everyone did business. Grove was much ridiculed for the declaration at the time. But he had the last laugh.

Bertrand Russell’s Ten Commandments

1. Do not feel absolutely certain of anything.

2. Do not think it worth while to proceed by concealing evidence, for the evidence is sure to come to light.

3. Never try to discourage thinking for you are sure to succeed.

4. When you meet with opposition, even if it should be from your husband or your children, endeavor to overcome it by argument and not by authority, for a victory dependent upon authority is unreal and illusory.

5. Have no respect for the authority of others, for there are always contrary authorities to be found.

6. Do not use power to suppress opinions you think pernicious, for if you do the opinions will suppress you.

7. Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.

8. Find more pleasure in intelligent dissent than in passive agreement, for, if you value intelligence as you should, the former implies a deeper agreement than the latter.

9. Be scrupulously truthful, even if the truth is inconvenient, for it is more inconvenient when you try to conceal it.

10. Do not feel envious of the happiness of those who live in a fool’s paradise, for only a fool will think that it is happiness.

From the consistently terrific Brain Pickings.