Joe Schumpeter and the truth about technology

This morning’s Observer column.

Pondering the role of entrepreneurship and innovation in this process, Schumpeter argued that capitalism renews itself in periodic waves of traumatic upheaval. He was not the first to have this idea, but he was the first to come up with a memorable term for the process: he called them waves of “creative destruction”.

We’re living through one such wave at the moment, but our public discourse about it is lopsided. That’s because the narrative tends to be dominated by enthusiasts and evangelists, by people who, like the “cybertheorists” Poole detests, tend to focus on the creative side of the Schumpeterian wave. At the same time, people who are sceptical or fearful about the new technology tend to be labelled – and sometimes derided – as luddites or technophobes.

The trouble is that Schumpeter meant what he said: innovation is a double-edged sword.

Three more cycles — and then what?

From the NYT Bits Blog.

The coming sensor innovations, said Bernard Meyerson, an I.B.M. scientist and vice president of innovation, are vital ingredients in what is called cognitive computing. The idea is that in the future computers will be increasingly able to sense, adapt and learn, in their way.

That vision, of course, has been around for a long time — a pursuit of artificial intelligence researchers for decades. But there seem to be two reasons that cognitive computing is something I.B.M., and others, are taking seriously these days. The first is that the vision is becoming increasingly possible to achieve, though formidable obstacles remain. I wrote an article in the Science section last year on I.B.M.’s cognitive computing project.

The other reason is a looming necessity. When I asked Dr. Meyerson why the five-year prediction exercise was a worthwhile use of researchers’ time, he replied that it helped focus thinking. Actually, his initial reply was a techie epigram. “In a nutshell,” he said, “seven nanometers.”

Dr. Meyerson, who has a Ph.D. in solid-state physics, was talking about the physical limits on the width of semiconductor circuits, when they can’t be shrunk any further. (The width of a human hair is roughly 80,000 nanometers.) Today, the most advanced chips have circuits 22 nanometers in width. Next comes 14 nanometers, then 10 and then 7, Dr. Meyerson said.
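The arithmetic behind those numbers is worth pausing on. Treating the published node names as literal feature widths (a simplification — node names are partly marketing labels), each shrink roughly squares up into a density gain, which is why running out of shrinks matters so much:

```python
# Illustrative arithmetic only: node names taken at face value as widths.
nodes_nm = [22, 14, 10, 7]   # the progression Dr. Meyerson describes
hair_nm = 80_000             # rough width of a human hair, as in the article

for width in nodes_nm:
    # Transistor area scales roughly with the square of the feature width,
    # so density relative to the 22 nm generation goes as (22 / width)^2.
    density_vs_22 = (nodes_nm[0] / width) ** 2
    print(f"{width:>2} nm: about {hair_nm / width:,.0f}x thinner than a hair, "
          f"roughly {density_vs_22:.1f}x the density of 22 nm")
```

On this back-of-envelope reckoning the three remaining shrinks buy about a tenfold density gain in total — and after that, as Meyerson says, the big knobs are gone.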

“We have three more cycles, and then the biggest knobs for improving performance in silicon are gone,” he said. “You have to change the architecture, use a different approach.”

“With a cognitive computer, you train it rather than program it,” Dr. Meyerson said.

Hmmm…

The tyranny of algorithms

This morning’s Observer column.

Keynes’s observation (in his General Theory) that “practical men who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist” needs updating. Replace “economist” with “algorithm”. And delete “defunct”, because the algorithms that now shape much of our behaviour are anything but defunct. They have probably already influenced your Christmas shopping, for example. They have certainly determined how your pension fund is doing, and whether your application for a mortgage has been successful. And one day they may effectively determine how you vote…

Nooks and crannies in eBooks

Lovely story which highlights one of the hidden affordances of eBooks:

Some weeks ago I decided that I wanted to read Tolstoy’s War and Peace. Lou Ann loaned me her copy. At more than 1100 pages, reading it in bed required as much strength as balancing a box of bricks in my hands. In my senior years I have developed arthritis in my thumbs, which made the effort not only difficult, but painful.

I had read about half of the novel when I was given the gift of a Nook, the e-reader from Barnes and Noble. Although I am committed to supporting my neighborhood independent book store (Books to be Red), and enjoying honest-to-goodness books, the .99 Nook edition was so lightweight that it has made reading War and Peace a genuine pleasure. For those of you who have not tackled this tome as yet, it is a page-turner.

As I was reading, I came across this sentence: “It was as if a light had been Nookd in a carved and painted lantern….” Thinking this was simply a glitch in the software, I ignored the intrusive word and continued reading. Some pages later I encountered the rogue word again. With my third encounter I decided to retrieve my hard cover book and find the original (well, the translated) text.

For the sentence above I discovered this genuine translation: “It was as if a light had been kindled in a carved and painted lantern….”

Someone at Barnes and Noble (a twenty-year-old employee? or maybe the CEO?) had substituted every instance of “kindled” with “Nookd”!

I was shocked. Almost immediately I found it hilarious…then outrageous…then both. It is definitely clever. But it raises many questions. E-books can be manipulated at will by the purveyors of the downloadable software. Here is a classic work of fiction (some claim it is the greatest novel ever written) used for a sophomoric and/or commercial prank. What else might be changed in an e-book? Fears of manipulation for economic, political, religious, or other ideological ends come to mind. It makes one wary of the integrity of any digital version of not only War and Peace…but any e-book.

Yep. Great blog post.
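The glitch looks like a classic unanchored find-and-replace: a case-insensitive substitution of the rival brand name run over the whole text, catching “kindled” along with any genuine “Kindle”. A couple of lines of Python show both the blunder and the word-boundary anchoring that would have avoided it (a sketch of the presumed mechanism, not Barnes and Noble’s actual tooling):

```python
import re

sentence = "It was as if a light had been kindled in a carved and painted lantern."

# A blunt, case-insensitive substitution of the kind that presumably ran
# over the whole book: it rewrites "kindled" along with any real "Kindle".
broken = re.sub(r"kindle", "Nook", sentence, flags=re.IGNORECASE)

# Anchoring the pattern to word boundaries leaves "kindled" untouched,
# because the trailing "d" means the match isn't a standalone word.
safer = re.sub(r"\bkindle\b", "Nook", sentence, flags=re.IGNORECASE)
```

The first call produces the “Nookd” sentence the reader stumbled over; the second leaves Tolstoy alone.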

Getting things done — with a keyboard

Last week I blogged about the Logitech Ultra-thin Keyboard, which doubles as a cover for the iPad. I’ve now been using it for over a week and am even more impressed. Today, for example, I had a long inter-city train journey during which the combination of the keyboard and the iPad’s battery life enabled me to get a really useful amount of writing and other stuff done.

From now on I’m not leaving home without it.

Thinking about the unthinkable

This morning’s Observer column.

Then Google launched its autonomous vehicle (aka self-driving car) project. By loading a perfectly ordinary Toyota Prius with $250,000-worth of sensors and computing equipment, the company created a vehicle that can safely navigate even congested road conditions. So far, these cars have logged something like half-a-million accident-free miles, which implies that robotic cars are actually far safer than ones driven by humans.

For me, the implication of the Google car is not necessarily that Kurzweil’s “singularity” is near, but that our assumptions about the potential of computers – and, therefore, artificial intelligence – urgently need revising. We need to think seriously about this stuff, along the lines demonstrated by the philosopher David Chalmers in a terrific paper, or by Erik Brynjolfsson and Andrew McAfee in their book, Race Against the Machine.

Planned Obsolescence v2.0

Nick Bilton had a perceptive piece in the New York Times about Apple’s product strategy.

Philip W. Schiller, Apple’s vice president for marketing, strode across the stage of the California Theater in San Jose last week trumpeting the virtues of new Apple products. As he caressed the side of the latest iMac personal computer, he noted how thin it was — five millimeters, 80 percent thinner than the last one. Then he said, with an air of surprise, as if he’d just thought of it: “Isn’t it amazing how something new makes the previous thing instantly look old?”

Umm, yes, Mr. Schiller, you design your products that way. It’s part of a strategy that Apple has perfected. How else can the company persuade people to replace their perfectly fine iPhone, iPad, iMac and iEverything else year after year?

It’s called planned obsolescence and it’s an old marketing trick. Mr Bilton traces it back to Brooks Stevens, an American industrial designer who specialised in automobile design in the 1950s. He’s the guy who inspired cosmetic changes (tail fins etc) on American gas-guzzlers of the period to ensure that new models always made their predecessors look dated.

But actually the idea goes back even further than that. Wikipedia traces it to Bernard London’s 1932 pamphlet entitled Ending the Depression Through Planned Obsolescence, the nub of which was that the government should impose legal obsolescence on consumer articles in order to stimulate and perpetuate consumption.

The funny thing about Apple’s strategy is how blatant it is. I have an iPhone 4 which is a perfectly satisfactory device, in the sense that it does everything I need from a phone. But with the launch of the iPhone 5 my handset has suddenly become the oldest iPhone that the company will support. It’s been scheduled for obsolescence, in other words, not because of any functional inadequacy but because its continuation threatens Apple’s corporate need to have me ‘upgrade’ to a device that I don’t actually need.

When researching his piece, Mr Bilton spoke to Don Norman, who is a real design guru IMHO and who observed that consumer electronics companies like Apple

have adopted the same marketing techniques the automobile industry perfected decades ago. Introduce fancy upgrades to the top and then, each year, push them down to lower-tiered products. This way, customers on every level feel the need to buy a newer version. “This is an old-time trick — they’re not inventing anything new,” he said. “Yet it’s to the detriment of the consumer and the environment, but perhaps to the betterment of the stockholder.”

He added: “For Apple, you forgot the other trick: change the plugs!” While the rest of the electronics industry has adopted micro-USB ports, Apple just changed the proprietary ports and plugs on all of its latest devices — laptops, iPads and iPhones included.

Spot on. We laugh derisively at our fathers’ (and grandfathers’) pathetic obsessions with tail-fins and chrome fittings. And then we contemplate the long queues of mugs lining up to buy the latest glass rectangle from Cupertino and ask: are we getting smarter?

Answer: no.

Fifty years on

Fifty years ago this month, many of us wondered if we were on the brink of nuclear Armageddon as the Kennedy Administration confronted the Soviet Union over the latter’s stationing of nuclear missiles in Cuba. The way JFK and his colleagues handled the crisis is probably the most studied case study in crisis management in history (see, for example, The Kennedy Tapes: Inside the White House During the Cuban Missile Crisis), but it’s still fascinating.

To mark the anniversary, the JFK Memorial Library has put together a remarkable web production which not only contains an excellent narrative of the evolution and resolution of the crisis, but also a riveting portfolio of documents, photographs, movies and audio recordings of the secret deliberations of Kennedy and his advisers. It takes time to absorb, but it’s worth it. And it’s a brilliant illustration of what the Web can do if used imaginatively.

Homage to Pandemonium

Diana Athill had a lovely piece in yesterday’s Guardian which starts like this:

When factory chimneys reared up during the Olympic opening ceremony I thought at once: “Pandaemonium – he must have read it” – then “Oh nonsense, it was published almost 30 years ago and one never sees it around nowadays.” But Danny Boyle had, indeed, read it. Humphrey Jennings’s great work did inspire an occasion with which nearly everyone in this country was going to fall in love.

It made me sit up because Humphrey Jennings also flashed into my mind when I watched the recording of the Opening Ceremony. (We were travelling on the night and so missed the live transmission.) Pandaemonium 1660-1886: The Coming of the Machine as Seen by Contemporary Observers has been one of my favourite books for years, and nestles on my bookshelves as a kind of antidote to the ravings of Paul Johnson (see picture). What astounded me when I first read it is how clearly and perceptively the people who lived through the first Industrial Revolution saw and understood what was happening. As we live through another industrial revolution, are we as perceptive? I don’t think so.

What I didn’t know until I read the Athill piece is that she had been its editor at Andre Deutsch. She writes knowledgeably (and movingly) about its genesis:

It came about when, as a thankyou to the people of a Welsh village where he had been making a film, Jennings gave a series of talks about the industrial revolution for which he collected extracts from many sources. From then on he never ceased collecting, and his purpose was clear: he was going to make a book presenting not the political or the economic history, but the human history of the industrial revolution. He would not describe or analyse; rather, people who had experienced it would show what it was like.

Jennings died before the book was published, so it was edited for publication by his daughter and his friend, Charles Madge. Jennings was a documentary film-maker, and in a way Pandaemonium is actually a film in print format. It lets its witnesses speak for themselves. It’s lovely to know that it has been revived and reissued.

Smart meters and dumb government

This morning’s Observer column.

Underpinning the argument for smart meters are a number of assumptions. One is that, if consumers know how much electricity they are using at any given moment, then they will become more careful about how they use it. Another is that smart metering will enable utility companies to vary the cost per unit on an hourly basis. So electricity might cost 2p a unit at 3am but 12p a unit at 6pm, when the nation gets home, starts cooking and switches on the TV. The combination of these two changes should mean that peak demand is reduced, thereby making operation of the grid easier and less wasteful.
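The time-of-use idea in that excerpt is easy to sketch. Here is a toy comparison of a flat tariff against one with the column’s 2p off-peak and 12p evening-peak rates; the flat rate, the peak windows and the household’s hourly usage are all made up for the illustration:

```python
# Toy tariff comparison. All rates and the usage profile are illustrative,
# not real utility figures.
FLAT_RATE = 0.07          # pounds per kWh under a flat tariff (assumed)
OFF_PEAK, PEAK = 0.02, 0.12   # the 2p and 12p rates from the column

def unit_price(hour):
    """2p a unit in the small hours, 12p at the evening peak, flat otherwise."""
    if 0 <= hour < 6:     # overnight
        return OFF_PEAK
    if 17 <= hour < 21:   # the nation gets home, cooks, switches on the TV
        return PEAK
    return FLAT_RATE

usage = [0.2] * 24        # kWh drawn in each hour; uniform for simplicity

flat_bill = sum(u * FLAT_RATE for u in usage)
tou_bill = sum(u * unit_price(h) for h, u in enumerate(usage))
```

With a perfectly uniform load the two bills come out close; the point of the scheme, of course, is that a smart-metered household would shift usage out of the 17:00–21:00 window and into the cheap overnight hours, flattening the peak.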

There’s a good case for rethinking the way we supply and charge for electricity, because if we go on as we are – with a dumb grid, dumb meters and accelerating demand – then we’ll eventually find ourselves with the problems that India experienced recently. And that doesn’t bear thinking about.

The problem is that the way the government is approaching the issue doesn’t exactly inspire confidence.