Plutocratic hysteria

Lovely NYTimes column by Paul Krugman about the hysterical reaction of America’s financial and political elite to the Occupy Wall Street demonstrations.

What’s going on here? The answer, surely, is that Wall Street’s Masters of the Universe realize, deep down, how morally indefensible their position is. They’re not John Galt; they’re not even Steve Jobs. They’re people who got rich by peddling complex financial schemes that, far from delivering clear benefits to the American people, helped push us into a crisis whose aftereffects continue to blight the lives of tens of millions of their fellow citizens.

Yet they have paid no price. Their institutions were bailed out by taxpayers, with few strings attached. They continue to benefit from explicit and implicit federal guarantees — basically, they’re still in a game of heads they win, tails taxpayers lose. And they benefit from tax loopholes that in many cases have people with multimillion-dollar incomes paying lower rates than middle-class families.

This special treatment can’t bear close scrutiny — and therefore, as they see it, there must be no close scrutiny. Anyone who points out the obvious, no matter how calmly and moderately, must be demonized and driven from the stage. In fact, the more reasonable and moderate a critic sounds, the more urgently he or she must be demonized, hence the frantic sliming of Elizabeth Warren.

So who’s really being un-American here? Not the protesters, who are simply trying to get their voices heard. No, the real extremists here are America’s oligarchs, who want to suppress any criticism of the sources of their wealth.

Right on!

Myles celebrated

Lovely celebration of Flann O’Brien by Roger Boylan.

He finished “At Swim-Two-Birds” when he was 28 and sent it off to Longmans, a London publisher, where by a rare stroke of good luck Graham Greene was reader. “I read it with continual excitement, amusement and the kind of glee one experiences when people smash china on the stage,” recalled Greene, who urged publication. From Paris, James Joyce, in a blurb written to help promote the book, pronounced its author “a real writer, with the true comic spirit.” O’Nolan was cautiously optimistic. But the cosmic balance was soon restored. War broke out and in 1940 the Luftwaffe destroyed the London warehouse in which the entire print run of the novel was stored; fewer than 250 had been sold. Then in 1941 Joyce, who had promised to help with publicity, suddenly died, along with O’Nolan’s hopes for the book. “[I]t must be a flop,” he wrote, wallowing in gloom. “I guess it is a bum book anyhow.”

In fact, it’s every bit the masterpiece Greene said it was—a thrilling mix of wild experimentation and traditional Irish storytelling. Stylistically, “At Swim-Two-Birds” runs the gamut from mock-epic … to a kind of arch naturalism… The narrative is divided into three parts, described with admiration by Jorge Luis Borges: “A student in Dublin writes a novel about the proprietor of a Dublin public house, who writes a novel about the habitués of his pub (among them, the student), who in their turn write novels in which proprietor and student figure along with other writers about other novelists.” It’s an intricate puzzle played for laughs, a novel simultaneously subversive of, and reverent towards, the Irish epic tradition. It was ten years before the Luftwaffe’s draconian edits were reversed and the book was reprinted…

Gutenberg to Zuckerberg: an interview

As many readers of this blog know, I have a new book coming out in January in which I try to distil what I think people should know about the Internet. My Open University colleague Monica Shelley has done an interview with me about it which has just gone on the departmental website. Here it is for anyone outside the firewall. The book has nine big ideas in it (seven plus or minus two in homage to George Miller). Monica wisely decided to focus on the most basic five ideas; otherwise she’d have been there all day. Thanks to her and to Joe Mills, who shot and edited the clip.

What’s significant about the new iPhone

This morning’s Observer column.

Tuesday would be – so the hype machine assured us – iPhone 5 day. But Tuesday came and went and it turned out to be only iPhone 4S day, and the assembled chorus drawn from the Apple-obsessed region of the blogosphere and the “analysts” of Wall Street howled their frustration. Which made one wonder what these people expected – an iPhone 5 that did teleportation? It also made one wonder if anyone on Wall Street has ever heard of the sigmoid function, the universal s-shaped learning curve that rises slowly from small beginnings, accelerates rapidly and then creeps slowly towards its maximum point.

The point is that the iPhone has been through the acceleration phase and is now at the point where it can only get incrementally better. What CEO Tim Cook and his colleagues announced on Tuesday represented an implicit acknowledgment of that reality: they announced an incrementally improved product…
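The column's argument rests on the shape of that curve. As a minimal sketch (the logistic function is the standard parameterisation of an s-curve; the steepness `k` and midpoint `t0` here are purely illustrative, not anything from the column), the year-on-year gains shrink once a product is past the steep middle of the curve:

```python
import math

def sigmoid(t, k=1.0, t0=0.0):
    """Logistic (s-shaped) curve: slow start, rapid middle, flattening top."""
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))

# Past the midpoint (t0), each successive step yields a smaller gain --
# the "incremental improvement" phase the column describes.
gains = [sigmoid(t + 1) - sigmoid(t) for t in range(5)]
```

Printing `gains` shows a strictly shrinking sequence: the curve is still rising, but each step adds less than the one before.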

Steve Jobs: commented

The Observer asked me to read Steve Jobs’s 2005 Stanford commencement address and add my comments to the text.

**The commencement address is one of the more venerable – and respectable – traditions of American academia, especially at elite universities such as Stanford and Harvard. Because Steve Jobs died at such a relatively young age (56) this is destined to be regarded as a classic. But it faces stiff competition – as the list maintained by humanity.org testifies. Jobs’s address is up against Barack Obama’s lecture to Wesleyan University in 2008, Elie Wiesel’s talk at DePaul University in 1997, Václav Havel’s lecture on “Civilisation’s Thin Veneer” at Harvard in 1995 and George Marshall’s address to the same university in 1947 – to list just four. But Jobs’s address has an unbearable poignancy just now, especially for those who knew him well. John Gruber, the blogger and technology commentator, saw him fairly recently and observed: “He looked old. Not old in a way that could be measured in years or even decades, but impossibly old. Not tired, but weary; not ill or unwell, but rather, somehow, ancient. But not his eyes. His eyes were young and bright, their weapons-grade intensity intact.” The address also reveals something of Jobs’s humanity, something that tended to get lost in the afterglow of Apple’s astonishing corporate resurgence.**

LATER: In my comments I related one of my favourite stories about Jobs — the one where he drops the first iPod prototype in a fish-tank to demonstrate that it’s too big. Frank Stajano emailed to say that it may be apocryphal — he’d heard it many years ago about Akio Morita and Sony’s Walkman. In trying to check I found this nice piece by D.B. Grady, who also tells the story but cautions “I have no way of knowing if it is true, so take it for what it’s worth. I think it nicely captures the man who changed the world four times over.”

Agreed. As the Italians say, if it ain’t true then it ought to be. (Hmmm… on reflection, I can’t find a source for that adage either. Apologies if I’ve been rude to the citizens of that lovely country.)

His Steveness: the flip side

When I was a kid I was brought up to believe that one should never speak ill of the dead, at least in the immediate aftermath of their demise. I made an exception for Charlie Haughey, but then so did many others. In the last two days we’ve seen an avalanche of affectionate, admiring stuff about Steve Jobs, and most of it has — understandably — tended to gloss over the fact that no omelette was ever made without breaking eggs, and no great corporate height has ever been scaled without cracking some heads.

So it’s been interesting to see two more detached assessments of Jobs emerge. The first, by John Cassidy in the New Yorker, takes issue with the idea that Jobs was an ‘artist’. If he was, he writes,

he was a great artist only in the sense that Bob Dylan and Andy Warhol are great artists: talented jackdaws who took other people’s half-baked innovations and converted them into beautifully made products with mass appeal. Apple didn’t build the first desktop computer based on a microprocessor: the Micral N and the MITS Altair predated the landmark Apple II. Steve Jobs didn’t create the mouse, either: he lifted it from a version he saw at the Xerox Parc research center in Palo Alto. George Lucas, and not Jobs, created Pixar. The Nomad Jukebox, a digital music player made by a company from Singapore, predated the iPod.

Jobs’s real genius was seeing, before practically anybody else, that the computer industry was melding with the consumer-goods industry, and that success would go to products that were useful and well designed, but also nice to look at and cleverly branded. He took genuine innovations and improved upon them. The Apple Macintosh, released in 1984, was the first PC that didn’t look like it belonged in the basement of the campus science center surrounded by math books and used pizza boxes. The iBook used bright colors to make laptops look cool. The iPod, unlike the Nomad, was sleek and light enough to carry around in your pocket. In a 1996 PBS documentary called “Triumph of the Nerds,” Jobs himself said, “We have always been shameless about stealing great ideas.”

Unlike Thomas Edison, to whom he has been compared, Jobs wasn’t really an inventor. In fact, by the standards of Silicon Valley, he wasn’t really a techie at all.

Cassidy thinks that Jobs is best categorised as a “hippie capitalist”.

Gawker, as you might expect, has few scruples about raining on the Jobs parade. In a post with a giveaway title — “what-everyone-is-too-polite-to-say-about-steve-jobs” — it lays into Jobs for censorship and authoritarianism, having products manufactured in Chinese sweatshops, and having a tyrannical managerial style.

I guess there will be more in this vein over the next few months.