Archive for the 'Observer' Category

Celebrating Dave Winer

Sunday, October 12th, 2014

This morning’s Observer column:

Twenty years ago this week, a software developer in California ushered in a new era in how we communicate. His name is Dave Winer and on 7 October 1994 he published his first blog post. He called it DaveNet then, and he’s been writing it most days since. In the process, he has become one of the internet’s elders, as eminent in his way as Vint Cerf, Dave Clark, Doc Searls, Lawrence Lessig, David Weinberger or even Tim Berners-Lee.

When you read his blog, Scripting News – as I have been doing for 20 years – you’ll understand why, because he’s such a rare combination of talents and virtues. He’s a technically gifted software developer, for example. Many years ago he wrote one of the smartest programs that ever ran on the Apple II, the IBM PC and the first Apple Mac – an outliner called ThinkTank, which changed the way many of us thought about the process of writing. After that, Winer wrote the first proper blogging software, invented podcasting and was one of the developers of RSS, the automated syndication system that constitutes the hidden wiring of the blogosphere. And he’s still innovating, still pushing the envelope, still writing great software.

Technical virtuosity is not what makes Winer one of the world’s great bloggers, however. Equally important is that he is a clear thinker and writer, someone who is politically engaged, holds strong opinions and believes in engaging in discussion with those who disagree with him. And yet the strange thing is that this opinionated, smart guy is also sensitive: he gets hurt when people write disparagingly about him, but he also expresses that hurt in a philosophical way…

Read on

Even if you’re not on Facebook, you are still the product

Sunday, October 5th, 2014

This morning’s Observer column:

The old adage “if the service is free, then you are its product” needs updating. What it signified was that web services (like Facebook, Google, Yahoo et al) that do not charge users make their money by harvesting personal and behavioural data relating to those users and selling that data to advertisers. That’s still true, of course. But a more accurate version of the adage would now read something like this: if you use the web for anything (including paying for stuff) then you are also the product, because your data is being sold on to third parties without your knowledge.

In a way, you probably already knew this. A while back you searched for, say, a digital camera on the John Lewis site. And then you noticed that wherever you went on the web after that, John Lewis ads for cameras kept appearing on the sites you were visiting. What you were witnessing was the output of a multibillion-dollar industry that operates below the surface of the web. Think of it as the hidden wiring of our networked world. And what it does is track you wherever you go online…

Read on

So is it good to talk… again?

Sunday, September 28th, 2014

This morning’s Observer column:

To the technology trade, I am what is known as an “early adopter” (translation: gadget freak, mug, sucker). I had a mobile phone in the mid-1980s, for example, when they were still regarded as weird. It was the size of a brick, cost the best part of a grand and exposed me to ridicule whenever I took it out in public. But I didn’t care because the last Soviet president, Mikhail Gorbachev, used the same phone and he was cool in those days. Besides, it had always seemed absurd to me that phones should be tethered to the wall, like goats. I still have that Nokia handset, by the way: it sits at the bottom of a drawer and I sometimes take it out to show my grandchildren what phones used to be like.

Over the decades since, I have always had latest-model phones – just like all the other early adopters. And of course I used them to make phone calls because basically that’s all you could do with those devices. (Well, almost all: one of mine had an FM radio built in.) And then in 2007 Steve Jobs launched the iPhone and the game changed. Why? Because the Apple device was really just a powerful computer that you could hold in your hand. And it was a real computer; its operating system was a derivative of BSD, the version of Unix developed by Bill Joy when he was a graduate student at Berkeley. (Note for non-techies: Unix is to Windows as a JCB is to a garden trowel.)

The fact that the iPhone could also make voice calls seemed, suddenly, a trivial afterthought. What mattered was that it provided mobile access to the internet. And that it could run programs, though it called them apps…

Read on

Tech bubble: does it matter?

Sunday, September 21st, 2014

This morning’s Observer column:

If one wanted to be critical, the most annoying thing about the current bubble is the way the visions and ambitions of startup founders seem to have narrowed. Many of them claim, of course, that what they want to build is a company that in the long term will transform the world or disrupt a particular market. But in fact their strategy is to create a product or a service that is sufficiently interesting or annoying to induce Google, Amazon, Facebook, Yahoo or Microsoft to buy the upstart venture. The poster child for this is WhatsApp, a fine company with a viable business model that did not depend on monitoring users and which was run by a chap who fervently declared his resolve to build a great, sustainable enterprise that treated its users well. And he doubtless believed that right up to the moment that Facebook offered him $19bn. And who can blame him: you only live once, after all.

At the end of the day, though, what’s much more worrying than the spectacle of venture capitalists blowing investors’ money is the fact that everywhere state funding for the kind of long-term, fundamental research that is needed to produce the technologies of tomorrow has been shrinking. The current wave of innovation and economic development enabled by the internet has only come about because, nearly 50 years ago, the US government funded the project that produced first the Arpanet and then the internet.

Private enterprise would undoubtedly have produced computer networks, but it would not have created the free and open platform for “permissionless innovation” that we got as a result of public investment. And we would have all been poorer as a result.

Read on

Why Apple Pay was the big news from Apple

Sunday, September 14th, 2014

This morning’s Observer column:

In the long view of history, though, the innovation that may be seen as really significant is Apple Pay – an ingenious blend of contactless payment technology with security features that are baked into the new iPhones. Apple Pay will, burbled Tim Cook, “forever change the way all of us buy things… it’s what makes the iPhone 6 the biggest advancement in the history of iPhones”.

The idea is to do away with the rigmarole of having to pull out a credit/debit card, insert it in a store’s card reader, type a pin, etc. Instead, you simply bump your iPhone (and, eventually, your Apple Watch) against the store’s contactless reader and – bingo! – you’ve paid, and the store never gets to see your card. Why? Because Apple has stored the card details in heavily encrypted form on your device and assigned each card a unique, device-specific number, which is accepted by the retailer’s contactless reader.

This only works, of course, if the retailer has already signed up with Apple. Cook claimed that 220,000 US retailers have already opted in to the system, as well as six major banks, plus MasterCard, Visa and American Express – which means that 83% of all US credit card payment volume can theoretically already be handled by Apple Pay.

If true, this is a really big deal, because it puts Apple at the heart of an unimaginable volume of financial transactions. In a way, the company is now doing to the card payment business what it did to the music business with the iTunes store…

Read on

Celebgate: what it tells us about us

Sunday, September 7th, 2014

My Observer Comment piece on the stolen selfies.

Ever since 1993, when Mosaic, the first graphical browser, transformed the web into a mainstream medium, the internet has provided a window on aspects of human behaviour that are, at the very least, puzzling and troubling.

In the mid-1990s, for example, there was a huge moral panic about online pornography, which led to the 1996 Communications Decency Act in the US, a statute that was eventually deemed unconstitutional by the Supreme Court. But when I dared to point out at the time in my book (A Brief History of the Future: The Origins of the Internet) that if there was a lot of pornography on the net (and there was) then surely that told us something important about human nature rather than about technology per se, this message went down like a lead balloon.

It still does, but it’s still the important question. There is abundant evidence that large numbers of people behave appallingly when they are online. The degree of verbal aggression and incivility in much online discourse is shocking. It’s also misogynistic to an extraordinary degree, as any woman who has a prominent profile in cyberspace will tell you…

Read on

Why Facebook is for ice buckets and Twitter is for what’s actually going on

Saturday, September 6th, 2014

Tomorrow’s Observer column:

Ferguson is a predominantly black town, but its police force is predominantly white. Shortly after the killing of Michael Brown, bystanders were recording eyewitness interviews and protests on smartphones and linking to the resulting footage from their Twitter accounts. News of the killing spread like wildfire across the US, leading to days of street confrontations between protesters and police and the imposition of something very like martial law. The US attorney general eventually turned up and the FBI opened a civil rights investigation. For days, if you were a Twitter user, Ferguson dominated your tweetstream, to the point where one of my acquaintances, returning from a holiday off the grid, initially inferred from the trending hashtag “#ferguson” that Sir Alex had died.

There’s no doubt that Twitter played a key role in elevating a local killing into national and international news. (Even Putin’s staff had some fun with it, offering to send human rights observers.) More than 3.6m Ferguson-related tweets were sent between 9 August, the day Brown was killed, and 17 August.

Three cheers for social media, then?

Not quite…

Read on

Dave Eggers has seen the future. Well, a possible future anyway…

Monday, September 1st, 2014

Yesterday’s Observer column:

Fifteen months have passed since Edward Snowden began to explain to us how our networked world works. During that time there has been much outrage, shock, horror, etc expressed by the media and the tech industry. So far, so predictable. What is much more puzzling is how relatively relaxed the general public appears to be about all this. In Britain, for example, opinion polling suggests that nearly two thirds of the population think that the kind of surveillance revealed by Snowden is basically OK.

To some extent, the level of public complacency/concern is culturally determined. Citizens of Germany, for example…

Read on

The consolations of error

Monday, August 25th, 2014

Lots of Observer readers have been writing to the Readers’ Editor (and emailing me directly) castigating me for claiming in my essay that Robert Capa’s D-Day Landing pictures were shot using a Leica camera. They maintain — as does Wikipedia — that he was using a Contax II rangefinder on the day, so I’m clearly in error on that point.

There is more disagreement about whether Capa’s famous Spanish Civil War photographs were shot with a Leica. There’s a photograph of him from the time in which he’s carrying a movie camera, with a stills camera in a leather case hanging round his neck. Not being an expert on camera cases, I don’t know whether it’s a Leica case or a Contax one. I guess Capa himself, a guy who covered five major wars, would have regarded this controversy as trivial. But it’s the kind of detail that we obsessives obsess about!

I wish I’d taken the trouble to check the D-Day assertion, but I guess because Capa had been one of the founder-members of Magnum I lazily assumed he had also been a Leica user. Myths endure because nobody checks. Mea culpa.

The same is true for the myths about Dorothy Parker, who is famous for being a world-class wisecracker. In an aside in the piece I claimed that she had reviewed Christopher Isherwood’s I Am A Camera with the crack “Me no Leica”. But, as many readers pointed out, the credit belongs elsewhere — with the theatre critic Walter Kerr. One of his most famous reviews was his three-word summary of John Van Druten’s I Am A Camera in 1951: “Me no Leica”.

Parker has an enviable trove of wisecracks attributed to her, and she was an exceedingly funny (and exceedingly sad) lady. But in at least one other case she gets more credit than she deserves. When Robert Benchley came to her and said “Calvin Coolidge is dead”, she famously replied, “How could they tell?”, and this has gone down in history as an example of her wit. What’s not so well known, however, is that Benchley replied “He had an erection”, but this was deemed too scandalous for polite society at the time and so Parker’s punchline was the one that endured. Benchley’s widow allegedly went to her deathbed infuriated by the fact that her husband hadn’t got the credit he was due for that exchange.

Still, Eric Clapton hasn’t written in (yet) to say that he does sometimes remember to take the lens cap off his M8. And nobody from the Royal Household has been in touch to say that Her Majesty has, on occasion, forgotten to remove the cap on her M3.

The Leica phenomenon

Sunday, August 24th, 2014

Photograph by Antonio Olmos for the Observer.

My Observer essay marking the centenary of the Leica camera.

I’m a photographer. No, let me rephrase that: I would like to be a photographer. In reality I’m merely an obsessive who takes lots of photographs in the hope that some day, just once, he will produce an image that is really, truly memorable. Like the images that Henri Cartier-Bresson captured, apparently effortlessly, in their thousands. Think, for example, of his famous picture of the guy leaping over a puddle; or the one of the two stout couples enjoying a picnic on the banks of the Marne; or his magical picture of a cheeky young boy carrying two bottles of red wine on the Rue Mouffetard in 1954. I like this last one particularly, because the lad in the photograph is about the same age as I was then and I often wonder if he’s still around, and what he looks like now.

You can think about this obsessiveness, this quest for the one perfect picture, as a kind of illness. If so, then I’ve had it for more than half a century. And I’m not the only sufferer…

Read on