The future in your pocket

This morning’s Observer column:

If a year is a long time in politics (and it is), then it’s an eternity in communications technology. Fourteen years ago, about 400 million people were using the internet. Today, the number of net users is pushing the 3 billion mark. But that’s not the really big news. What’s truly startling is that 2 billion of these folks are getting their internet connections primarily via smartphones, ie, handheld computers that can access the internet as well as make voice calls, send text messages and do the other things that old-fashioned “feature phones” could do.

This is startling because smartphones are a relatively new development, and when they first appeared less than a decade ago, most of us thought that they would remain an elite consumer product for a long time to come, staples of affluent professionals in the industrialised world, perhaps, but of no relevance to poor people in the developing world who would continue to be delighted with crude feature phones that could just about do SMS.

How wrong can you be? We underestimated both the power of Moore’s law and human nature…

Read on

How the network is evolving

This morning’s Observer column:

Earlier this year, the engineer Dr Craig Labovitz testified before the US House of Representatives judiciary subcommittee on regulatory reform, commercial and antitrust law. Labovitz is co-founder and chief executive of Deepfield, an outfit that sells software to enable companies to compile detailed analytics on traffic within their computer networks. The hearing was on the proposed merger of Comcast and Time Warner Cable and the impact it was likely to have on competition in the video and broadband market. In the landscape of dysfunctional, viciously partisan US politics, this hearing was the equivalent of rustling in the undergrowth, and yet in the course of his testimony Labovitz said something that laid bare the new realities of our networked world…

Read on…

More…

Wired had an interesting series about this shift, the first episode of which has a useful graphic illustrating the difference between most people’s mental model of the internet and the emerging reality.

Celebrating Dave Winer

This morning’s Observer column:

Twenty years ago this week, a software developer in California ushered in a new era in how we communicate. His name is Dave Winer and on 7 October 1994 he published his first blog post. He called it DaveNet then, and he’s been writing it most days since then. In the process, he has become one of the internet’s elders, as eminent in his way as Vint Cerf, Dave Clark, Doc Searls, Lawrence Lessig, Dave Weinberger or even Tim Berners-Lee.

When you read his blog, Scripting News – as I have been doing for 20 years – you’ll understand why, because he’s such a rare combination of talents and virtues. On the technical side, for example, he’s a very gifted software developer. Many years ago he wrote one of the smartest programs that ever ran on the Apple II, the IBM PC and the first Apple Mac – an outliner called ThinkTank, which changed the way many of us thought about the process of writing. After that, Winer wrote the first proper blogging software, invented podcasting and was one of the developers of RSS, the automated syndication system that constitutes the hidden wiring of the blogosphere. And he’s still innovating, still pushing the envelope, still writing great software.

Technical virtuosity is not what makes Winer one of the world’s great bloggers, however. Equally important is that he is a clear thinker and writer, someone who is politically engaged, holds strong opinions and believes in engaging in discussion with those who disagree with him. And yet the strange thing is that this opinionated, smart guy is also sensitive: he gets hurt when people write disparagingly about him, but he also expresses that hurt in a philosophical way…

Read on
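For readers who have never peeked under the bonnet, RSS is far simpler than its importance suggests: a feed is just an XML file listing a site’s recent items (title, link, date), which blog readers and podcast clients poll and parse. Here is a minimal sketch in Python using only the standard library; the feed content below is invented purely for illustration.

```python
# A minimal illustration of the RSS idea: a feed is just XML listing items
# (title, link, date) that any reader can fetch and parse. The feed content
# here is invented for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0">
  <channel>
    <title>Example blog</title>
    <link>http://example.com/</link>
    <item>
      <title>First post</title>
      <link>http://example.com/first-post</link>
      <pubDate>Fri, 07 Oct 1994 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(SAMPLE_FEED)
for item in root.iter("item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```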

Even if you’re not on Facebook, you are still the product

This morning’s Observer column:

The old adage “if the service is free, then you are its product” needs updating. What it signified was that web services (like Facebook, Google, Yahoo et al) that do not charge users make their money by harvesting personal and behavioural data relating to those users and selling that data to advertisers. That’s still true, of course. But a more accurate version of the adage would now read something like this: if you use the web for anything (including paying for stuff) then you are also the product, because your data is being sold on to third parties without your knowledge.

In a way, you probably already knew this. A while back you searched for, say, a digital camera on the John Lewis site. And then you noticed that, wherever you went on the web after that, John Lewis ads for cameras kept appearing on whatever site you were visiting. What you were witnessing was the output of a multibillion-dollar industry that operates below the surface of the web. Think of it as the hidden wiring of our networked world. And what it does is track you wherever you go online…

Read on
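The mechanics behind that experience are mundane: a tracking firm’s code is embedded on thousands of otherwise unrelated sites, recognises your browser by a cookie (or a fingerprint), and stitches your visits into a single profile that can be auctioned to advertisers. The toy Python sketch below simulates the idea only; the site names and matching logic are invented, not how any particular ad network actually works.

```python
# A toy simulation of third-party tracking: one tracking company is embedded
# on many unrelated sites, recognises the same browser via its cookie, and
# builds a cross-site profile that advertisers can use for retargeting.
# The sites and products named here are purely illustrative.
import uuid
from collections import defaultdict

class Tracker:
    def __init__(self):
        self.profiles = defaultdict(list)     # cookie id -> pages seen

    def see(self, cookie, page):
        """Called whenever a page embedding the tracker loads."""
        cookie = cookie or str(uuid.uuid4())   # set a cookie on first sighting
        self.profiles[cookie].append(page)
        return cookie                          # the browser stores it for next time

    def ads_for(self, cookie):
        """Pick ads based on what this browser has looked at elsewhere."""
        history = self.profiles.get(cookie, [])
        return [page for page in history if "camera" in page]

tracker = Tracker()
cookie = None
cookie = tracker.see(cookie, "retailer.example/search?q=digital+camera")
cookie = tracker.see(cookie, "news.example/front-page")
print(tracker.ads_for(cookie))   # the news site can now show you camera ads
```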

So is it good to talk… again?

This morning’s Observer column.

To the technology trade, I am what is known as an “early adopter” (translation: gadget freak, mug, sucker). I had a mobile phone in the mid-1980s, for example, when they were still regarded as weird. It was the size of a brick, cost the best part of a grand and exposed me to ridicule whenever I took it out in public. But I didn’t care because the last Soviet president, Mikhail Gorbachev, used the same phone and he was cool in those days. Besides, it had always seemed absurd to me that phones should be tethered to the wall, like goats. I still have that Nokia handset, by the way: it sits at the bottom of a drawer and I sometimes take it out to show my grandchildren what phones used to be like.

Over the decades since, I have always had latest-model phones – just like all the other early adopters. And of course I used them to make phone calls because basically that’s all you could do with those devices. (Well, almost all: one of mine had an FM radio built in.) And then in 2007 Steve Jobs launched the iPhone and the game changed. Why? Because the Apple device was really just a powerful computer that you could hold in your hand. And it was a real computer; its operating system was a derivative of BSD, the version of Unix developed by Bill Joy when he was a graduate student at Berkeley. (Note for non-techies: Unix is to Windows as a JCB is to a garden trowel.)

The fact that the iPhone could also make voice calls seemed, suddenly, a trivial afterthought. What mattered was that it provided mobile access to the internet. And that it could run programs, though it called them apps…

Read on

Tech bubble: does it matter?

This morning’s Observer column.

If one wanted to be critical, the most annoying thing about the current bubble is the way the visions and ambitions of startup founders seem to have narrowed. Many of them claim, of course, that what they want to build is a company that in the long term will transform the world or disrupt a particular market. But in fact their strategy is to create a product or a service that is sufficiently interesting or annoying to induce Google, Amazon, Facebook, Yahoo or Microsoft to buy the upstart venture. The poster child for this is WhatsApp, a fine company with a viable business model that did not depend on monitoring users and which was run by a chap who fervently declared his resolve to build a great, sustainable enterprise that treated its users well. And he doubtless believed that right up to the moment that Facebook offered him $19bn. And who can blame him: you only live once, after all.

At the end of the day, though, what’s much more worrying than the spectacle of venture capitalists blowing investors’ money is the fact that state funding for the kind of long-term, fundamental research needed to produce the technologies of tomorrow is shrinking everywhere. The current wave of innovation and economic development enabled by the internet has only come about because, half a century ago, the US government funded the project that produced first the Arpanet and then the internet.

Private enterprise would undoubtedly have produced computer networks, but it would not have created the free and open platform for “permissionless innovation” that we got as a result of public investment. And we would have all been poorer as a result.

Read on

Why Apple Pay was the big news from Apple

This morning’s Observer column:

In the long view of history, though, the innovation that may be seen as really significant is Apple Pay – an ingenious blend of contactless payment technology with security features that are baked into the new iPhones. Apple Pay will, burbled Tim Cook, “forever change the way all of us buy things… it’s what makes the iPhone 6 the biggest advancement in the history of iPhones”.

The idea is to do away with the rigmarole of having to pull out a credit/debit card, insert it into a store’s card reader, type in a PIN, etc. Instead, you simply bump your iPhone (and, eventually, your Apple Watch) against the store’s contactless reader and – bingo! – you’ve paid, and the store never gets to see your card. Why? Because Apple has stored the card details in heavily encrypted form on your device and assigned each card a unique, device-specific number, which is accepted by the retailer’s contactless reader.

This only works, of course, if the retailer has already signed up with Apple. Cook claimed that 220,000 US retailers have already opted in to the system, as well as six major banks, plus MasterCard, Visa and American Express – which means that 83% of all US credit card payment volume can theoretically already be handled by Apple Pay.

If true, this is a really big deal, because it puts Apple at the heart of an unimaginable volume of financial transactions. In a way, the company is now doing to the card payment business what it did to the music business with the iTunes store…

Read on
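The interesting part technically is tokenisation: the phone never hands the shop your card number at all, only a device-specific stand-in that the card network can map back to the real card (the real system also adds a one-time cryptogram for each transaction). Here is a toy Python sketch of the concept; it is not Apple’s actual protocol, and the class names and card number are invented for illustration.

```python
# A toy sketch of payment tokenisation, the idea behind the scheme described
# above: the phone holds a device-specific token instead of the card number,
# and only the card network can map that token back to the real card.
# This illustrates the concept; it is not Apple's actual protocol.
import secrets

class CardNetwork:
    def __init__(self):
        self._token_to_card = {}               # token -> real card number

    def enrol(self, card_number):
        """Issue a device-specific token when a card is added to the phone."""
        token = "tok_" + secrets.token_hex(8)
        self._token_to_card[token] = card_number
        return token

    def settle(self, token, amount):
        """Only the network resolves the token; the shop never sees the card."""
        card = self._token_to_card[token]
        return f"charged {amount} to card ending {card[-4:]}"

network = CardNetwork()
token = network.enrol("4929123456781234")      # done once, when adding the card
print("retailer sees only:", token)            # tap-to-pay presents the token
print(network.settle(token, "£2.40"))
```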

Celebgate: what it tells us about us

My Observer Comment piece on the stolen selfies.

Ever since 1993, when Mosaic, the first popular graphical browser, transformed the web into a mainstream medium, the internet has provided a window on aspects of human behaviour that are, at the very least, puzzling and troubling.

In the mid-1990s, for example, there was a huge moral panic about online pornography, which led to the 1996 Communications Decency Act in the US, a statute that was eventually deemed unconstitutional by the Supreme Court. But when I dared to point out at the time, in my book A Brief History of the Future: The Origins of the Internet, that if there was a lot of pornography on the net (and there was) then surely that told us something important about human nature rather than about technology per se, this message went down like a lead balloon.

It still does, but it remains the important question. There is abundant evidence that large numbers of people behave appallingly when they are online. The degree of verbal aggression and incivility in much online discourse is shocking. It’s also misogynistic to an extraordinary degree, as any woman who has a prominent profile in cyberspace will tell you…

Read on

Why Facebook is for ice buckets and Twitter is for what’s actually going on

Tomorrow’s Observer column:

Ferguson is a predominantly black town, but its police force is predominantly white. Shortly after the killing of Michael Brown, bystanders were recording eyewitness interviews and protests on smartphones and linking to the resulting footage from their Twitter accounts. News of the killing spread like wildfire across the US, leading to days of street confrontations between protesters and police and the imposition of something very like martial law. The US attorney general eventually turned up and the FBI opened a civil rights investigation. For days, if you were a Twitter user, Ferguson dominated your tweetstream, to the point where one of my acquaintances, returning from a holiday off the grid, initially inferred from the trending hashtag “#ferguson” that Sir Alex Ferguson had died.

There’s no doubt that Twitter played a key role in elevating a local killing into national and international news. (Even Putin’s staff had some fun with it, offering to send human rights observers.) More than 3.6m Ferguson-related tweets were sent between 9 August, the day Brown was killed, and 17 August.

Three cheers for social media, then?

Not quite…

Read on

Dave Eggers has seen the future. Well, a possible future anyway…

Yesterday’s Observer column.

Fifteen months have passed since Edward Snowden began to explain to us how our networked world works. During that time there has been much outrage, shock, horror, etc expressed by the media and the tech industry. So far, so predictable. What is much more puzzling is how relatively relaxed the general public appears to be about all this. In Britain, for example, opinion polling suggests that nearly two thirds of the population think that the kind of surveillance revealed by Snowden is basically OK.

To some extent, the level of public complacency/concern is culturally determined. Citizens of Germany, for example…

Read on