Hofstadter’s Law and self-driving cars

This morning’s Observer column:

In 1979, Douglas Hofstadter, an American cognitive scientist, formulated a useful general rule that applies to all complex tasks. Hofstadter’s law says that “It always takes longer than you expect, even when you take into account Hofstadter’s law”. It may not have the epistemological status of Newton’s first law, but it is “good enough for government work”, as the celebrated computer scientist Roger Needham used to say.

Faced with this assertion, readers of Wired magazine, visitors to Gizmodo or followers of Rory Cellan-Jones, the BBC’s sainted technology correspondent, will retort that while Hofstadter’s law may apply to mundane activities such as building a third runway at Heathrow, it most definitely does not apply to digital technology, where miracles are routinely delivered at the speed of light…

Read on

Google, Facebook and the power to nudge users

This morning’s Observer column:

Thaler and Sunstein describe their philosophy as “libertarian paternalism”. What it involves is a design approach known as “choice architecture” and in particular controlling the default settings at any point where a person has to make a decision.

Funnily enough, this is something that the tech industry has known for decades. In the mid-1990s, for example, Microsoft – which had belatedly realised the significance of the web – set out to destroy Netscape, the first company to create a proper web browser. Microsoft did this by installing its own browser – Internet Explorer – on every copy of the Windows operating system. Users were free to install Netscape, of course, but Microsoft relied on the fact that very few people ever change default settings. For this abuse of its monopoly power, Microsoft was landed with an antitrust suit that nearly resulted in its breakup. But it did succeed in destroying Netscape.

When the EU introduced its General Data Protection Regulation (GDPR) – which seeks to give internet users significant control over uses of their personal data – many of us wondered how data-vampires like Google and Facebook would deal with the implicit threat to their core businesses. Now that the regulation is in force, we’re beginning to find out: they’re using choice architecture to make it as difficult as possible for users to do what is best for them while making it easy to do what is good for the companies.

We know this courtesy of a very useful 43-page report just out from the Norwegian Consumer Council, an organisation funded by the Norwegian government…

Read on

Zuckerberg’s monster

My Observer review of Siva Vaidhyanathan’s Anti-social Media: How Facebook Disconnects Us and Undermines Democracy:

The best metaphor for Facebook is the monster created by Dr Frankenstein. Mary Shelley’s story shows how, as Fiona Sampson put it in a recent Guardian article, “aspiration and progress are indistinguishable from hubris – until something goes wrong, when suddenly we see all too clearly what was reasonable endeavour and what overreaching”. There are clear echoes of this in the evolution of Facebook. “It’s a story”, writes Siva Vaidhyanathan in this excellent critique, “of the hubris of good intentions, a missionary spirit and an ideology that sees computer code as the universal solvent for all human problems. And it’s an indictment of how social media has fostered the deterioration of democratic and intellectual culture around the world.”

Facebook was founded by an undergraduate with good intentions but little understanding of human nature. He thought that by creating a machine for “connecting” people he might do some good for the world while also making himself some money. He wound up creating a corporate monster that is failing spectacularly at the former but succeeding brilliantly at the latter. Facebook is undermining democracy at the same time as it is making Mark Zuckerberg richer than Croesus. And it is now clear that this monster, like Dr Frankenstein’s, is beyond its creator’s control…

Read on

Why surveillance technology is usually better than we realise

This morning’s Observer column:

The images of the moon’s surface coming down from the orbiters were of astonishingly high resolution, good enough to blow up to 40ft x 54ft pictures. When Nasa engineers initially stitched the images together they had to hang them in a church to view them. Eventually, they found a hangar where they could be laid on the ground for astronauts to walk on them in stockinged feet in order to search for suitable landing sites.

For decades, nobody outside of Nasa and the US military knew how good these images were. The few that were released for public consumption were heavily degraded and fuzzy. Why? Because the cameras used in the lunar orbiters were derivatives of the cameras used in high-altitude US aerial reconnaissance planes and satellites and the Pentagon didn’t want the Soviets to know the level of detail that could be derived from them.

In a way, we shouldn’t be surprised by this revelation. It’s an old story: powerful states have often possessed more sophisticated surveillance technology than their adversaries – or their citizens – knew or suspected…

Read on

Censorship 2.0

This morning’s Observer column:

One of the axioms of the early internet was an observation made by John Gilmore, a libertarian geek who was one of the founders of the Electronic Frontier Foundation. “The internet,” said Gilmore, “interprets censorship as damage and routes around it.” To lay people this was probably unintelligible, but it spoke eloquently to geeks, to whom it meant that the architecture of the network would make it impossible to censor it. A forbidden message would always find a route through to its destination.

Gilmore’s adage became a key part of the techno-utopian creed in the 1980s and early 1990s. It suggested that neither the state nor the corporate world would be able to censor cyberspace. The unmistakable inference was that the internet posed an existential threat to authoritarian regimes, for whom control of information is an essential requirement for holding on to power.

In the analogue world, censorship was relatively straightforward…

Read on

Kremlinology 2.0

This morning’s Observer column:

In the bad old days of the cold war, western political and journalistic institutions practised an arcane pseudoscience called Kremlinology. Its goal was to try to infer what was going on in the collective mind of the Soviet Politburo. Its method was obsessively to note everything that could be publicly observed of the activities of this secretive cabal – who was sitting next to whom at the podium; which foreign visitors were granted an audience with which high official; who was in the receiving line for a visiting head of state; what editorials in Pravda (the official Communist party newspaper) might mean; and so on.

The Soviet empire is no more, much to Putin’s chagrin, but the world now has some new superpowers. We call them tech companies. Each periodically stages a major public event at which its leaders emerge from their executive suites to convey messages to their faithful followers and to the wider world. In the past few weeks, two such events have been held by two of the biggest powers – Google and Apple. So let’s do some Kremlinology on them…

Read on

Tech-driven wealth is the new aphrodisiac

This morning’s Observer column:

It’s a quintessential Silicon Valley story. A smart, attractive 19-year-old American woman who taught herself Mandarin in high school is studying chemical engineering at Stanford, where she is a president’s scholar. Her name is Elizabeth Holmes. In her first year as an undergraduate she persuades her professor to allow her to attend the seminars he runs with his PhD students. Then one day she drops into his office to tell him that she’s dropping out of college because she has a “big idea” and wants to found a company that will revolutionise a huge part of the healthcare system – the market for blood testing services. Her company will be called Theranos.

Holmes’s big idea was for a way to perform multiple tests at once on a tiny drop of blood, and to deliver the results wirelessly to doctors. So she set about pitching to investors…

Read on

‘Social credit’ in China

This morning’s Observer column:

In the old days, western snobbery led to the complacent view that the Chinese could not originate, only copy. One hears this less now, as visitors to China return goggle-eyed at the extent to which its people have integrated digital technology into daily life. One colleague of mine recently returned exasperated because he had been expected to pay for everything there with his phone. Since he possesses only an ancient Nokia handset, he was unable to comply and was reduced to mendicant status, having to ask his Chinese hosts to pay for everything.

If the future is digital, therefore, a significant minority of China’s 1.4 billion citizens are already there. More significantly, the country’s technocratic rulers have sussed that digital technology is not just good for making economic transactions frictionless, but also for implementing sophisticated systems of social control.

Read on

Facebook and the CCTV effect

This morning’s Observer column:

Jeremy Paxman, who once served as Newsnight’s answer to the pit-bull terrier, famously outlined his philosophy of interviewing prominent politicians thus: “Why is this lying bastard lying to me?” This was unduly prescriptive: not all of Paxman’s interviewees were outright liars, merely practitioners of the art of being “economical with the truth”. But the question served as a useful heuristic for a busy interviewer.

Maybe the time has come to apply the same heuristic to Facebook’s public statements…

Read on

The Bitcoin/blockchain story: a mixture of greed and idealism

This morning’s Observer column:

Because I write about technology I am regularly assailed by people who are exercised about so-called “cryptocurrencies” like bitcoin, which most of them regard as a scam. But when I reply that while bitcoin might be newsworthy, the really important story concerns the blockchain technology that underpins it, their eyes glaze over and they start looking for the nearest exit as they conclude that they are in the grip of Coleridge’s Ancient Mariner.

And, in a sense, they are. Blockchain technology is indeed important, but it seems largely incomprehensible to ordinary mortals, even though the web teems with attempts to explain it…

Read on
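
Since the column argues that blockchain technology matters but is hard to explain to ordinary mortals, here is a minimal sketch of the core idea, much simplified and not drawn from the column itself: each block records a cryptographic hash of the block before it, so tampering with any past entry breaks every link that follows. The names and data below are purely illustrative.

```python
# Minimal illustration of a hash-chained ledger (the idea underlying a blockchain).
import hashlib
import json


def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def add_block(chain: list, data: str) -> None:
    """Append a new block that carries the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})


def verify(chain: list) -> bool:
    """Recompute every link; an edited block invalidates the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True


ledger: list = []
add_block(ledger, "Alice pays Bob 5")
add_block(ledger, "Bob pays Carol 2")
print(verify(ledger))                     # True: chain is intact
ledger[0]["data"] = "Alice pays Bob 500"  # tamper with an old entry
print(verify(ledger))                     # False: tampering is detectable
```

A real blockchain adds much more on top of this, notably a consensus mechanism (such as bitcoin’s proof-of-work) and replication of the ledger across many machines, but the tamper-evident chain of hashes is the part that does the basic work.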