The (non)sharing economy

This morning’s Observer column:

A euphemism is a polite way of obscuring an uncomfortable or unpleasant truth. So pornography becomes adult entertainment, genocide becomes ethnic cleansing, sacking someone becomes letting him (or her) go. People pass away rather than die; toilets become rest rooms; CCTV cameras monitor public and private spaces for our comfort and safety; and shell shock evolved into battle fatigue before finally winding up as post-traumatic stress disorder – which is really a way of disguising the awkward fact that killing people in cold blood can do very bad things to your psyche.

The tech industry is also addicted to euphemism. Thus the ubiquitous, unfair, grotesquely unbalanced contract which gives an internet corporation all the rights and the user almost none is called an end-user licence agreement. Computers that have been illegally hacked are “pwned”. The wholesale hoovering-up of personal data by internet companies (and the state) is never called by its real name, which is surveillance. And so on.

But the word that is most subverted by the tech industry is share…

Read on

History’s contradictions

Lovely Observer piece by Will Hutton:

Sometimes dealing with the past is easy. A few months ago, the college where I am principal (Hertford, Oxford) handed back a precious 16th-century atlas to its rightful owners – the Humboldt University library in Berlin. A British soldier had been offered it in exchange for a packet of cigarettes in the devastated streets of Berlin in May 1945. His father was an Oxford professor and for most of the last 70 years the Ortelius atlas had been first buried in his room and then locked in the college safe.

The 70th anniversary of the end of the war seemed as good a moment as any to return it. But what struck everyone at the small ceremony was how affected the German delegation, including representatives from the embassy and Humboldt University, were by what we were doing. It was a symbol of Germany’s relationship with Britain within a peaceful EU, an act of friendship all the more valuable because it had been freely offered and a recognition that history had moved on.

But more often than not history’s legacies are more unforgiving – a minefield in which yesterday’s and today’s realities seem irreconcilable…

Hunting the Qubit

Today’s Observer column:

We live in a universe that virtually none of us will ever understand. Physicists, however, refuse to be daunted by this and have been casting about for ways of putting these quantum properties to practical use. In the process, their gaze alighted on that fundamental building block of digital technology, the humble binary digit (or “bit”) in which all digital information is encoded. In the Newtonian (ie non-quantum) world, a bit can take only one of two values – one or zero. But at the quantum level, superposition means that a quantum bit – a qubit – could have multiple values (one, zero and a superposition of one and zero) at the same time. Which means that a computer based on quantum principles would be much, much faster than a conventional, silicon-based one. Various outfits have been trying to build one.

The results are controversial but intriguing…
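The superposition idea in the excerpt can be made concrete with a few lines of code. This is only an illustrative sketch using NumPy — a qubit represented as a two-component complex vector, with the standard Hadamard gate producing an equal superposition of one and zero; nothing here comes from the column itself:

```python
import numpy as np

# A qubit's state is a 2-component complex vector: amplitudes for |0> and |1>.
zero = np.array([1, 0], dtype=complex)   # the classical bit "0"

# The Hadamard gate turns a definite state into an equal superposition.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ zero              # amplitude 1/sqrt(2) for each of |0> and |1>
probs = np.abs(qubit) ** 2    # measurement probabilities (the Born rule)

print(probs)                  # [0.5 0.5] -- equal chance of reading 0 or 1
```

Measuring the qubit collapses it to a plain 0 or 1; the power of quantum computing comes from manipulating many such superposed amplitudes before any measurement is made.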

Read on

Academic bitchiness

Isaiah Berlin was a past master of the genre. Here he is writing¹ to Marion and Felix Frankfurter in December 1934 about Richard Crossman, who was then a Fellow of New College, Oxford (where Berlin had been briefly a Fellow), and whom I think Berlin detested.

“Crossman is trying to sell his soul again & finding no buyers even among those who think he had one.”

I worked for Crossman briefly, when he was Editor of the New Statesman. He was an interesting man, but not a nice one.


  1. From Flourishing: Letters 1928-1946, edited by Henry Hardy, Chatto & Windus, 2004.

Living with the surveillance state

Point made well by Bill Keller in a thoughtful column:

The danger, it seems to me, is not surveillance per se. We have already decided, most of us, that life on the grid entails a certain amount of intrusion. Nor is the danger secrecy, which, as Posner notes, “is ubiquitous in a range of uncontroversial settings,” a promise the government makes to protect “taxpayers, inventors, whistle-blowers, informers, hospital patients, foreign diplomats, entrepreneurs, contractors, data suppliers and many others.”

The danger is the absence of rigorous, independent regulation and vigilant oversight to keep potential abuses of power from becoming a real menace to our freedom. The founders created a system of checks and balances, but the safeguards have not kept up with technology. Instead, we have an executive branch in a leak-hunting frenzy, a Congress that treats oversight as a form of partisan combat, a political climate that has made “regulation” an expletive and a public that feels a generalized, impotent uneasiness. I don’t think we’re on a slippery slope to a police state, but I think if we are too complacent about our civil liberties we could wake up one day and find them gone — not in a flash of nuclear terror but in a gradual, incremental surrender.

We’re the suckers in the Syrian poker game

From Tom Friedman’s NYT column:

What Obama also has right is that old saying: “If you’re in a poker game and you don’t know who the sucker is, it’s probably you.” That’s the game we’re in in Iraq and Syria. All our allies for a coalition to take down ISIS want what we want, but as their second choice.

Kurds are not going to die to liberate Mosul from ISIS in order to hand it over to a Shiite-led government in Baghdad; they’ll want to keep it. The Turks primarily want to block the Kurds. The Iranians want ISIS crushed, but worry that if moderate Sunnis take over its territory they could one day threaten Iran’s allies in Iraq and Syria. The Saudi government would like ISIS to disappear, but its priority right now is crushing Iranian-backed rebels in Yemen. And with 1,000 Saudi youth having joined ISIS as fighters — and with Saudi Arabia leading the world in pro-ISIS tweets, according to a recent Brookings study — the Saudi government is wary about leading the anti-ISIS fight. The Russians pretend to fight ISIS, but they are really in Syria to protect Bashar al-Assad and defeat his moderate foes.

The triumph of hope over adversity?


The New York Times has put an editorial on its front page for the first time since 1920. It’s about gun control in the wake of the Californian terrorist massacre. It will, of course, have no effect: the US is beyond rationality in this area — as Nick Kristof observes in a remarkable column on the inside pages:

LESBOS, Greece — For three weeks American politicians have been fulminating about the peril posed by Syrian refugees, even though in the last dozen years no refugee in America has killed a single person in a terror attack.

In the same three weeks as this hysteria about refugees, guns have claimed 2,000 lives in America. The terror attacks in San Bernardino, Calif., and at the Planned Parenthood clinic in Colorado Springs were the most dramatic, but there’s an unrelenting average of 92 gun deaths every day in America, including suicides, murders and accidents.

So if politicians want to tackle a threat, how about developing a serious policy to reduce gun deaths — yes, including counterterrorism measures, but not simply making scapegoats of the world’s most vulnerable people.

The caricatures of Syrian refugees as jihadis who “want to kill us,” as one reader named Josh tweeted me, are unrecognizable to anyone who spends time with these refugees…

Note the numbers in the Kristof piece: an average of 92 gun deaths a day in the US.

Algorithmic power and programmers’ ethics

We know that power corrupts. But what about algorithmic power? This morning’s Observer column:

There is a direction of travel here – one that is taking us towards what an American legal scholar, Frank Pasquale, has christened the “black box society”. You might think that the subtitle – “the secret algorithms that control money and information” – says it all, except that it’s not just about money and information but increasingly about most aspects of contemporary life, at least in industrialised countries. For example, we know that Facebook algorithms can influence the moods and the voting behaviour of the service’s users. And we also know that Google’s search algorithms can effectively render people invisible. In some US cities, algorithms determine whether you are likely to be stopped and searched in the street. For the most part, it’s an algorithm that decides whether a bank will seriously consider your application for a mortgage or a loan. And the chances are that it’s a machine-learning or network-analysis algorithm that flags internet or smartphone users as being worthy of further examination. Uber drivers may think that they are working for themselves, but in reality they are managed by an algorithm. And so on.

Without us noticing it, therefore, a new kind of power – algorithmic power – has arrived in our societies. And for most citizens, these algorithms are black boxes – their inner logic is opaque to us. But they have values and priorities embedded in them, and those values are likewise opaque to us: we cannot interrogate them.
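The point about embedded values can be made concrete with a toy example. Everything below is invented for illustration — no real lender works this way — but it shows how a designer's priorities (here, an arbitrary weighting of postcode "risk") become invisible once the code runs as a black box:

```python
def loan_decision(income, postcode_risk, years_employed):
    """Toy black-box loan scorer. The weights and cut-off are value
    judgements made by the programmer -- e.g. penalising certain
    postcodes heavily -- which the applicant never gets to see."""
    score = (0.5 * min(income / 50_000, 1.0)
             + 0.3 * min(years_employed / 10, 1.0)
             - 0.4 * postcode_risk)   # designer's choice: postcode matters a lot
    return "refer to human" if score >= 0.45 else "auto-decline"

# The same applicant, differing only in postcode, gets opposite outcomes:
print(loan_decision(income=40_000, postcode_risk=0.1, years_employed=5))  # refer to human
print(loan_decision(income=40_000, postcode_risk=0.8, years_employed=5))  # auto-decline
```

Nothing in the output reveals that the postcode weighting exists, let alone why it was chosen — which is precisely the opacity the column is describing.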

This poses two questions. First of all, who has legal responsibility for the decisions made by algorithms? The company that runs the services that are enabled by them? Maybe – depending on how smart their lawyers are.

But what about the programmers who wrote the code? Don’t they also have some responsibilities? Pasquale reports that some micro-targeting algorithms (the programs that decide what is shown on your browser screen, such as advertising) sort web users into categories that include “probably bipolar”, “daughter killed in car crash”, “rape victim”, and “gullible elderly”. A programmer wrote that code. Did he (for it was almost certainly a male) not have some ethical qualms about his handiwork?

Read on