Hunting the Qubit

Today’s Observer column:

We live in a universe that virtually none of us will ever understand. Physicists, however, refuse to be daunted by this and have been casting about for ways of putting these quantum properties to practical use. In the process, their gaze alighted on that fundamental building block of digital technology, the humble binary digit (or “bit”) in which all digital information is encoded. In the Newtonian (ie non-quantum) world, a bit can take only one of two values – one or zero. But at the quantum level, superposition means that a quantum bit – a qubit – can be in a combination of one and zero at the same time. Which means that a computer based on quantum principles could, for certain kinds of problem, be much, much faster than a conventional, silicon-based one. Various outfits have been trying to build one.
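The superposition idea can be made concrete with a toy state-vector simulation – a sketch in plain Python, nothing like real quantum hardware: a qubit is just a pair of complex amplitudes, and the Hadamard gate turns a definite zero into an equal mix of zero and one.

```python
import math
import random

# A qubit's state is a pair of complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1. A measurement yields 0 with probability |a|^2.

def hadamard(state):
    """Apply the Hadamard gate, which puts |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng=random.random):
    """Collapse the state: return 0 or 1 with the Born-rule probabilities."""
    a, _ = state
    return 0 if rng() < abs(a) ** 2 else 1

zero = (1 + 0j, 0 + 0j)        # the classical bit "0"
superposed = hadamard(zero)    # now equal parts |0> and |1>

p0 = abs(superposed[0]) ** 2   # probability of reading 0: one half
p1 = abs(superposed[1]) ** 2   # probability of reading 1: one half
```

Until it is measured, the superposed qubit genuinely carries both amplitudes at once; measurement forces it to one value, which is why quantum algorithms need cleverer tricks than simply “trying everything in parallel”.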

The results are controversial but intriguing…

Read on

Academic bitchiness

Isaiah Berlin was a past master of the genre. Here he is writing[1] to Marion and Felix Frankfurter in December 1934 about Richard Crossman, who was then a Fellow of New College, Oxford (where Berlin had been briefly a Fellow), and whom I think Berlin detested.

“Crossman is trying to sell his soul again & finding no buyers even among those who think he had one.”

I worked for Crossman briefly, when he was Editor of the New Statesman. He was an interesting man, but not a nice one.


  1. From Flourishing: Letters 1928–1946, edited by Henry Hardy, Chatto & Windus, 2004.

Living with the surveillance state

Point made well by Bill Keller in a thoughtful column:

The danger, it seems to me, is not surveillance per se. We have already decided, most of us, that life on the grid entails a certain amount of intrusion. Nor is the danger secrecy, which, as Posner notes, “is ubiquitous in a range of uncontroversial settings,” a promise the government makes to protect “taxpayers, inventors, whistle-blowers, informers, hospital patients, foreign diplomats, entrepreneurs, contractors, data suppliers and many others.”

The danger is the absence of rigorous, independent regulation and vigilant oversight to keep potential abuses of power from becoming a real menace to our freedom. The founders created a system of checks and balances, but the safeguards have not kept up with technology. Instead, we have an executive branch in a leak-hunting frenzy, a Congress that treats oversight as a form of partisan combat, a political climate that has made “regulation” an expletive and a public that feels a generalized, impotent uneasiness. I don’t think we’re on a slippery slope to a police state, but I think if we are too complacent about our civil liberties we could wake up one day and find them gone — not in a flash of nuclear terror but in a gradual, incremental surrender.

We’re the suckers in the Syrian poker game

From Tom Friedman’s NYT column:

What Obama also has right is that old saying: “If you’re in a poker game and you don’t know who the sucker is, it’s probably you.” That’s the game we’re in in Iraq and Syria. All our allies for a coalition to take down ISIS want what we want, but as their second choice.

Kurds are not going to die to liberate Mosul from ISIS in order to hand it over to a Shiite-led government in Baghdad; they’ll want to keep it. The Turks primarily want to block the Kurds. The Iranians want ISIS crushed, but worry that if moderate Sunnis take over its territory they could one day threaten Iran’s allies in Iraq and Syria. The Saudi government would like ISIS to disappear, but its priority right now is crushing Iranian-backed rebels in Yemen. And with 1,000 Saudi youth having joined ISIS as fighters — and with Saudi Arabia leading the world in pro-ISIS tweets, according to a recent Brookings study — the Saudi government is wary about leading the anti-ISIS fight. The Russians pretend to fight ISIS, but they are really in Syria to protect Bashar al-Assad and defeat his moderate foes.

The triumph of hope over adversity?


The New York Times has put an editorial on its front page for the first time since 1920. It’s about gun control in the wake of the Californian terrorist massacre. It will, of course, have no effect: the US is beyond rationality in this area — as Nick Kristof observes in a remarkable column on the inside pages:

LESBOS, Greece — For three weeks American politicians have been fulminating about the peril posed by Syrian refugees, even though in the last dozen years no refugee in America has killed a single person in a terror attack.

In the same three weeks as this hysteria about refugees, guns have claimed 2,000 lives in America. The terror attacks in San Bernardino, Calif., and at the Planned Parenthood clinic in Colorado Springs were the most dramatic, but there’s an unrelenting average of 92 gun deaths every day in America, including suicides, murders and accidents.

So if politicians want to tackle a threat, how about developing a serious policy to reduce gun deaths — yes, including counterterrorism measures, but not simply making scapegoats of the world’s most vulnerable people.

The caricatures of Syrian refugees as jihadis who “want to kill us,” as one reader named Josh tweeted me, are unrecognizable to anyone who spends time with these refugees…

Note the numbers in the Kristof piece: an average of 92 gun deaths a day in the US.

Algorithmic power and programmers’ ethics

We know that power corrupts. But what about algorithmic power? This morning’s Observer column:

There is a direction of travel here – one that is taking us towards what an American legal scholar, Frank Pasquale, has christened the “black box society”. You might think that the subtitle – “the secret algorithms that control money and information” – says it all, except that it’s not just about money and information but increasingly about most aspects of contemporary life, at least in industrialised countries. For example, we know that Facebook algorithms can influence the moods and the voting behaviour of the service’s users. And we also know that Google’s search algorithms can effectively render people invisible. In some US cities, algorithms determine whether you are likely to be stopped and searched in the street. For the most part, it’s an algorithm that decides whether a bank will seriously consider your application for a mortgage or a loan. And the chances are that it’s a machine-learning or network-analysis algorithm that flags internet or smartphone users as being worthy of further examination. Uber drivers may think that they are working for themselves, but in reality they are managed by an algorithm. And so on.

Without us noticing it, therefore, a new kind of power – algorithmic power – has arrived in our societies. And for most citizens, these algorithms are black boxes – their inner logic is opaque to us. But they have values and priorities embedded in them, and those values are likewise opaque to us: we cannot interrogate them.

This poses two questions. First of all, who has legal responsibility for the decisions made by algorithms? The company that runs the services that are enabled by them? Maybe – depending on how smart their lawyers are.

But what about the programmers who wrote the code? Don’t they also have some responsibilities? Pasquale reports that some micro-targeting algorithms (the programs that decide what is shown on your browser screen, such as advertising) sort web users into categories that include “probably bipolar”, “daughter killed in car crash”, “rape victim” and “gullible elderly”. A programmer wrote that code. Did he (for it was almost certainly a male) not have some ethical qualms about his handiwork?

Read on

‘Smart’ homes? Not yet

My Observer comment piece on what the Internet of Things looks like when it’s at home:

There is a technological juggernaut heading our way. It’s called the Internet of Things (IoT). For the tech industry, it’s the Next Big Thing, alongside big data, though in fact that pair are often just two sides of the same coin. The basic idea is that since computing devices are getting smaller and cheaper, and wireless network technology is becoming ubiquitous, it will soon be feasible to have trillions of tiny, networked computers embedded in everything. They will be able to sense changes, turn things on and off, and make decisions about whether to open a door, close a valve or order fresh supplies of milk – you name it – all the while communicating with one another and shipping data to server farms all over the place.

As ever with digital technology, there’s an underlying rationality to lots of this. The IoT could make our lives easier and our societies more efficient. If parking bays could signal to nearby cars that they are empty, then the nightmarish task of finding a parking place in crowded cities would be eased. If every river in the UK could tweet its level every few minutes, then we could have advance warning of downstream floods in time to alert those living in their paths. And so on.
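The river example hints at how simple the logic on each networked sensor could be. A minimal sketch – the thresholds and message wording here are invented for illustration, not any real flood-warning system’s rules:

```python
def flood_alert(level_cm, warn_at_cm=250, danger_at_cm=400):
    """Classify a river-level reading into a status message.

    The thresholds and wording are hypothetical; a real system
    would calibrate them for each river and channel.
    """
    if level_cm >= danger_at_cm:
        return "DANGER: flooding likely downstream"
    if level_cm >= warn_at_cm:
        return "WARNING: river level rising"
    return "OK"

# A networked gauge would take a reading every few minutes and
# publish the status to anyone listening downstream.
readings = [120, 260, 410]
statuses = [flood_alert(r) for r in readings]
```

Each device on its own is trivial; the value – and the infrastructure cost – lies in networking millions of them and acting on what they report.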

But that kind of networking infrastructure takes time to build, so the IoT boys (and they are mostly boys, still) have set their sights closer to home, which is why we are beginning to hear a lot about “smart” homes. On further examination, this turns out mostly to mean houses stuffed with networked kit…

Read on

Tyrannised by email? Here’s how to fight back

Lovely advice from Hannah Jane Parkinson:

Here’s what I suggest. Taking a cue from my boss, I’m going to be turning on my out-of-office reply when I actually leave the office in the evening. On time. For homeworkers, or flexiworkers, that means when your shift is over. Because an automated out-of-office email that reads:

“Hi. Thanks for your email. I’ve finished work for the day and I have left the office. I’m now bathing my son and about to watch that new drama – the one with Ben Whishaw – and have a couple of glasses of pinot, but if anyone asks I’ll say it’s one. Might even order a takeaway. I’ll be able to answer your email in the morning, when I’m being paid to, at around 9am. Have a lovely evening too.”