Amazon’s Echo seems great, but what does it hear?

Illustration by James Melaugh/Observer

This morning’s Observer column:

I bought it [the Echo] because it seemed to me that it might be a significant product and I have a policy of never writing about kit that I haven’t paid for myself. Having lived with the Echo for a few weeks I can definitely confirm its significance. It is a big deal, which explains why the company invested so much in it. (It’s said that 1,500 people worked on the project for four years, which sounds implausible until you remember that Apple has 800 people working on the iPhone’s camera alone.) Amazon’s boss, Jeff Bezos, may not have bet the ranch on it (he has a pretty big ranch, after all) but the product nevertheless represents a significant investment. And the sales so far suggest that it may well pay off.

Once switched on and hooked up to one’s wifi network, the Echo sits there, listening for its trigger word, “Alexa”. So initially one feels like an idiot saying things such as: “Alexa, play Radio 4” or: “Alexa, who is Kim Kardashian?” (A genuine inquiry this, from a visitor who didn’t know the answer, which duly came in the form of Alexa reading the first lines of the relevant Wikipedia entry.)

Read on

Be careful what you wish for

This morning’s Observer column:

In 1996, two US congressmen, Chris Cox (Republican, California) and Ron Wyden (Democrat, Oregon), drafted a law that they felt was essential if the nascent internet was to grow and prosper. The clause they wrote eventually found its way on to the statute book as section 230 of the Communications Decency Act, part of the sprawling Telecommunications Act, which Bill Clinton signed into law in 1996.

Cox and Wyden had been troubled by the rise of libel suits against internet service providers (ISPs) for defamatory content posted on websites that they hosted. The key sentence in the clause that they eventually drafted read: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

This single sentence provided the legal underpinning for how the world wide web has evolved…

Read on

The network architecture of the alt-right

This morning’s Observer column:

While there is no single, overarching explanation for Donald Trump’s election, his ascendancy would have been unthinkable in a pre-internet age, for two reasons.

The first is that much of Trump’s campaign rhetoric would never have got past the editorial “gatekeepers” of an earlier era – the TV network owners and controllers, the editors of powerful print media and the Federal Communications Commission with its “fairness doctrine” (which required holders of broadcast licences to “present controversial issues of public importance and to do so in a manner that was, in the Commission’s view, honest, equitable and balanced”).

The second reason is that in the pre-internet era, the multitudes of Trump’s vigorous, engaged and angry supporters would have had little option but to fume impotently in whatever local arenas they inhabited. It would have been difficult, if not impossible, for them to hook up with millions of like-minded souls to crowdsource their indignation and their enthusiasm for the candidate.

So I think we can say that while the net may not have been a sufficient condition for Trump’s victory, it was definitely a necessary one…

Read on

Hacking, disinformation and democracy

My take (in Prospect magazine) on Russian interference in the US election.

The CIA has concluded that Russia intervened in this year’s presidential election to help Donald Trump win. Speaking on Fox News, the beneficiary of these alleged subterranean efforts retorted: “I think it’s ridiculous. I think it’s just another excuse. I don’t believe it.” And his transition team issued a dismissive statement. “These are the same people,” it stated, “that said Saddam Hussein had weapons of mass destruction. The election ended a long time ago in one of the biggest Electoral College victories in history. It’s now time to move on and ‘Make America Great Again.’”

Ponder this for a moment. American intelligence agencies have concluded with “high confidence” that Russia acted covertly in the latter stages of the presidential campaign to harm Hillary Clinton’s chances and promote Trump’s. They based that conclusion, in part, on finding that the Russians hacked the Republican National Committee’s computer systems as well as the Democratic National Committee’s network, but did not release whatever information they gleaned from the Republican networks.

That, of course, doesn’t prove that the Russian intervention was decisive in enabling Trump’s victory (though, in the end, the verdict of the Electoral College depended on 80,000 votes). But in a way it doesn’t matter. What matters is that a foreign adversary intervened covertly but adroitly in an American presidential election; that the outcome was the victory of a candidate who seems less belligerent towards Russia than his predecessor; and that the new president is contemptuously dismissive of the analysis of the intelligence services that he is soon to lead…

Read on

How do you throw the book at an algorithm?

This morning’s Observer column:

When, in the mid-1990s, the world wide web transformed the internet from a geek playground into a global marketplace, I once had an image of seeing two elderly gentlemen dancing delightedly in that part of heaven reserved for political philosophers. Their names: Adam Smith and Friedrich Hayek.

Why were they celebrating? Because they saw in the internet a technology that would validate their most treasured beliefs. Smith saw vigorous competition as the benevolent “invisible hand” that ensured individuals’ efforts to pursue their own interests could benefit society more than if they were actually trying to achieve that end. Hayek foresaw the potential of the internet to turn almost any set of transactions into a marketplace as a way of corroborating his belief that price signals communicated via open markets were the optimum way for individuals to co-ordinate their activities.

In the 1990s, those rosy views of the online world made sense…

Read on

What’s in a year? How about 2007?

This morning’s Observer column:

It’s interesting how particular years acquire historical significance: 1789 (the French Revolution); 1914 (outbreak of the first world war); 1917 (the Russian revolution); 1929 (the Wall Street crash); 1983 (switching on of the internet); 1993 (the Mosaic Web browser, which started the metamorphosis of the internet from geek sandpit to the nervous system of the planet). And of course 2016, the year of Brexit and Trump, the implications of which are, as yet, unknown.

But what about 2007? That was the year when Slovenia adopted the euro, Bulgaria and Romania joined the EU, Kurt Vonnegut died, smoking in enclosed public places was banned in the UK, a student shot 32 people dead and wounded 17 others at Virginia Tech, Luciano Pavarotti died and Benazir Bhutto was assassinated. Oh – and it was also the year that Steve Jobs launched the Apple iPhone.

And that, I suspect, is the main – perhaps the only – reason that 2007 will be counted as a pivotal year, because it was the moment that determined how the internet would evolve…

Read on

Donald in computerland

Leave aside for the moment the mysterious role (and behaviour) of the FBI Director in the last week of the election campaign and focus for a moment on one aspect of the supporters of the two candidates — different levels of education. A few weeks ago my colleague David Runciman wrote this in the Guardian:

The possibility that education has become a fundamental divide in democracy – with the educated on one side and the less educated on another – is an alarming prospect. It points to a deep alienation that cuts both ways. The less educated fear they are being governed by intellectual snobs who know nothing of their lives and experiences. The educated fear their fate may be decided by know-nothings who are ignorant of how the world really works. Bringing the two sides together is going to be very hard. The current election season appears to be doing the opposite.

The headline on the essay was “How the education gap is tearing politics apart.”

Yesterday the FBI Director wrote to Congress saying that his staff had reviewed the relevant emails on Anthony Weiner’s laptop and found nothing that would cause the Bureau to revise its earlier judgment that there was no justification for taking action against Hillary Clinton.

Needless to say, this was meat and drink for Trump. “You can’t review 650,000 emails in eight days,” he said yesterday in an appearance at the Freedom Hill Amphitheater in Michigan.

“You can’t do it folks. Hillary Clinton is guilty. The investigations into her crimes will go on for a long, long time. The rank-and-file special agents at the FBI won’t let her get away with her terrible crimes, including the deletion of her 33,000 emails after receiving a congressional subpoena.”

Needless to say, Trump’s supporters were delighted. Here’s a sample tweet:

[Embedded tweet]

There are only two conclusions to be drawn from this: (a) Trump knows nothing about computer technology; or (b) he does, but is reckoning that his supporter base knows nothing about it.

My hunch is (b), but it doesn’t really matter either way. The task of getting software to trawl through any number of electronic documents looking either for metadata (such as the “From:”, “To:” or “cc:” headers) or for keywords is computationally trivial. Ask Edward Snowden:

[Embedded tweet]

In fact, as one expert pointed out, the real question the FBI Director should have to answer is: what took you so long?
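To see just how trivial such filtering is, here is a minimal sketch using only Python’s standard library. The messages and addresses are hypothetical stand-ins, not anything from the actual investigation; the point is simply that filtering a mailbox by header or keyword is a few lines of code, and running it over hundreds of thousands of messages is well within eight days, or indeed eight minutes.

```python
from email.parser import Parser

# A toy corpus of raw RFC 822 messages (hypothetical stand-ins for a mailbox).
raw_messages = [
    "From: alice@example.com\nTo: bob@example.com\nSubject: lunch\n\nSee you at noon.",
    "From: hrc@example.com\nTo: staff@example.com\nSubject: schedule\n\nPlease review the attached schedule.",
    "From: carol@example.com\nTo: alice@example.com\nSubject: budget\n\nNumbers look fine.",
]

def matches(raw, sender=None, keyword=None):
    """Return True if the message passes the given header/keyword filters."""
    msg = Parser().parsestr(raw)
    if sender and sender not in (msg["From"] or ""):
        return False
    if keyword and keyword.lower() not in msg.get_payload().lower():
        return False
    return True

# Pull out every message from an address of interest.
hits = [m for m in raw_messages if matches(m, sender="hrc@example.com")]
print(len(hits))  # → 1
```

Scaled up, the same loop over 650,000 messages is a routine batch job; real forensic tools add indexing and deduplication, but nothing about the problem requires months of human reading.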

Apple’s iWatch in iDecline?

[Chart: Apple Watch shipment decline, via Statista]

From Statista:

According to IDC’s most recent data, Apple Watch shipments declined by 71.6 percent in the past quarter compared to the same period last year. It’s the second consecutive quarter of high double-digit sales declines for Apple’s smartwatch, indicating that demand is quickly fading after a decent start. As our chart illustrates, the Apple Watch’s early sales figures were as good as the iPad’s and much better than the iPhone’s were in 2007. However, it took the iPhone nine years and the iPad three years to see their first year-over-year sales decline. Apple Watch sales started going downhill after just one year on the market.

Frankly, I’m not surprised. Although I continue to wear my Apple Watch most days (especially on busy days when I’m expecting urgent emails or messages), I find it more or less useless for everything else. I’ve given up bringing it with me on overnight trips — I can’t be bothered lugging a charger and cable. And of course at home it sits in its charging dock every night. Leading-edge uselessness, methinks.