The network architecture of the alt-right

This morning’s Observer column:

While there is no single, overarching explanation for Donald Trump’s election, his ascendancy would have been unthinkable in a pre-internet age, for two reasons.

The first is that much of Trump’s campaign rhetoric would never have got past the editorial “gatekeepers” of an earlier era – the TV network owners and controllers, the editors of powerful print media and the Federal Communications Commission with its “fairness doctrine” (which required holders of broadcast licences to “present controversial issues of public importance and to do so in a manner that was, in the Commission’s view, honest, equitable and balanced”).

The second reason is that in the pre-internet era, the multitudes of Trump’s vigorous, engaged and angry supporters would have had little option but to fume impotently in whatever local arenas they inhabited. It would have been difficult, if not impossible, for them to hook up with millions of like-minded souls to crowdsource their indignation and their enthusiasm for the candidate.

So I think we can say that while the net may not have been a sufficient condition for Trump’s victory, it was definitely a necessary one…

Read on

So what brought the tech moguls to fawn on Trumplethinskin?

This morning’s Observer column:

On Wednesday, a curious spectacle could be observed in New York. A swarm of tech billionaires arrived in their private jets and were whisked to Trump Tower, the Louis XV pastiche that is the residence of Trumplethinskin, as the tech journalist Kara Swisher calls the president-elect.

The roll call of assembled tech moguls ran as follows: Satya Nadella and Brad Smith (Microsoft), Jeff Bezos (Amazon), Larry Page and Eric Schmidt (Alphabet, Google’s holding company), Sheryl Sandberg (Facebook), Tim Cook (Apple), Elon Musk (Tesla), Ginni Rometty (IBM), Safra Catz (Oracle), Chuck Robbins (Cisco), Alex Karp (Palantir) and Brian Krzanich (Intel).

Apart from their vast wealth and an aversion to paying tax, what linked these notables? Answer: a deep loathing of Trumplethinskin. Yet when he issued the summons to his preposterous “summit” they all came running. Why?

Read on

LATER

Most of those attending were tight-lipped afterwards, but Kara Swisher extracted some impressions. She quoted one of the attendees as admitting that it was a bit of a humiliation.

“We definitely gave up a little stature now for possible benefit later,” said one source, noting that it was the price of being a public company with a tweet-happy new U.S. leader. “It’s better to be quiet now and speak up later if we have to, and save our powder.”

Which provides an interesting confirmation of the point I made in the column about the perceived power of a Trump tweet.

Sad but true: ‘Digital natives’ can be, er, naive

This morning’s Observer column:

If Facebook thinks it can outsource the detection of fake news to its users (and thereby avoid accepting editorial responsibility) then Stanford University has some bad news for it. Over the past 18 months the university’s history education group has been testing the ability of 7,800 “digital natives” (ie middle school, high school and college students) in 12 states to judge the credibility of online information…

Read on

How do you throw the book at an algorithm?

This morning’s Observer column:

When, in the mid-1990s, the world wide web transformed the internet from a geek playground into a global marketplace, I had an image of two elderly gentlemen dancing delightedly in that part of heaven reserved for political philosophers. Their names: Adam Smith and Friedrich Hayek.

Why were they celebrating? Because they saw in the internet a technology that would validate their most treasured beliefs. Smith saw vigorous competition as the benevolent “invisible hand” that ensured individuals’ efforts to pursue their own interests could benefit society more than if they were actually trying to achieve that end. Hayek foresaw the potential of the internet to turn almost any set of transactions into a marketplace as a way of corroborating his belief that price signals communicated via open markets were the optimum way for individuals to co-ordinate their activities.

In the 1990s, those rosy views of the online world made sense…

Read on

Metaphors for our networked future

My longish essay on ways of thinking about the Internet — in today’s Observer:

So we find ourselves living in this paradoxical world, which is both wonderful and frightening. Social historians will say that there’s nothing new here: the world was always like this. The only difference is that we now experience it 24/7 and on a global scale. But as we thrash around looking for a way to understand it, our public discourse is depressingly Manichean: tech boosters and evangelists at one extreme; angry technophobes at the other; and most of us somewhere in between. Small wonder that Manuel Castells, the great scholar of cyberspace, once described our condition as that of “informed bewilderment”.

One way of combating this bewilderment is to look for metaphors. The idea of the net as a mirror held up to human nature is one. But recently people have been looking for others. According to IT journalist Sean Gallagher, the internet ‘looks a lot’ like New York of the late 70s: ‘There is a cacophony of hateful speech, vice of every kind… and policemen trying to keep a lid on all of it’…

Read on

What’s in a year? How about 2007?

This morning’s Observer column:

It’s interesting how particular years acquire historical significance: 1789 (the French Revolution); 1914 (outbreak of the first world war); 1917 (the Russian revolution); 1929 (the Wall Street crash); 1983 (switching on of the internet); 1993 (the Mosaic Web browser, which started the metamorphosis of the internet from geek sandpit to the nervous system of the planet). And of course 2016, the year of Brexit and Trump, the implications of which are, as yet, unknown.

But what about 2007? That was the year when Slovenia adopted the euro, Bulgaria and Romania joined the EU, Kurt Vonnegut died, smoking in enclosed public places was banned in the UK, a student shot 32 people dead and wounded 17 others at Virginia Tech, Luciano Pavarotti died and Benazir Bhutto was assassinated. Oh – and it was also the year that Steve Jobs launched the Apple iPhone.

And that, I suspect, is the main – perhaps the only – reason that 2007 will be counted as a pivotal year, because it was the moment that determined how the internet would evolve…

Read on

Zuckerberg’s problem: he makes money from fake news

This morning’s Observer column:

Zuckerberg says that he doesn’t want fake news on Facebook, but it turns out that getting rid of it is very difficult because “identifying the ‘truth’ is complicated”. Philosophers worldwide will agree with that proposition. But you don’t need a Nobel prize to check whether the pope did indeed endorse Trump, or whether Clinton really did make the alleged purchases of arms or a house in the Maldives.

Zuckerberg’s problem is that he doesn’t want to engage in that kind of fact-checking, because that would be a tacit acknowledgement that Facebook is a publisher rather than just a technology company and therefore has some editorial responsibilities. And what he omits to mention is that Facebook has a conflict of interest in these matters. It makes its vast living, remember, from monitoring and making money from the data trails of its users. The more something is “shared” on the internet, the more lucrative it is for Facebook…

Read on

‘Transparency’: like motherhood and apple pie

This morning’s Observer column:

On 25 October, the German chancellor, Angela Merkel, wandered into unfamiliar territory – at least for a major politician. Addressing a media conference in Munich, she called on major internet companies to divulge the secrets of their algorithms on the grounds that their lack of transparency endangered public discourse. Her prime target appeared to be search engines such as Google and Bing, whose algorithms determine what you see when you type a search query into them. Given that, she argued, an internet user should have a right to know the logic behind the results presented to him or her.

“I’m of the opinion,” declared the chancellor, “that algorithms must be made more transparent, so that one can inform oneself as an interested citizen about questions like, ‘What influences my behaviour on the internet and that of others?’ Algorithms, when they are not transparent, can lead to a distortion of our perception; they can shrink our expanse of information.”

All of which is unarguably true…

Read on

So the government is serious about cybersecurity? Really?

This morning’s Observer column:

On Tuesday, the chancellor, Philip Hammond, announced that the government was “investing” £1.9bn in boosting the nation’s cybersecurity. “If we want Britain to be the best place in the world to be a tech business,” he said, “then it is also crucial that Britain is a safe place to do digital business… Just as technology presents huge opportunities for our economy – so too it poses a risk. Trust in the internet and the infrastructure on which it relies is fundamental to our economic future. Because without that trust, faith in the whole digital edifice will fall away.”

Quite so; cybersecurity is clearly important. After all, in its 2015 strategic defence and security review, the government classified “cyber” as a “tier 1” threat. That’s the same level as international military conflict and terrorism. So let’s look at the numbers. The UK’s defence budget currently runs at £35.1bn, while the country’s expenditure on counterterrorism is now running at about £3bn a year. That puts Hammond’s £1.9bn (a commitment he inherited from George Osborne, by the way) into perspective. And the money is to be spent over five years, so an uncharitable reading of the chancellor’s announcement is that the government is actually investing just under £400m annually in combating this tier 1 threat.

All of which suggests that there’s a yawning chasm between Hammond’s stirring rhetoric about the cyber threat and his ability to muster the resources needed to combat it…
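
For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope calculation using only the figures quoted in the column (a rough sketch; the counterterrorism number is the approximate annual spend mentioned above):

```python
# Back-of-the-envelope check of the figures quoted in the column
# (all amounts in billions of pounds).
defence_budget = 35.1      # annual UK defence budget
counterterrorism = 3.0     # approximate annual counterterrorism spend
cyber_commitment = 1.9     # Hammond's cybersecurity announcement
years = 5                  # period over which it is to be spent

cyber_annual = cyber_commitment / years
print(f"Annual cybersecurity spend: £{cyber_annual * 1000:.0f}m")                 # ~£380m
print(f"Share of defence budget: {cyber_annual / defence_budget:.1%}")            # ~1.1%
print(f"Share of counterterrorism spend: {cyber_annual / counterterrorism:.0%}")  # ~13%
```

Whichever way you slice it, the "tier 1 threat" is getting roughly an eightieth of what conventional defence gets each year.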

Read on

Apple mania

This morning’s Observer column:

It’s that time of year again. Apple has released its results for the fiscal quarter ended 24 September 2016 and we are immediately plunged into “Has Apple peaked?” speculation. How come? Well, the company posted quarterly revenue of $46.9bn and net income of $9bn. Not bad, eh? Ah, yes, but not if you’re a Wall Street analyst, because these numbers compare to revenue of $51.5bn and net income of $11.1bn in the same quarter the year before. And – shock, horror! – the company’s gross margin was only 38% compared to 39.9% a year ago. The numbers are down, in other words.

Cue fevered speculation about the fate of the company. The numbers, burbled one analyst, show “the danger of being a one-trick pony when everyone already owns a pony. The company’s reliance on the smartphone, which is now a mature and saturated market in the developed world, is starting to create a growth problem for Apple. Breaking through will be a challenge, reminding investors Apple’s fundamentals and stock price have peaked.”

Pause for a reality check: Apple has cash reserves of $237.6bn, up $32bn from last year. And valued at $622bn (as of 26 October 2016), it is the most valuable company in the world…
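
Expressed as year-on-year changes, the numbers that caused the fuss look like this (a quick sketch using only the figures quoted above):

```python
# Year-on-year comparison of the headline figures quoted above
# (fiscal Q4 2016 against the same quarter a year earlier, in $bn).
revenue = (46.9, 51.5)       # (this year, last year)
net_income = (9.0, 11.1)
gross_margin = (38.0, 39.9)  # per cent

def yoy(pair):
    """Relative change from last year to this year."""
    this_year, last_year = pair
    return (this_year - last_year) / last_year

print(f"Revenue:      {yoy(revenue):+.1%}")       # about -8.9%
print(f"Net income:   {yoy(net_income):+.1%}")    # about -18.9%
print(f"Gross margin: {gross_margin[0] - gross_margin[1]:+.1f} points")  # -1.9 points
```

Disappointing by Wall Street's standards, perhaps, but hardly the stuff of corporate collapse.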

Read on