Amazon’s Cloud Nine

This morning’s Observer column:

In 1999, Andy Grove, then the CEO of Intel, was widely ridiculed for declaring that “in five years’ time there won’t be any internet companies. All companies will be internet companies or they will be dead.” What he meant was that anybody who aspired to be in business in 2004 would have to deal with the internet in one way or another, just as they relied on electricity. And he was right; that’s what a GPT – a general-purpose technology – is like: it’s pervasive.

But digital technology differs in four significant ways from earlier GPTs. First of all, it is characterised by zero – or near-zero – marginal costs: once you’ve made the investment needed to create a digital good, it costs next to nothing to roll out and distribute a million (or indeed a billion) copies. Second, digital technology can exploit network effects at much greater speeds than the GPTs of the past. Third, almost everything that goes on in digital networks is governed by so-called power law distributions, in which a small number of actors (sites, companies, publishers…) get most of the action, while everyone else languishes in a “long tail”. Finally, digital technology sometimes gives rise to technological “lock-in”, where the proprietary standards of one company become the de facto standards for an entire industry. Thus, Microsoft once had that kind of lock-in on the desktop computer market: if you wanted to be in business you could have any kind of computer you wanted – so long as it ran Windows…

Read on
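
The arithmetic of power laws is worth pausing on, because it is brutal. Here is a minimal sketch in Python – with an invented exponent and invented site counts, not real traffic measurements – of how fast a Zipf-style distribution concentrates attention at the top:

```python
import numpy as np

# Synthetic illustration of a power law: traffic shares generated from a
# Zipf-like rule (visits proportional to 1 / rank^alpha). All numbers are
# invented for illustration, not measurements of any real site or company.
n_sites = 100_000
alpha = 1.1  # a hypothetical exponent, chosen just above 1

ranks = np.arange(1, n_sites + 1)
traffic = 1.0 / ranks**alpha
traffic /= traffic.sum()  # normalise to shares of total visits

top_1_percent = n_sites // 100
print(f"Top 1% of sites take {traffic[:top_1_percent].sum():.0%} of visits")
print(f"Bottom 50% of sites take {traffic[n_sites // 2:].sum():.0%}")
```

With these made-up parameters, the top 1% of sites end up with roughly three-quarters of all the visits, while the bottom half shares a few per cent between them – which is what “languishing in a long tail” means in practice.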

LATER Just came across this — which makes the same point about Amazon’s AWS, only more forcefully.

Evidence-based anti-terrorism. Now that *would* be a good idea…

… as Gandhi said of Western civilisation. But it’s not what we’ve got. In fact, the only paper I can find that attempts some kind of cost-benefit analysis of the ‘war on terror’ comes from the Cato Institute. Since the 9/11 attacks, US taxpayers have blown about $1.6 trillion on this so-called global ‘war’ — and that doesn’t include money for the Department of Homeland Security. Timothy Egan has a good OpEd piece in the NYT about this. Sample:

Most of us are going to live to the actuarial average of 78, and never experience terrorism as anything other than the energy drink that keeps Wolf Blitzer going in the absence of real news. (This week, he was breathless over an apparent hoax, while “Breaking News: Airline threats not credible” flashed on the screen, contradicting his reason for doing the story.)

Consider the various threats to life. The sun, for starters. The incidence of melanoma, the most lethal form of skin cancer, has doubled in the last 30 years. More than 9,000 Americans now die every year from this common cancer. I also lost a friend — 30 years old, father of two — to malignant melanoma.

Cancer is the second leading cause of death, just behind heart disease. Together, they kill more than a million people in this country, followed by respiratory diseases, accidents and strokes. Then comes Alzheimer’s, which kills 84,000 Americans a year. And yet, total federal research money on Alzheimer’s through the National Institutes of Health was $562 million last year.

To put that in perspective, we spent almost 20 times that amount — somewhere around $10 billion — on the National Security Agency, the electronic snoops who monitor everyday phone records. For the rough equivalent of funding a breakthrough in Alzheimer’s, the government has not prevented a single terrorist attack, according to a 2014 report on the telephone-gathering colossus at the N.S.A.

What happens when algorithms decide what should be passed on?

One of the things we’re interested in on our research project is how rumours, news, information (and misinformation) can spread with astonishing speed across the world as a result of the Internet. Up to now I had been working mostly on the assumption that the fundamental mechanism involved is always something like the ‘retweet’ in Twitter — i.e. people coming across something that they want to pass on to others for whatever reason. So human agency was the key factor in the viral retransmission of memes.

But I’ve just seen an interesting article in the Boston Globe which suggests that we need to think of the ‘retweeting’ effect in wider terms.

A surprise awaited Facebook users who recently clicked on a link to read a story about Michelle Obama’s encounter with a 10-year-old girl whose father was jobless.

Facebook responded to the click by offering what it called “related articles.” These included one that alleged a Secret Service officer had found the president and his wife having “S*X in Oval Office,” and another that said “Barack has lost all control of Michelle” and was considering divorce.

A Facebook spokeswoman did not try to defend the content, much of which was clearly false, but instead said there was a simple explanation for why such stories are pushed on readers. In a word: algorithms.

The stories, in other words, apparently are selected by Facebook based on mathematical calculations that rely on word association and the popularity of an article. No effort is made to vet or verify the content.

This prompted a comment from my former Observer colleague, Emily Bell, who now runs the Tow Center at Columbia. “They have really screwed up,” she told the Globe. “If you are spreading false information, you have a serious problem on your hands. They shouldn’t be recommending stories until they have got it figured out.”

She’s right, of course. A world in which algorithms decided what was ‘newsworthy’ would be a very strange place. But we might find ourselves living in such a world, because Facebook won’t take responsibility for its algorithms, any more than Google will take responsibility for YouTube videos. These companies want the status of common carriers because otherwise they would have to assume legal responsibility for the messages they circulate. And having to check everything that goes through their servers is simply not feasible.
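
Facebook hasn’t published the algorithm in question, so any reconstruction is guesswork. But a toy version of what the Globe describes – rank candidate stories by word association with the article just read, weighted by popularity, with no step that checks accuracy – might look something like this (all titles and click counts invented):

```python
# A hypothetical sketch of "related articles" selection as the Globe
# describes it: ranking rests on word overlap and popularity alone.
# Nothing here vets or verifies content -- which is exactly the problem.

def related_articles(read_article, candidates, top_n=3):
    read_words = set(read_article["title"].lower().split())

    def score(candidate):
        overlap = len(read_words & set(candidate["title"].lower().split()))
        return overlap * candidate["clicks"]  # popularity amplifies association

    return sorted(candidates, key=score, reverse=True)[:top_n]

read = {"title": "Michelle Obama meets girl whose father is jobless"}
pool = [
    {"title": "Michelle Obama considering divorce insiders claim", "clicks": 90_000},
    {"title": "Michelle Obama launches youth jobs initiative", "clicks": 4_000},
    {"title": "Gardening tips for a dry autumn", "clicks": 120_000},
]
for article in related_articles(read, pool):
    print(article["title"])
```

Run it and the false-but-heavily-clicked divorce story comes out on top, because no signal for truthfulness ever enters the score.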

Beyond gadgetry lies the real technology

This morning’s Observer column.

Cloud computing is a good illustration of why much media commentary about – and public perceptions of – information technology tends to miss the point. By focusing on tangible things – smartphones, tablets, Google Glass, embedded sensors, wearable devices, social networking services, and so on – it portrays technology as gadgetry, much as earlier generations misrepresented (and misunderstood) the significance of solid-state electronics by calling portable radios “transistors”.

What matters, in other words, is not the gadget but the underlying technology that makes it possible. Cloud computing is what turns the tablet and the smartphone into viable devices.
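
To make that concrete, here is a purely illustrative sketch – the API endpoint is hypothetical – of what most tablet and smartphone apps now amountt to under the hood: a thin window onto state held on someone else’s servers:

```python
import json
import urllib.request

# A minimal sketch of the thin-client pattern: the "gadget" holds almost
# no state of its own; notes, mail and photos live in the cloud and are
# fetched on demand. The endpoint below is hypothetical.
API_URL = "https://cloud.example.com/api/notes"

def fetch_notes(user_token):
    """Ask the cloud for the user's notes; the device merely renders them."""
    request = urllib.request.Request(
        API_URL, headers={"Authorization": f"Bearer {user_token}"}
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

# Lose the tablet, buy another, log in again: the data is still there,
# because the device was only ever a window onto the cloud.
```

Without the server side of that exchange, the gadget is just an expensive slab of glass.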

The Snowden effect (contd.)

The Snowden effect continues. And it affects not just companies getting nervous of the US cloud, but also, apparently, American internet users. Which in due course will affect US advertisers.

In the days after one of the most damning intelligence leaks since the birth of the Internet, polls were showing that average Americans felt sort of “meh” about the whole NSA-monitoring-our-calls-Skype-emails thing. But according to a new analysis from Annalect, a digital data and analytics firm, two months of ongoing discussion about online privacy have actually had major impacts on consumer behavior. Online consumers, riled by political sentiments or not, are changing their privacy and tracking settings–and if the trend continues, the advertising industry could be dinged in a significant way.

On June 10, nearly four days after journalist Glenn Greenwald published the Snowden scoop in the Guardian, a Washington Post-Pew Research Center Poll found that 56% of Americans felt that NSA monitoring was a-okay. In fact, government monitoring could go even further, 45% said, if it prevented terrorist attacks. Seven weeks later, the Annalect study, which began as a longitudinal investigation into consumer awareness of online privacy in early 2013 (before the Snowden kerfuffle), shows that collective sentiment may have shifted–consumer concern about online privacy actually jumped from 48% to 57% between June and July.

“This jump is largely from unconcerned Internet users becoming concerned–not from the normal vacillation found among neutral Internet users,” researchers wrote.

Unintended consequences of NSA surveillance (contd.)

This from a law professor.

You can consider the National Security Agency’s data-gathering programs a grim necessity to protect the nation or an outrageous violation of privacy. What is unquestionable is that they are reshaping the tech marketplace.

Yet it should have been obvious that so extensive a system of surveillance, no matter how benignly intended, would have unintended consequences. Some of the ill consequences are even predictable.

Consider cloud computing. Worldwide spending on the cloud is expected to double over the next three years to more than $200 billion. U.S. firms have been leaders in developing the technology. According to a new report from the Information Technology & Innovation Foundation, however, global worries about NSA surveillance are likely to reduce U.S. market share.

The report’s admittedly loose estimate is that U.S. cloud-computing firms will lose $21 billion to $35 billion in revenue between now and 2016. According to the report, some 10 percent of non-U.S. members of the Cloud Security Alliance said they’ve canceled a project with a U.S. company since the disclosure of the NSA’s surveillance. In addition, 56 percent indicated “that they would be less likely to use a U.S.-based cloud computing service.”

These are scary numbers for one of the few true growth areas in the tech sector. But they are precisely what should have been expected in the wake of the disclosures. “If I were an American cloud provider, I would be quite frustrated with my government right now,” Neelie Kroes, the European Union’s commissioner for digital affairs, said in the ITIF report.

Edward Snowden’s not the story. The fate of the internet is

This morning’s Observer column.

Repeat after me: Edward Snowden is not the story. The story is what he has revealed about the hidden wiring of our networked world. This insight seems to have escaped most of the world’s mainstream media, for reasons that escape me but would not have surprised Evelyn Waugh, whose contempt for journalists was one of his few endearing characteristics. The obvious explanations are: incorrigible ignorance; the imperative to personalise stories; or gullibility in swallowing US government spin, which brands Snowden as a spy rather than a whistleblower.

In a way, it doesn’t matter why the media lost the scent. What matters is that they did. So as a public service, let us summarise what Snowden has achieved thus far…

It’s the metadata, stoopid

This morning’s Observer column.

“To be remembered after we are dead,” wrote Hazlitt, “is but poor recompense for being treated with contempt while we are living.” Cue President “George W” Obama in the matter of telephone surveillance by his National Security Agency. The fact that for the past seven years the agency has been collecting details of every telephone call placed in the United States without a warrant was, he intoned, no reason for Americans to be alarmed. “Nobody is listening to your telephone calls,” he cooed. The torch was then passed to Dianne Feinstein, chair of the Senate intelligence committee, who was likewise on bromide-dispensing duty. “This is just metadata,” she burbled, “there is no content involved.”

At which point the thought uppermost in one’s mind is: what kind of idiots do they take us for? Of course there’s no content involved, for the simple reason that content is a pain in the butt from the point of view of modern surveillance. First, you have to listen to the damned recordings, and that requires people (because even today, computers are not great at understanding everyday conversation) and time. And although Senator Feinstein let slip that the FBI already employs 10,000 people “doing intelligence on counter-terrorism”, even that Stasi-scale mob isn’t a match for the torrent of voice recordings that Verizon and co could cough up daily for the spooks…
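
It is easy to demonstrate why metadata, far from being harmless, is the analyst’s friend. Here is a toy sketch – the call records are invented, and no agency’s actual tooling is implied – showing how a social graph falls out of ‘just metadata’ in a few lines of code:

```python
from collections import Counter

# Invented call records: no content, "just metadata" -- caller, callee,
# timestamp, duration in seconds. A toy sketch, not real tooling.
calls = [
    ("alice", "bob",    "2013-06-01 09:02", 340),
    ("alice", "bob",    "2013-06-01 23:15",  40),
    ("alice", "clinic", "2013-06-02 08:30", 610),
    ("bob",   "dave",   "2013-06-02 09:00",  25),
    ("alice", "bob",    "2013-06-03 23:40",  55),
]

# Who talks to whom, and how often: a social graph falls out for free.
pair_counts = Counter((caller, callee) for caller, callee, _, _ in calls)
for (caller, callee), n in pair_counts.most_common():
    print(f"{caller} -> {callee}: {n} calls")
```

No recordings have been listened to, yet the pattern already reveals relationships, routines and even sensitive contacts – the late-night calls, the call to the clinic. Which is why “there is no content involved” is no comfort at all.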

While we’re on the George W. Obama theme, John Perry Barlow claimed this morning that the Obama administration has now prosecuted seven officials under the Espionage Act, and went on to point out that the total for all Obama’s predecessors since 1917 is three.

The PC: the new sunset industry

IDC says PC sales fell 14 percent in the first quarter on a year-over-year basis. That’s worse than its forecast of a 7.7 percent drop.

This is the worst quarter for the PC industry since 1994, when IDC started tracking sales. So that pretty much makes it the worst quarter in history.

IDC blames Microsoft’s Windows 8 operating system for alienating consumers: the new tile-based interface, it says, is simply too weird for them.

Instead of buying new laptops or desktops, people are buying tablets and smartphones which serve as good-enough alternatives.

Source

Technology and the English language

Him: “I think I’ll do a blog about that”.
Me: “Do you really think it merits creating a new blog?”
Him: “What do you mean?”
Me: “Well, you said you were going to do a blog about it”.
Him: “So?”
Me: “Well, a blog is a unique web site. Did you mean a blog post?”
Him: “There’s a difference?”

This conversation, between me and an ostensibly well-informed acquaintance (who made a lot of money from technology, by the way), took place the other day. In its way, it’s emblematic of what always happens to technical terms when they run into everyday language. Once upon a time, some guys at Bell Labs invented a magical solid-state device called the ‘transistor’. It enabled us to make portable devices called ‘transistor radios’. But in no time at all, they became known as ‘transistors’. Then someone invented ‘videotape’, which enabled us to record ‘video tapes’. But in due course they became merely ‘videos’. Similarly, ‘text messages’ became ‘texts’. And now blog posts have become ‘blogs’.