Will computer science have an ethical reckoning?

I came on an interesting series of tweets on Twitter by Yonatan Zunger (@yonatanzunger) triggered by reading the revelations in the Observer and New York Times about the way stolen Facebook data had been weaponised for use in the US presidential election by a rogue data scientist and an unscrupulous company. A key element in the story was a wide-ranging and astonishingly frank interview given to Observer journalist Carole Cadwalladr by Chris Wylie, the programmer who wrote the code used to exploit the data. I’ve assembled Zunger’s tweetstream in chronological order:

I didn’t come up in computer science; I used to be a physicist. That transition gives me a rather specific perspective on this situation: that computer science is a field which hasn’t yet encountered consequences.

Chemistry had two reckonings, in the late 19th and early 20th centuries: first with dynamite, and then with chemical weapons. Physics had its reckoning with the Bomb. These events completely changed the fields, and the way people come up in them.

Before then, both fields were dominated by hope: the ways that science could be used to make the world a fundamentally better place. New dyes, new materials, new sources of energy, new modes of transport; everyone could see the beauty.

Afterwards, everyone became painfully, continuously aware of how things could be turned against everything they ever dreamed of.

I don’t know the stories from chemistry as well. In physics, I can tell you that everyone, from their first days as an undergrad (or often before), encounters this and wrestles with it. They talk about it in the halls or late at night, they worry about it.

They occasionally even rap about it, like @acapellascience (a physicist, btw) did. (The lyrics are worth listening to carefully)

This isn’t to say that physicists are all pacifists. The rift between Edward Teller and J. R. Oppenheimer after the war was legendary, and both of them had very real reasons to believe what they did: Teller to love the Bomb, Oppenheimer to hate it.

(For those wondering: Teller was part of that generation of Central Europeans who saw exactly how bad things could get in so much detail. They were famous for their determination to make sure things were safe at all goddamned costs.)

They were infamously not messing around, even though they took a wide range of approaches to it; consider that Edward Teller, John von Neumann, Henry Kissinger, and George Soros were all part of that.

For a long time, it frightened me that biology hadn’t yet had this moment of reckoning — that there hadn’t yet been an incident which seared the importance of ethics and consequences into the hearts of every young scientist. Today, it frightens me more about computer scientists.

Young engineers treat ethics as a speciality, something you don’t really need to worry about; you just need to learn to code, change the world, disrupt something. They’re like kids in a toy shop full of loaded AK-47s.

The hard lesson which other fields had to learn was this: you can never ignore that for a minute. You can never stop thinking about the uses your work might be put to, the consequences which might follow, because the worst case is so much worse than you can imagine.

Even what Chris Wylie did is only the beginning. You hand authoritarian regimes access to modern data science, and what happens? You create the tools of a real panopticon, and what happens?

Those of you in CS right now: if you don’t know if what I’m saying makes sense, pick up Richard Rhodes’ “The Making of the Atomic Bomb.” It’s an amazingly good book in its own right, and you’ll get to know both the people and what happened.

Think about this problem like SREs, like safety engineers. Scope your failure modes out to things involving bad actors using the systems you’re building. Come up with your disaster response exercises.

If you can do it without wanting to hide under a table, you’re not thinking hard enough. There are worse failure modes, and they’re coming for you. And you will be on deck to try to mitigate them. //

Short postscript: As several people have pointed out, many fields of biology have had these reckonings (thanks to eugenics and the like), and civil engineering did as well, with things like bridge collapses in the late 19th century.

LATER: Zunger wrote all this more elegantly in the Boston Globe.

To understand Trump, read Plato’s ‘Republic’

It’s as clear as day that Trump is getting ready to fire Robert Mueller, the Special Counsel who is remorselessly closing in on him. Liberals who gleefully believe that this will be the same kind of disastrous mistake that Richard Nixon made when he fired Archibald Cox are in for a sharp disappointment. Nixon’s action made impeachment a certainty (which is why he resigned before he could be drummed out of office by an irate Congress). But the current Republican-controlled Congress will do nothing when Mueller is fired. Which means that Trump has to do it before the mid-term elections in November.

Trump’s critics see the remarkable level of personnel ‘churn’ in the White House as evidence of the president’s dysfunctional impulsiveness and narcissism, which make it impossible for him to get anything done. I don’t share that comforting thought. There is, I’m afraid, method in his madness.

Andrew Sullivan, the veteran observer of these things, thinks so too. Since Trump doesn’t read (indeed may have difficulty reading), he obviously hasn’t read Plato’s Republic, but Sullivan has, and he sees it as an operating manual for Trumpism because it describes “how a late-stage democracy, dripping with decadence and corruption, with elites dedicated primarily to enriching themselves, and a people well past any kind of civic virtue, morphs so easily into tyranny.”

When Plato’s tyrant first comes to power — on a wave of populist hatred of the existing elites — there is a period of relative calm when he just gives away stuff: at first he promises much “in private and public, and grant[s] freedom from debts and distribute[s] land to the people and those around himself” (or, say, a trillion-dollar unfunded tax cut). He aims to please. But then, as he accustoms himself to power, and feels more comfortable, “he suspects certain men of having free thoughts and not putting up with his ruling … Some of those who helped in setting him up and are in power — the manliest among them — speak frankly to him and to one another, criticizing what is happening … Then the tyrant must gradually do away with all of them, if he’s going to rule, until he has left neither friend nor enemy of any worth whatsoever.”

This is the second phase of tyranny, after the more benign settling-in: the purge. Any constraints that had been in place to moderate the tyrant’s whims are set aside; no advice that counters his own gut impulses can be tolerated. And so, over the last couple of weeks, we have seen the president fire Rex Tillerson and Andrew McCabe, two individuals who simply couldn’t capitulate to the demand that they obey only Trump, rather than the country as well.

And because of this small gesture of defiance, they deserved especial public humiliation. Tillerson was warned of his impending doom while on the toilet — a nice, sadistic touch. McCabe was fired hours before his retirement, a public execution also fraught with venom. What kind of man is this? We have become numb to it, but we should never forget how our president is a man who revels in his own cruelty. Revenge is not a dish best served cold for him. It’s the reddest and rawest of meats.

On this reading, the firing of Tillerson and his replacement (“by a fawning toady, Mike Pompeo, a man whose hatred of Islam is only matched by his sympathy for waterboarders”) makes perfect sense. So too does the fact that Pompeo has in turn been replaced at the CIA by Gina Haspel (“a war criminal, who authorized brutal torture and illegally destroyed the evidence”).

And then there’s the replacement of H.R. McMaster by John Bolton, a nutter who has never seen a war that he didn’t like. And this too, says Sullivan, follows Plato’s playbook:

And this, of course, is also part of the second phase for Plato’s tyrant: war. “As his first step, he is always setting some war in motion, so that people will be in need of a leader,” Plato explains. In fact, “it’s necessary for a tyrant always to be stirring up war.”

This is simultaneously scary and persuasive because it suggests that we are indeed heading for war — first with North Korea and then with Iran. Those mid-term elections have never been more important.

Why Facebook can’t change

My €0.02-worth on the bigger story behind the Cambridge Analytica shenanigans:

Watching Alexander Nix and his Cambridge Analytica henchmen bragging on Channel 4 News about their impressive repertoire of dirty tricks, the character who came irresistibly to mind was Gordon Liddy. Readers with long memories will recall him as the guy who ran the “White House Plumbers” during the presidency of Richard Nixon. Liddy directed the Watergate burglary in June 1972, detection of which started the long chain of events that eventually led to Nixon’s resignation two years later. For his pains, Liddy spent more than four years in jail, but went on to build a second career as a talk-show host and D-list celebrity. Reflecting on this, one wonders what job opportunities – other than those of pantomime villain and Savile Row mannequin – will now be available to Mr Nix.

The investigations into the company by Carole Cadwalladr, in the Observer, reveal that in every respect save one important one, CA looks like a standard-issue psychological warfare outfit of the kind retained by political parties – and sometimes national security services – since time immemorial. It did, however, have one unique selling proposition, namely its ability to offer “psychographic” services: voter-targeting strategies allegedly derived by analysing the personal data of more than 50 million US users of Facebook.

The story of how those data made the journey from Facebook’s servers to Cambridge Analytica’s is now widely known. But it is also widely misunderstood…

Read on

Enlightenment, what enlightenment?

Sam Moyn is not impressed by Steven Pinker’s new book – Enlightenment Now:

In laying out his vision of betterment in Enlightenment Now, Pinker confronts alternative trends and looming threats for progress only in order to brush them off. He does not take seriously the risk of major catastrophes, such as the collapse of a recent era of peace or the outbreak of a global pandemic, which he believes is easy to magnify beyond reason. As for environmental degradation, humanity will surely find a way to counteract this in time. “As the world has gotten richer,” Pinker explains, “nature has begun to rebound”—as if the failure of a few prophecies of ecological disaster to come to pass on schedule means the planet is infinitely resilient. Once he gets around to acknowledging that climate change is an actual problem, Pinker spends much of his time attacking “climate justice warriors” for their anti-capitalist hysteria.

Lots more in that sceptical vein. Worth reading in full.

Facebook’s sudden attack of modesty

One of the most illuminating things you can do as a researcher is to go into Facebook not as a schmuck (i.e. user) but as an advertiser — just like your average Russian agent. Upon entering, you quickly begin to appreciate the amazing ingenuity and comprehensiveness of the machine that Zuckerberg & Co have constructed. It’s utterly brilliant, with a great user interface and lots of automated advice and help for choosing your targeted audience.

When doing this a while back — a few months after Trump’s election — I noticed that there was a list of case studies of different industries showing how effective a given targeting strategy could be in a particular application. One of those ‘industries’ was “Government and Politics” and among the case studies was a story of how a Facebook campaign had proved instrumental in helping a congressional candidate to win against considerable odds. I meant to grab some screenshots of this uplifting tale, but of course forgot to do so. When I went back later, the case study had, well, disappeared.

Luckily, someone else had the presence of mind to grab a screenshot. The Intercept, bless it, has the before-and-after comparison shown in the image above. They are Facebook screenshots from (left) June 2017 and (right) March 2018.

Interesting, n’est-ce pas?

In surveillance capitalism, extremism is good for business

This morning’s Observer column:

Zeynep Tufekci is one of the shrewdest writers on technology around. A while back, when researching an article on why (and how) Donald Trump appealed to those who supported him, she needed some direct quotes from the man himself and so turned to YouTube, which has a useful archive of videos of his campaign rallies. She then noticed something interesting. “YouTube started to recommend and ‘autoplay’ videos for me,” she wrote, “that featured white supremacist rants, Holocaust denials and other disturbing content.”

Since Tufekci was not in the habit of watching far-right fare on YouTube, she wondered if this was an exclusively rightwing phenomenon. So she created another YouTube account and started watching Hillary Clinton’s and Bernie Sanders’s campaign videos, following the accompanying links suggested by YouTube’s “recommender” algorithm. “Before long,” she reported, “I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of 11 September. As with the Trump videos, YouTube was recommending content that was more and more extreme.”

Read on

Why scorning Trump didn’t work

While thinking about Trump this morning, I came on this astute observation by P.J. O’Rourke. He’s right: Establishment scorn of Trump was as toxic as Hillary Clinton’s reference to his supporters as “deplorables”. This lesson has still to be learned by many ‘Remain’ supporters in the UK.

How to stay sane on Twitter: ignore retweets

This morning’s Observer column:

When Twitter first broke cover in July 2006, the initial reaction in the non-geek community was derisive incredulity. First of all, there was the ludicrous idea of a “tweet” – not to mention the metaphor of “twittering”, which, after all, is what small birds do. Besides, what could one usefully say in 140 characters? To the average retired colonel (AKA Daily Telegraph reader), Twitter summed up the bird-brained frivolity of the internet era, providing further evidence that the world was going to the dogs.

And now? It turns out that the aforementioned colonel might have been right. For one of the things you can do with a tweet is declare nuclear war. Another thing you can do with Twitter is to bypass the mainstream media, ignore the opinion polls, spread lies and fake news without let or hindrance and get yourself elected president of the United States.

How did it come to this?

Read on

The bad news about false news

The most comprehensive study to date of misinformation on Twitter is out. The Abstract reads:

We investigated the differential diffusion of all of the verified true and false news stories distributed on Twitter from 2006 to 2017. The data comprise 126,000 stories tweeted by 3 million people more than 4.5 million times. We classified news as true or false using information from six independent fact-checking organizations that exhibited 95 to 98% agreement on the classifications. Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information. We found that false news was more novel than true news, which suggests that people were more likely to share novel information. Whereas false stories inspired fear, disgust, and surprise in replies, true stories inspired anticipation, sadness, joy, and trust. Contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it.
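
A note on the metrics: “deeper” and “more broadly” refer to properties of retweet cascades. Depth is the longest chain of retweets-of-retweets back to the original tweet; breadth is the largest number of users at any single hop. Here is a minimal sketch of how such metrics could be computed, assuming each cascade is available as a simple retweet-to-parent mapping. The function and data layout are illustrative only, not the paper’s actual code or dataset format.

```python
from collections import defaultdict

def cascade_metrics(parents):
    """Compute depth, max breadth, and size of a retweet cascade.

    `parents` maps each retweet id to the id of the tweet it retweeted;
    the original tweet is the one that never appears as a key.
    (Illustrative layout only -- not the dataset format used in the study.)
    """
    depth = {}  # depth of a node = number of hops back to the original tweet

    def node_depth(node):
        if node not in parents:          # the original tweet
            return 0
        if node not in depth:
            depth[node] = 1 + node_depth(parents[node])
        return depth[node]

    levels = defaultdict(int)            # how many users sit at each depth
    for node in parents:
        levels[node_depth(node)] += 1
    levels[0] += 1                        # count the original tweet itself

    cascade_depth = max(levels)           # longest retweet chain
    max_breadth = max(levels.values())    # most users at any single hop
    size = sum(levels.values())           # total users who tweeted the story
    return cascade_depth, max_breadth, size

# Toy example: A is the original tweet; B and C retweet A; D retweets B.
print(cascade_metrics({"B": "A", "C": "A", "D": "B"}))  # (2, 2, 4)
```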