Quote of the Day

“The problem will never be solved, if solving it means getting rid of all the bad stuff, because we can’t agree on what the bad stuff is. Knowing that things won’t be perfect, what do we feel is most desirable? A system that errs on the side of caution, or one that errs on the side of being permissive?”

Rasmus Nielsen, Reuters Institute, Oxford.

So what happened to “don’t be evil”? Do you have to ask?

From Wired:

Two employee activists at Google say they have been retaliated against for helping to organize a walkout among thousands of Google workers in November, and are planning a “town hall” meeting on Friday for others to discuss alleged instances of retaliation.

In a message posted to many internal Google mailing lists Monday, Meredith Whittaker, who leads Google’s Open Research, said that after the company disbanded its external AI ethics council on April 4, she was told that her role would be “changed dramatically.” Whittaker said she was told that, in order to stay at the company, she would have to “abandon” her work on AI ethics and her role at AI Now Institute, a research center she co-founded at New York University.

Claire Stapleton, another walkout organizer and a 12-year veteran of the company, said in the email that two months after the protest she was told she would be demoted from her role as marketing manager at YouTube and lose half her reports. After escalating the issue to human resources, she said she faced further retaliation. “My manager started ignoring me, my work was given to other people, and I was told to go on medical leave, even though I’m not sick,” Stapleton wrote. After she hired a lawyer, the company conducted an investigation and seemed to reverse her demotion. “While my work has been restored, the environment remains hostile and I consider quitting nearly every day,” she wrote.

The only thing that’s surprising about this is that anybody should be surprised. Google is a corporation, and therefore a sociopathic entity that does only what’s in its interests. And having a senior AI researcher co-found an independent institute that is doing good work interrogating the ethical basis of AI is definitely NOT in the company’s interests.

Footnote: Famously, Google’s unofficial motto was “don’t be evil.” But that’s over, according to the code of conduct that the company distributes to its employees. According to Gizmodo, archives hosted by the Wayback Machine show that the phrase was removed sometime in late April or early May 2018.

Toxic tech?

This morning’s Observer column:

The headline above an essay in a magazine published by the Association for Computing Machinery (ACM) caught my eye. “Facial recognition is the plutonium of AI”, it said. Since plutonium – a by-product of uranium-based nuclear power generation – is one of the most toxic materials known to humankind, this seemed like an alarmist metaphor, so I settled down to read.

The article, by a Microsoft researcher, Luke Stark, argues that facial-recognition technology – one of the current obsessions of the tech industry – is potentially so toxic for the health of human society that it should be treated like plutonium and restricted accordingly. You could spend a lot of time in Silicon Valley before you heard sentiments like these about a technology that enables computers to recognise faces in a photograph or from a camera…

Read on

Quote of the Day

“The age of party democracy has passed. Although the parties themselves remain, they have become so disconnected from the wider society, and pursue a form of competition that is so lacking in meaning, that they no longer seem capable of sustaining democracy in its present form.”

Peter Mair, Ruling the Void

As she used to be

From a National Geographic photograph by Eric Kruszewski.

Source

It’s not all bad news. The wonderful (and, sadly, late) Andrew Tallon made an intensive and comprehensive laser scan of the entire building some years ago. Alexis Madrigal tells the story here. So a reference blueprint (should that be dataprint?) exists from which restorers can work.