- The Ship of State: A Conversation with Dave Eggers Good interview by Tom Lutz with Dave Eggers about his new book, The Captain and the Glory: An Entertainment. It’s a satire about Trump, if you believe such a thing is possible. And I’ve pre-ordered a copy. (It’s out on December 5.)
- The impact of direct air carbon capture on climate change A fascinating, honest, thoughtful essay on whether carbon capture might be a feasible (and affordable) way of mitigating climate change by taking CO2 out of the atmosphere.
- Is Elon Musk preparing for the failure of the state? Interesting macro-interpretation of the projects in which he has major investments. Note also that the new Tesla pickup truck is bulletproof.
Tech commentary and gender
This morning’s Observer column:
Reading the observations of these three women brought to the surface a thought that’s been lurking at the back of my mind for years. It is that the most trenchant and perceptive critiques of digital technology – and particularly of the ways in which it has been exploited by tech companies – have come from female commentators. The thought originated ages ago as a vague impression, then morphed into an intuitive correlation and eventually surfaced as a conjecture that could be examined.
So I spent a few hours going through a decade’s worth of electronic records – reprints, notes and links. What I found is an impressive history of female commentary and a gallery of more than 20 formidable critics…
Linkblog
- Facebook and Google’s pervasive surveillance poses an unprecedented danger to human rights The Amnesty Report.
- After Uber arrives, heavy drinking increases Surprise, surprise.
- The global population pyramid: how global demography has changed and what we can expect for the 21st century
- On the sentience of animals “You know the story: you cast your vote against animal sentience and you feel it’s reasonable to do so, but then you have to go home and undress in front of the cat.” Lovely essay by Daisy Hildyard in the London Review of Books.
THANKS to Mark, the reader who spotted the typo in the second link. Now fixed.
Tesla’s new pickup truck
No, this is not a spoof. It’s apparently what Elon Musk thinks will appeal to rural dwellers in the US. It’s not clear where they will store their assault rifles.
It comes from Ars Technica so it must be true.
Linkblog
- John Cassidy’s New Yorker report on the testimony of Fiona Hill to the Trump impeachment inquiry
- Amazon will pay $0 in US taxes on $11,200,000,000 in profit for 2018 Yep, you read that correctly.
- What Can Americans Learn from Germany’s Reckoning with the Holocaust? Well, they could start with coming to terms with slavery and its legacy.
- On being a newbie in America Nice meditation by Dave Winer, who’s recently moved to Woodstock from NYC.
‘Middlemarch’ then and now
Today is the 200th anniversary of the birth of Mary Ann Evans, a woman whom we all know better as George Eliot. The New Yorker has a lovely essay by Rebecca Mead about Eliot and in particular about her great novel Middlemarch. Mead has already written a book about her own encounters with that novel — how she saw it differently each time she returned to it at different points in her own life. Middlemarch, she says, “is a book that grows with the reader as the reader grows, which is why, two hundred years after Eliot’s birth, a reader can find it always has something to say to her or to him.”
But now she sees it in another, contemporary, light:
Lately, though, I have found myself thinking less about Eliot’s depiction of individual characters and more about the novel’s subtitle, “A Study of Provincial Life.” When Eliot set out to write “Middlemarch,” what she seemed to have in mind was a panoramic examination of a small town and its inhabitants that would capture not just the stories of individuals but would also say something about the way a community works, and about the state of the nation. “I am delighted to hear of a Novel of English Life having taken such warm possession of you,” her publisher, John Blackwood, remarked, when Eliot conveyed her intentions to him. Revisiting “Middlemarch” in the England of 2019—a year in which Britain was due to leave the European Union but instead has been mired in parliamentary paralysis, which the forthcoming election may or may not resolve—Eliot’s ironic observations about the electoral system have a new piquancy, and her representation of the innate conservatism of English provincial life has a topical relevance.
The parallel Mead sees is between the current UK government’s attempts to leave the European Union and the first Reform Bill of 1832. She focuses on one of the lesser characters in Middlemarch, Mr. Brooke, Dorothea Brooke’s uncle and guardian, a comfortable member of the landed gentry who decides to run for office under the banner of Reform.
“There is no part of the country where opinion is narrower than it is here,” Mr. Brooke tells a reproving neighbor, Mrs. Cadwallader, the rector’s wife. Eliot shows, however, that Mr. Brooke’s commitment to reform is, at best, insubstantial. Having read theorists whose ideas underlie the movement, Mr. Brooke is inclined to ideas of liberalism, but, being a comfortable member of the landed gentry, his instincts are less than disruptive. (“Let Brooke reform his rent roll. He’s a cursed old screw, and the buildings all over his estate are going to rack,” one of the burghers of Middlemarch scathingly observes, when Brooke announces his forthcoming platform.) “This Reform will touch everybody by-and-by—a thoroughly popular measure—a sort of A, B, C, you know, that must come first before the rest can follow,” Mr. Brooke argues, to a voter, with “a sense of being a little out at sea, though finding it still enjoyable.” The hallmarks of Mr. Brooke’s character, and of his political campaign, are an inconsistency of mind and an absence of intellectual rigor.
Well, well. Which contemporary political figure does that bring to mind?
Linkblog
- Think twice before plugging your phone into a public charging socket It’s such a wicked world: people exploiting desperate smartphone owners who are running low on battery. But there is a fix — a “USB condom”.
- How Turkish coffee destroyed an empire Basically by getting men out of the house to places where they could debate and scheme.
- Australian law should treat social media companies as publishers, says Attorney-General So much for Section 230 then.
‘Don’t be Evil’ changes to ‘Don’t ask me anything’
From Steven Levy, who knows as much about Google as any outsider:
Last week, Google CEO Sundar Pichai sent an email blast to his 100,000 or so employees, cutting back the company’s defining all-hands meeting known as TGIF. The famous free-for-alls had epitomized the company’s egalitarian ethos, a place where employees and leaders could talk freely about nearly anything. More recently, however, the biweekly meeting had become fraught as it increasingly reflected Google’s tensions as opposed to its aspirations. “It’s not working in its current form,” Pichai said of what was once the hallmark of Google culture. In 2020, he declared, the meetings would be limited to once a month, and they would be more constrained affairs, sticking to “product and business strategy.” Don’t Be Evil has changed to Don’t Ask Me Anything.
It was inevitable, really. You can’t run a giant company as if it were a small startup.
Linkblog
- Dealing With Bias in Artificial Intelligence Three women with extensive experience in A.I. speak on the topic and how to confront it.
- History as a giant data set: how analysing the past could help save the future Fascinating long read about the theory that human history goes in cycles.
- The Royal Mint has melted down a million Brexit commemorative coins created for October 31, 2019 One of the lesser consequences of Brexit madness.
- The collapse of the information ecosystem poses profound risks for humanity Intriguing metaphor: disinformation is to our media ecosystem as fossil fuels are to the environment.
Bias in machine learning
Nice example from Daphne Koller:
Another notion of bias, one that is highly relevant to my work, are cases in which an algorithm is latching onto something that is meaningless and could potentially give you very poor results. For example, imagine that you’re trying to predict fractures from X-ray images in data from multiple hospitals. If you’re not careful, the algorithm will learn to recognize which hospital generated the image. Some X-ray machines have different characteristics in the image they produce than other machines, and some hospitals have a much larger percentage of fractures than others. And so, you could actually learn to predict fractures pretty well on the data set that you were given simply by recognizing which hospital did the scan, without actually ever looking at the bone. The algorithm is doing something that appears to be good but is actually doing it for the wrong reasons. The causes are the same in the sense that these are all about how the algorithm latches onto things that it shouldn’t latch onto in making its prediction.
To recognize and address these situations, you have to make sure that you test the algorithm in a regime that is similar to how it will be used in the real world. So, if your machine-learning algorithm is one that is trained on the data from a given set of hospitals, and you will only use it in those same set of hospitals, then latching onto which hospital did the scan could well be a reasonable approach. It’s effectively letting the algorithm incorporate prior knowledge about the patient population in different hospitals. The problem really arises if you’re going to use that algorithm in the context of another hospital that wasn’t in your data set to begin with. Then, you’re asking the algorithm to use these biases that it learned on the hospitals that it trained on, on a hospital where the biases might be completely wrong.
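Here is a minimal, synthetic sketch of the effect Koller describes, using scikit-learn; the hospitals, fracture rates and feature names are all invented for illustration, not taken from the interview. A model evaluated with an ordinary random train/test split looks accurate because it can exploit a machine-specific artefact that effectively encodes which hospital produced the image, but the same model evaluated by holding out whole hospitals does much worse.

```python
# Synthetic illustration of the "hospital shortcut" failure mode: each
# hospital's X-ray machine leaves a distinctive artefact in the image, and
# fracture rates differ sharply between hospitals, so a model can score well
# on a random split without ever really "looking at the bone".
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n = 4000
hospital = rng.integers(0, 4, size=n)             # which hospital produced the scan
fracture_rate = np.array([0.9, 0.2, 0.8, 0.1])    # invented per-hospital base rates
y = rng.binomial(1, fracture_rate[hospital])      # "fracture" labels

# Two features: a weak genuine signal from the bone itself, and a machine
# artefact that effectively encodes which hospital made the image.
genuine_signal = y + rng.normal(0.0, 3.0, size=n)           # weakly informative
machine_artefact = hospital + rng.normal(0.0, 0.1, size=n)  # near-perfect hospital code
X = np.column_stack([genuine_signal, machine_artefact])

model = RandomForestClassifier(n_estimators=100, random_state=0)

# Random split: the model can exploit the artefact, so accuracy looks impressive.
random_split_acc = cross_val_score(model, X, y, cv=5).mean()

# Hold out one hospital at a time: the shortcut no longer transfers,
# and accuracy collapses.
held_out_acc = cross_val_score(
    model, X, y, cv=LeaveOneGroupOut(), groups=hospital
).mean()

print(f"Random-split accuracy:      {random_split_acc:.2f}")
print(f"Held-out-hospital accuracy: {held_out_acc:.2f}")
```

The leave-one-hospital-out evaluation is the “test in a regime similar to how it will be used” discipline the quote recommends: if the model will be deployed in hospitals it has never seen, the evaluation should confront it with hospitals it has never seen.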