Not a post office scandal
Merely a New Year knitted top for a postbox in Ely!
Quote of the Day
“Life must be understood backwards. But with this, one forgets the second proposition, that it must be lived forwards.”
- Søren Kierkegaard
Musical alternative to the morning’s radio news
Richard Strauss | Four Last Songs, TrV 296 – 4. Im Abendrot | Jessye Norman
I love this. The songs were composed in 1948, when Strauss was 84, and premiered at the Albert Hall in London on 22 May 1950 by soprano Kirsten Flagstad and the Philharmonia Orchestra, conducted by Furtwängler.
Long Read of the Day
AI rights and human harms
Terrifically sharp essay by Helen Beetham (Whom God Preserve), a writer who takes no prisoners.
A possible captive in this context is Jeff Jarvis, a distinguished journalist and academic (and author of an interesting book, The Gutenberg Parenthesis, which I’ve read and enjoyed). When generative AI arrived on the scene, Jeff was excited about its possibilities for journalism and penned an essay cautioning us against imposing unreasonable restrictions on the training of LLMs (Large Language Models) like GPT-4.
His piece included the following stirring paragraph:
To this day, journalists — whether on radio or at The New York Times — read, learn from, and repurpose facts and knowledge gained from the work of fellow journalists. Without that assured freedom, newspapers and news on television and radio and online could not function. The real question at hand is whether artificial intelligence should have the same right that journalists and we all have: the right to read, the right to learn, the right to use information once known. If it is deprived of such rights, what might we lose?
That last rhetorical question is what irked Helen. “Whose rights are really at risk?” she asks:
Who or what is being ‘deprived’ of development? If we read closely, it is not the models at all, but ‘we’ who will ‘lose out’ if AI is not allowed to ‘learn’. This is not a coherent moral position. If models have rights, it can only be on their own behalf: their rights must relate to their own needs and purposes and vulnerability to ‘loss’, not to anyone else’s.
So what passes for moral philosophy in Silicon Valley really amounts to this: let big tech get on with doing big tech, without annoyances like legal frameworks and workers’ rights. The very last thing these corporations want is a new class of entities with rights they might have to worry about. They don’t want to give up valuable server space to failed or defunct models just because they ‘learned’ or once passed some spurious test of ‘sentience’: they want to decommission the heck out of them and make way for something more profitable. That is hardly a rights-respecting relationship. No, the models that big tech really cares about are business models and the thing they want to be accorded more rights, power and agency is the business itself.
Warming to her task, she exhumes an essay in a special issue of Robotics and AI about whether robots should have moral standing. “The essay,” she writes,
uses the examples of ‘servants’, ‘slaves’ and ‘animals’ to argue that what matters is how ‘virtuously’ the ‘owner’ behaves towards those in his power. The lived experience of slavery does briefly appear – so props to the author for realising that there might be an issue here – but in the end only to lament that the robot-slave metaphor is ‘limited’ by the unhappy particulars. Not that the ‘virtuous slave owner’ is a problematic moral guide. Not that human slavery should conscientiously be avoided as a metaphor for something else, such as the rights of non-human machines.
You are free to use the metaphors you choose, guys, but your choices betray your perspective. And in all these cases, the perspective is from someone with power. The power to choose, the power to behave nicely, or not so nicely, towards other people, women, servants, slaves, animals, chatbots, substrates. What these choices give away is a complete lack of understanding of the agency, the consciousness, the realities and perspectives and struggles of other people. The puzzle you can see lurking behind these examples is: where did all these rights of non-white non-guys come from? And the answer: it can only have been from the enlightened virtue of the white guys in charge. They decided that women deserved the vote, that slaves should be free. And in exactly the same way, they can decide to endow rights, privileges, consciousness even, to things they have created from their own incredible brains.
There’s lots more in that vein, which makes for a striking, exhilarating read.
My commonplace booklet
Our Rodent Selfies, Ourselves
From the New York Times…
A photographer trained two rats to take photographs of themselves. Guess what: They didn’t want to stop.
(Instagram and TikTok users, look away now.)
Augustin Lignier, a professional photographer, began to wonder why so many humans feel compelled to photograph their lives and share those images online.
So he built his own version of a Skinner box — a tall, transparent tower with an attached camera — and released two pet-store rats inside. Whenever the rats pressed the button inside the box, they got a small dose of sugar and the camera snapped their photo. The resulting images were immediately displayed on a screen, where the rats could see them. (“But honestly I don’t think they understood it,” Mr. Lignier said.)
Do read it. And reflect. We are the rats in the Skinner boxes devised by Meta, ByteDance, Google & Co. But at least Mr Lignier’s rodents didn’t have better things to do with their lives. We, on the other hand, do. Go figure.
Linkblog
Something I noticed, while drinking from the Internet firehose.
- Machine learning (‘AI’) is coming for your signature. Link
Just as well we’ve stopped writing cheques.
This Blog is also available as an email three days a week. If you think that might suit you better, why not subscribe? One email on Mondays, Wednesdays and Fridays delivered to your inbox at 6am UK time. It’s free, and you can always unsubscribe if you conclude your inbox is full enough already!