Friday 9 April, 2021


Philip Mountbatten RIP

En passant. My hunch is that the first series of The Crown did wonders for Philip’s image with the general public, because it illustrated very clearly the challenges facing any sentient being trying to integrate with such a terminally dysfunctional family. In some ways, his experience was a dry run for that of Diana Spencer and — later — Meghan Markle. The difference is that Philip stayed the course.

I met him once — by chance. He was Chancellor of Cambridge University for a long time, and he came on a routine visit to my College. What was interesting was the astute way he worked the room. He got to say something to everyone. I was reminded — oddly enough — of Bill Clinton, who had the same knack.


Quote of the Day

“The American writer in the middle of the twentieth century has his hands full in trying to understand, describe, and then make credible much of American reality. It stupefies, it sickens, it infuriates, and finally it is even a kind of embarrassment to one’s own meager imagination.”

  • Philip Roth, 1960. Note the date.

Musical alternative to the morning’s radio news

Diana Krall | Narrow Daylight

Link


Lockdown listening

Lovely blog post by Quentin Stafford-Fraser.

“I don’t really listen to many podcasts now…”, I heard someone say recently, “…because I no longer have a commute”. This made me realise how different my listening habits would be if I didn’t have a spaniel to walk. (I’ve tried the commuting thing on occasion, by the way, and gave it up as a bad lot. Spaniels are better.)

Recommended. I used to have a long commute — and listened to a lot of podcasts as a result. Lacking a spaniel, I now listen to fewer.


The problem-solving strategy we generally overlook

People often limit their creativity by continually adding new features to a design rather than removing existing ones, says Diana Kwon in an interesting Scientific American essay.

This idea of overlooking the obvious reminds me of a salutary lesson I learned as a consultant many years ago. In the 1970s, a group of us in the Open University Systems group used to tackle multi-faceted problem-situations that arose in industrial companies. We used an approach known as Soft Systems Analysis, which involved identifying everyone in the organisation who had a role, however minor, in the problem-situation and working with that group on the analysis.

The first stage in the process involved collectively building a ‘Rich Picture’ — a shared representation of what was going wrong, which often involved a fairly gruelling set of group discussions. And early on we noticed a pattern: whenever a group was discussing a problem-situation, 90 per cent of the conversation was not about what was wrong, but involved people canvassing their personal candidates for a ‘solution’. It was a real struggle to get groups to focus on building a shared understanding of what was going wrong. And yet sometimes it was the construction of that Rich Picture that proved the key to making progress.


Time to regulate AI that interprets human emotions

The pandemic is being used as a pretext to push unproven artificial-intelligence tools into workplaces and schools.

Great OpEd in Nature by Kate Crawford.

In March, a citizens’ panel convened by the Ada Lovelace Institute in London said that an independent, legal body should oversee development and implementation of biometric technologies (see go.nature.com/3cejmtk). Such oversight is essential to defend against systems driven by what I call the phrenological impulse: drawing faulty assumptions about internal states and capabilities from external appearances, with the aim of extracting more about a person than they choose to reveal.

Countries around the world have regulations to enforce scientific rigour in developing medicines that treat the body. Tools that make claims about our minds should be afforded at least the same protection. For years, scholars have called for federal entities to regulate robotics and facial recognition; that should extend to emotion recognition, too. It is time for national regulatory agencies to guard against unproven applications, especially those targeting children and other vulnerable populations.

She uses the polygraph (lie-detector) as a case study in how flaky theories are used to justify tools — simply because they fit the only things that the tools can do. The same thing is now happening with certain applications of machine-learning.

Impressive piece.


Lack of gender and ethnic diversity in tech is not just societally damaging…

… it also leads to terrible design.

Striking Economist Leader:

Some things, you might think, are obvious. For example, if you design a device which shines light through someone’s fingertip to measure the oxygen level of their blood, then the colour of the skin through which that light is shining should be a factor when the device is calibrated.

But no. Research suggests that, with honourable exceptions, pulse oximeters, the machines which do this, overestimate oxygen levels three times more frequently (12% of the time) in people with black skin than in those with white. When this informs decisions on whom to admit to hospital during a pandemic, more black than white patients are sent home in the mistaken belief that their blood-oxygen levels are within a safe range. This could have fatal consequences.

The pulse oximeter is only the latest example of an approach to design which fails to recognise that human beings are different from one another. Other recent medical cases include an algorithm that gave white patients in America priority over those from racial minorities, and the discovery that implants such as prosthetic hips and cardiac pacemakers cause problems more often in women than in men.

Beyond medicine, there are many examples of this phenomenon in information technology: systems that recognise white faces but not black ones; legal software which recommends harsher sentences for black criminals than white; voice-activated programs that work better for men than women. Even mundane things like car seat-belts have often been designed with men in mind rather than women.
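To make the calibration point concrete: a pulse oximeter doesn’t measure oxygen directly. It derives a ratio, R, from its red and infrared light signals and looks the answer up on an empirical curve fitted on volunteer cohorts, which have historically been mostly light-skinned. Below is a toy Python sketch of how a small pigment-dependent shift in R becomes a systematic overestimate. The linear formula is a textbook rule of thumb, and the size of the melanin shift is invented purely for illustration; no real device works exactly like this.

```python
# Toy model of pulse-oximeter calibration. Illustrative only: the linear
# SpO2 = 110 - 25*R rule is a textbook approximation, and the melanin
# shift below is a made-up number, not a measured one.

def ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """The 'R' value a device derives from the pulsatile (AC) and
    steady (DC) components of its red and infrared signals."""
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def spo2(r):
    """Empirical calibration curve, fitted on a volunteer cohort."""
    return 110.0 - 25.0 * r

true_r = 1.0           # under this curve, R = 1.0 means SpO2 = 85%: hypoxic
melanin_shift = -0.08  # hypothetical shift in measured R for darker skin

print(spo2(true_r))                  # 85.0 -> what the curve was fitted for
print(spo2(true_r + melanin_shift))  # 87.0 -> same patient reads two points 'safer'
```

The arithmetic is trivial, which is rather the point: the error is systematic rather than random, always in the reassuring direction for the patients affected, and invisible unless the calibration cohort included them in the first place.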


The Technology of Bereavement

Sombre reflection by David Vincent on being unable to attend the funerals of friends.

We are particularly diminished by the sudden loss of our friends in Scotland. We began our careers and our families together, living and working alongside each other for three decades, and then regularly exchanging visits as our paths diverged. In John Donne’s terms, a full promontory has been washed away from our lives.

David and his wife were unable to attend the funerals, because of the rules governing ceremonies. So,

Instead we depend on Obitus, which describes itself as ‘a leading UK provider of bereavement technology services.’ The firm was apparently founded a decade ago, an indication that virtual mourning was not invented by Covid. It has expanded in the last year, working with funeral directors to connect the congregations unable to attend. We sit at home, three hundred miles away, equipped with a login and a password, and five minutes before the ceremony is due to begin, an empty, unnamed, funeral chapel appears on our screen. There is one fixed camera at the rear of the chapel, transmitting an unchanging view of the backs of twenty mourners. The sound quality is indifferent, the visual effects non-existent. After half an hour the congregation leaves separately, unable to attend a wake larger than six people, and we close the lid on the laptop. In a week’s time we will repeat the process for my godfather.

We had the same experience last year, when my mother-in-law was taken by Covid. Except that, in our case, the technology didn’t work — and when we eventually got a view of the chapel it was time for the next ceremony. I wrote about it in my lockdown diary.


This blog is also available as a daily email. If you think this might suit you better, why not subscribe? One email a day, delivered to your inbox at 7am UK time. It’s free, and there’s a one-click unsubscribe if you decide that your inbox is full enough already!