Quote of the Day
“A science is said to be useful if its development tends to accentuate the existing inequalities in the distribution of wealth, or more directly promotes the destruction of human life.”
- G.H. Hardy, in A Mathematician’s Apology
Musical replacement for the morning’s radio news
Leonard Cohen – Hallelujah (Live In London)
Many Top AI Researchers Get Financial Backing From Big Tech
Surprise, surprise. Interesting story in Wired.
Mohamed and Moustafa Abdalla, two brothers who are graduate students at the University of Toronto, embarked on an interesting mini-project. They looked at how many AI researchers at Stanford, MIT, UC Berkeley, and the University of Toronto have received funding from Big Tech over their careers. They examined the CVs of 135 computer science faculty who work on AI at the four schools, looking for indications that the researcher had received funding from one or more tech companies.
For 52 of those, they couldn’t make a determination. Of the remaining 83 faculty, they found that 48, or 58 percent, had received funding such as a grant or a fellowship from one of 14 large technology companies: Alphabet, Amazon, Facebook, Microsoft, Apple, Nvidia, Intel, IBM, Huawei, Samsung, Uber, Alibaba, Element AI, or OpenAI. Among a smaller group of faculty who work on AI ethics, 58 percent had likewise been funded by Big Tech. When any source of funding was counted, including dual appointments, internships, and sabbaticals, 32 of 33 faculty, or 97 percent, had financial ties to tech companies. “There are very few people that don’t have some sort of connection to Big Tech,” Abdalla says.
Abdalla says industry funding is not necessarily compromising, but he worries that it might have some influence, perhaps discouraging researchers from pursuing certain projects or prompting them to agree with solutions proposed by tech companies. Provocatively, the Abdallas’ paper draws parallels between Big Tech funding for AI research and the way tobacco companies paid for research into the health effects of smoking in the 1950s.
Their paper, “The Grey Hoodie Project: Big Tobacco, Big Tech, and the threat on academic integrity,” is on arXiv.
The abstract reads:
As governmental bodies rely on academics’ expert advice to shape policy regarding Artificial Intelligence, it is important that these academics not have conflicts of interests that may cloud or bias their judgement. Our work explores how Big Tech is actively distorting the academic landscape to suit its needs. By comparing the well-studied actions of another industry, that of Big Tobacco, to the current actions of Big Tech we see similar strategies employed by both industries to sway and influence academic and public discourse. We examine the funding of academic research as a tool used by Big Tech to put forward a socially responsible public image, influence events hosted by and decisions made by funded universities, influence the research questions and plans of individual scientists, and discover receptive academics who can be leveraged. We demonstrate, in a rigorous manner, how Big Tech can affect academia from the institutional level down to individual researchers. Thus, we believe that it is vital, particularly for universities and other institutions of higher learning, to discuss the appropriateness and the tradeoffs of accepting funding from Big Tech, and what limitations or conditions should be put in place.
When one raises the question of relationships with Big Tech companies with some academics, the general response is that there’s nothing to see here. Prominent medical researchers with links to Big Pharma give the same response. Nothing to see here, move along. Until, of course, there is something to see.
Face masks: what the data say
One of the strangest (and most annoying) aspects of the pandemic as it evolved was the reluctance of the government’s scientific advisers to recommend the wearing of non-N95 face masks. People who decided to make their own and wear them were regarded in many places as cranks. And now masks are mandatory in shops and other buildings. So somewhere along the line crankiness became Holy Writ. And of course in the US, under the tutelage of Donald Trump, refusing to wear a mask became a test of masculinity or patriotism, or both. (Or a litmus test for idiocy.)
I always thought that the issue was a bit like Pascal’s Wager: it was unlikely to do one harm, and might do some good, so why not wear one?
Now I find a paper in Nature, no less, saying “The science supports that face coverings are saving lives during the coronavirus pandemic, and yet the debate trundles on. How much evidence is enough?”
Security flaw left ‘smart’ chastity sex toy users at risk of permanent lock-in
There’s a long list of things I don’t understand about this, but here goes:
Security researchers have discovered that a major security flaw in one popular sex toy could have been catastrophic for tens of thousands of users.
U.K.-based security firm Pen Test Partners said the flaw in the Qiui Cellmate internet-connected chastity lock, billed as the “world’s first app controlled chastity device,” could have allowed anyone to remotely and permanently lock in the user’s penis.
The Cellmate chastity lock works by allowing a trusted partner to remotely lock and unlock the chamber over Bluetooth using a mobile app. That app communicates with the lock using an API. But that API was left open and without a password, allowing anyone to take complete control of any user’s device.
Because the chamber was designed to lock with a metal ring underneath the user’s penis, the researchers said it may require the intervention of a heavy-duty bolt cutter or an angle grinder to free the user.
At first I assumed it was a spoof — “Middle Ages meets smartphone era”. But apparently not.
And this thing is, apparently, a toy.
This blog is also available as a daily email. If you think this might suit you better, why not subscribe? One email a day, delivered to your inbox at 7am UK time. It’s free, and there’s a one-click unsubscribe if you decide that your inbox is full enough already!