Apologies for intruding on your Saturday morning.
Thank you for the messages wondering why my Observer columns had suddenly disappeared from this newsletter — and indeed apparently also from the Web. Some wondered darkly if it might have had something to do with the fact that the Guardian had sold the Observer to Tortoise Media?
If it’s any consolation, I too initially wondered what had been going on.
As it happened, the move from one proprietor to another was involved, but the disappearance of the columns from the online edition of the paper was accidental. Tortoise had accomplished a herculean task in creating a new Observer website from scratch under fierce deadline pressure, and tagging my ‘Networker’ column seems to have been one of the jobs that was accidentally overlooked.
So here are two of the ‘missing’ columns.
AI can crunch data but to evolve, it needs the human factor – learning by experience
Generative artificial intelligence has taken another step forward with chatbot maker OpenAI’s latest model, but it will only become truly smart by interacting with its environment.

OpenAI, that curious profit-making nonprofit oxymoron run by Sam Altman, recently released its newest large language model (LLM), coyly named o3. Cue the usual chorus of superlatives from Altman’s admirers. Tyler Cowen, a prominent economist who should know better, kicked off early on the theme of artificial general intelligence (AGI). “I think it is AGI, seriously,” quoth he. “Try asking it lots of questions, and then ask yourself: just how much smarter was I expecting AGI to be?”
So, I dutifully asked it lots of questions, and pushed back a bit on some of its answers, and found it quicker and a bit slicker than other LLMs I regularly use. It’s multimodal – that is, it handles text, images, and audio input and output. It produces near-human speech, engages in lively interactions, and seems quite good at the kind of knowledge tasks that researchers use to test LLMs.
It can “see”, and seems to understand, images (charts, graphs, diagrams, photographs). But was it as close to AGI as Cowen thinks? Answer: no – unless one accepts the ultra-narrow definition of AGI that OpenAI uses; that it can “outperform humans at most economically valuable work”. The “G” in AGI is still missing…
Why US scientists are suddenly using ‘burner’ phones (please destroy after reading)
The National Science Foundation (NSF) is a supposedly independent agency of the US federal government that was set up in 1950 to support fundamental research and education in all the non-medical fields of science and engineering.
Today, it funds about a quarter of all federally supported basic research that goes on in American colleges and universities. In some fields, such as mathematics, computer science, economics and the social sciences, the NSF is the main source of federal funding.
And in recent decades, it has devoted billions of dollars to attract more women and members of other underrepresented groups into the Stem fields – science, technology, engineering and mathematics.
On 18 April, though, the research community that depends on the agency had a nasty shock. Henceforth, the NSF announced, those latter initiatives were “no longer aligned with its priorities” and it was terminating any existing grant designed to improve the demographics of the scientific workforce.
Oh, and grants related to “misinformation/disinformation” were also being axed because that kind of research “could be used to infringe on the constitutionally protected speech rights of American citizens across the US in a manner that advances a preferred narrative about significant matters of public debate”.
So if you were a researcher thinking of applying for funding to estimate the percentage of AI-powered bots now operating on X (née Twitter), forget it…
Enjoy the weekend. Normal service resumes on Monday.
John