The big tent
The Schlumberger Centre in West Cambridge at dusk yesterday. It’s an oil exploration research lab on my cycle-path to and from college. It was designed by Hopkins to “foster interactions between scientists within its laboratories, workshops and office areas”.
Rather than relegate the noisy drilling-rig test station to a less prominent location, the architects placed this main 24m-wide workshop at the heart of the building, overlooked on either side by acoustically insulated laboratories facing inwards. These single-storey wings are flanked by individual scientists’ rooms facing outwards over the Fens landscape.
The roof is made of Teflon-coated glass fibre, suspended on a network of cables by four suspension bridge-like structures. It was built in 1992 and has withstood the elements rather well.
Quote of the Day
“Novel-writing is a highly skilled and laborious trade. One does not just sit behind a screen jotting down other people’s conversation. One has for one’s raw material every single thing one has ever seen or heard or felt, and one has to go over that vast, smoldering rubbish-heap of experience, half stifled by fumes and dust, scraping and delving until one finds a few discarded valuables. Then one has to assemble these tarnished and dented fragments, polish them, set them in order, and try to make a coherent and significant arrangement of them.”
- Evelyn Waugh
Musical alternative to the morning’s radio news
John Field | Nocturne No. 9 in E Minor, H. 46 | Alice Sara Ott
Long Read of the Day
Reasons for pessimism in Europe — Crooked Timber
The title of this essay by Chris Bertram on the Crooked Timber blog says it all; the piece itself makes the case in an elegant and restrained way.
Those of us who live in Europe have reason to be very pessimistic about the next four years. The state that Europeans have relied upon as their security guarantee is now in the hands of the nationalist extreme right and the information space is saturated by the output of tech oligarchs such as Elon Musk who are either aligned with or beholden to that nationalist right and who openly fantasize about replacing elected European governments. These pressures come on top of military aggression from Russia in Ukraine and elsewhere, austerity in public services, increased energy costs, stagnant living standards, a difficult green transition, demographic decline, and anxiety about immigration and cultural diversity. Most of these pressures are likely to be deliberately worsened by the incoming Trump regime in the hope of having its ideological allies come to power in European countries. In fact the very same figures who vaunted the importance of national sovereignty are salivating at the prospect of a great power interfering to their benefit in domestic affairs: so much for patriotism!
Resistance will be hampered on several fronts…
It makes me think of a motto which I attribute (perhaps wrongly) to Gramsci — that what we need is “pessimism of the intellect and optimism of the will”.
But underneath the piece is a less bleak comment.
I consider myself to be a pretty pessimistic guy, who usually thinks things won’t work out well and are at least as likely to get worse than to get better, but I think this is probably a bit too pessimistic. For all this to happen, a bunch of people who are not that smart, not that organized, are hard to get along with, and have other serious problems would have to have a lot of things go right for them. That might happen! But, I think the above is close to an absolute worst-case scenario, and the more likely outcome, while bad, is less bad than this…
Who knows? We won’t find out for quite a while, though. So maybe what we liberals need just now is realism of the intellect and optimism of the will?
My commonplace booklet
DeepSeek has just released a new Large Language Model (LLM), DeepSeek-V3, on Hugging Face. It apparently performs close to other leading models but required only about a tenth of the computing power to train. Impressive, eh?
Here’s how the designers introduce it:
We present DeepSeek-V3, a strong Mixture-of-Experts (MoE) language model with 671B total parameters with 37B activated for each token. To achieve efficient inference and cost-effective training, DeepSeek-V3 adopts Multi-head Latent Attention (MLA) and DeepSeekMoE architectures, which were thoroughly validated in DeepSeek-V2. Furthermore, DeepSeek-V3 pioneers an auxiliary-loss-free strategy for load balancing and sets a multi-token prediction training objective for stronger performance. We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens, followed by Supervised Fine-Tuning and Reinforcement Learning stages to fully harness its capabilities. Comprehensive evaluations reveal that DeepSeek-V3 outperforms other open-source models and achieves performance comparable to leading closed-source models. Despite its excellent performance, DeepSeek-V3 requires only 2.788M H800 GPU hours for its full training. In addition, its training process is remarkably stable. Throughout the entire training process, we did not experience any irrecoverable loss spikes or perform any rollbacks.
Of course this will be incomprehensible to any non-technical reader — not just because of all the jargon and acronyms, but also because it assumes a conceptual grasp of how LLMs are created. But it’s an impressive example of good technical writing: compressed, efficient and informative.
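For what it’s worth, the key to those headline numbers (“671B total parameters with 37B activated for each token”) is the Mixture-of-Experts idea: a small “router” chooses, for each token, a handful of expert sub-networks to run, so most of the model’s parameters sit idle on any given token. Here’s a minimal, illustrative sketch of top-k expert routing in PyTorch; the names and sizes are mine, purely for illustration, and this is not DeepSeek’s actual code.

```python
# Minimal sketch of Mixture-of-Experts routing (illustrative only, not DeepSeek's code).
# A router scores the experts for each token and only the top-k experts run,
# so only a fraction of the total parameters is used per token.
import torch
import torch.nn as nn

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)   # one score per expert, per token
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                              # x: (tokens, d_model)
        scores = self.router(x)                        # (tokens, n_experts)
        weights, chosen = scores.softmax(dim=-1).topk(self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalise chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e            # tokens sent to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Eight experts exist, but each token passes through only two of them.
layer = TinyMoELayer()
tokens = torch.randn(16, 64)
print(layer(tokens).shape)     # torch.Size([16, 64])
```

DeepSeek-V3 adds further refinements on top of this basic pattern (the “auxiliary-loss-free” load balancing and multi-token prediction mentioned in the abstract), but the routing idea is what lets 671B parameters cost only 37B-parameters’ worth of compute per token.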
Linkblog
Something I noticed, while drinking from the Internet firehose.
- Terrific conversation between Ian Hislop and Andrew Marr about Elon Musk Link
This Blog is also available as an email three days a week. If you think that might suit you better, why not subscribe? One email on Mondays, Wednesdays and Fridays delivered to your inbox at 6am UK time. It’s free, and you can always unsubscribe if you conclude your inbox is full enough already!