“The problem is our brains are intuitively suited to the sorts of risk management decisions endemic to living in small family groups in the East African highlands in 100,000 BC, and not to living in the New York City of 2008.”
- Bruce Schneier
Announcing their decision to step down from directly managing the company they created, the Google co-founders wrote:
With Alphabet now well-established, and Google and the Other Bets operating effectively as independent companies, it’s the natural time to simplify our management structure. We’ve never been ones to hold on to management roles when we think there’s a better way to run the company. And Alphabet and Google no longer need two CEOs and a President. Going forward, Sundar will be the CEO of both Google and Alphabet. He will be the executive responsible and accountable for leading Google, and managing Alphabet’s investment in our portfolio of Other Bets. We are deeply committed to Google and Alphabet for the long term, and will remain actively involved as Board members, shareholders and co-founders. In addition, we plan to continue talking with Sundar regularly, especially on topics we’re passionate about!
John Gruber is having none of it:
This whole “Alphabet” thing is a joke. I still don’t get what they’re even trying for with it. The company is Google and we all know it. The subsidiary owns the parent and everyone knows it. No one is fooled by this. Nothing has changed regarding the goofy super-class shares that Page and Brin hold that give them complete control of the company. Google is a privately-held company that trades as a publicly-held one.
Here’s the thing that’s always rubbed me the wrong way about Google. They’re insulting. Steve Jobs, Jeff Bezos, Bill Gates — I completely believe they’re all geniuses. But they never seem(ed) condescending. Tim Cook and Satya Nadella aren’t founders but they’re both great examples of what a CEO should be: smart, honest, respectful.
Brin and Page are almost certainly smarter than you and me. But they’re not as much smarter as they think they are. Read this whole announcement through the filter of “they think we’re dumb” and it makes a lot more sense. And if they were as smart as they think they are, they’d therefore be smart enough to recognize how tone-deaf this plays.
Yep.
From Tyler Cowen of all people:
I remain a supporter of Remain, for reasons I will not recap here, but I am also a realist and I recognize that a commitment to the European Union requires a substantial commitment from the population, more than a mere fifty percent, and in the United Kingdom we do not see anything close to that. You probably know that the Tories seem to have won a smashing victory in today’s election, and by campaigning on Brexit as their main issue. And you can’t just blame Corbyn — his ascendancy and leadership were endogenous to the broader process, and getting rid of him to reverse Brexit, it turned out, was not the priority.
So do you know who looks much better in retrospect? Yes, David Cameron. After the initial referendum I heard from the usual elites the notion that Cameron committed some kind of inexplicable, aberrant error by allowing the referendum in the first place. That notion is much harder to entertain after today. Even if you are pro-Remain, we should now see that either the referendum, or something like it, was indeed a necessary step in British politics. Cameron himself saw this, and thought that a later referendum, run by an EU-hostile Tory government, could in fact go much worse than what he chanced. So it seems with hindsight that Cameron was pretty prescient, even if he did not get what he wanted.
The only flaw in that argument is its assumption that Cameron was thinking of the population as a whole, rather than of the Europhobes in his own party.
This morning’s Observer column:
One of the things that really annoys AI researchers is how supposedly “intelligent” machines are judged by much higher standards than are humans. Take self-driving cars, they say. So far they’ve driven millions of miles with very few accidents, a tiny number of them fatal. Yet whenever an autonomous vehicle kills someone there’s a huge hoo-ha, while every year in the US nearly 40,000 people die in crashes involving conventional vehicles.
Likewise, the AI evangelists complain, everybody and his dog (this columnist included) is up in arms about algorithmic bias: the way in which automated decision-making systems embody the racial, gender and other prejudices implicit in the data sets on which they were trained. And yet society is apparently content to endure the astonishing irrationality and capriciousness of much human decision-making.
If you are a prisoner applying for parole in some jurisdictions, for example, you had better hope that the (human) judge has just eaten when your case comes up…
Tony Brooker, the guy who developed Autocode, arguably the world’s first machine-independent programming language, has passed away at the age of 94. He developed it to make one of the early computers, the Ferranti Mark 1 in Alan Turing’s lab at Manchester University, usable by human beings. There’s a lovely recording of him talking about it on the British Library site. Also, a nice obit in the New York Times.
“The truth is that these companies won’t fundamentally change because their entire business model relies on generating more engagement, and nothing generates more engagement than lies, fear and outrage.”
Shortly after I wrote Building vs. Streaming, in popped an email from Drew Austin, who was musing about what happens when a new product/service fills a void and thereby leads to the decline of whatever filled it beforehand.
Here’s the money quote:
The increasingly-maligned model of VC-funded, loss-leading hypergrowth in the pursuit of market dominance, understood another way, is a quest to create voids that matter, voids that will hurt if we let them emerge by rejecting the product currently filling them (the fissures of a post-WeWork world are at least perceptible now). In the early ‘00s, when Blockbuster died out, it was clear that something better was replacing it (there’s a nostalgic counterargument that I’m tempted to indulge, but let’s just accept this). Today, it’s more common to watch something decline without a replacement that’s clearly better. It’s easy to understand why physical media led to file-sharing and then streaming, but what comes after Netflix and Spotify? Does anyone think it’s likely to be another improvement? I don’t, and the companies’ Facebook-like pursuit of absolute ubiquity is why. Unlike the immediately-filled Blockbuster void, I fear the Spotify void. I already got rid of all my CDs. The residue of buildings and cities determines what gets built on top of them, and if we’re conscientious, we’ll build with a more distant future in mind.
Dave Winer has come up with a nice metaphor for the impeachment process:
If you think of the United States as a company, we’ve had a strategic partnership with Russia for the last three years, kind of like the one Microsoft had with IBM. Russia is analogous to Microsoft. They’re about to roll over us in the 2020 election. Our last gasp is the impeachment.
[…]
Impeachment is like IBM shipping OS/2 and the Micro Channel Architecture. Both were designed to rid IBM of Microsoft once and for all. But it didn’t work. It was too little too late. Microsoft came out with Windows 3.0, and IBM became a global consulting company. The company that dominated the computer business left the computer business. With the US and Russia analogy substitute “computer business” with “democracy business.”
Ouch! Full disclosure: I was foolish enough to fall for IBM’s ploy. On a research budget I bought an IBM PS/2 computer running OS/2. It was a turkey with only one good point: a really nice keyboard!