“Those who predict the future we call futurists. Those who know when the future will happen we call billionaires”.
Horace Dediu (@asymco)
My longish essay on ways of thinking about the Internet — in today’s Observer:
So we find ourselves living in this paradoxical world, which is both wonderful and frightening. Social historians will say that there’s nothing new here: the world was always like this. The only difference is that we now experience it 24/7 and on a global scale. But as we thrash around looking for a way to understand it, our public discourse is depressingly Manichean: tech boosters and evangelists at one extreme; angry technophobes at the other; and most of us somewhere in between. Small wonder that Manuel Castells, the great scholar of cyberspace, once described our condition as that of “informed bewilderment”.
One way of combating this bewilderment is to look for metaphors. The idea of the net as a mirror held up to human nature is one. But recently people have been looking for others. According to IT journalist Sean Gallagher, the internet ‘looks a lot’ like New York of the late 70s: ‘There is a cacophony of hateful speech, vice of every kind… and policemen trying to keep a lid on all of it’…
“Into the face of the young man … who sat on the terrace of the Hotel Magnifique at Cannes there had crept a look of furtive shame, the shifty hangdog look which announces that an Englishman is about to speak French.”
Thus P.G. Wodehouse in The Luck of the Bodkins — one of the quotations in the Observer’s scoop that the great man’s archive is coming to the British Library. Hooray!
This morning’s Observer column:
It’s interesting how particular years acquire historical significance: 1789 (the French Revolution); 1914 (outbreak of the first world war); 1917 (the Russian revolution); 1929 (the Wall Street crash); 1983 (the switching-on of the internet); 1993 (the Mosaic Web browser, which started the metamorphosis of the internet from geek sandpit to the nervous system of the planet). And of course 2016, the year of Brexit and Trump, the implications of which are, as yet, unknown.
But what about 2007? That was the year when Slovenia adopted the euro, Bulgaria and Romania joined the EU, Kurt Vonnegut died, smoking in enclosed public places was banned in the UK, a student shot 32 people dead and wounded 17 others at Virginia Tech, Luciano Pavarotti died and Benazir Bhutto was assassinated. Oh – and it was also the year that Steve Jobs launched the Apple iPhone.
And that, I suspect, is the main – perhaps the only – reason that 2007 will be counted as a pivotal year, because it was the moment that determined how the internet would evolve…
“had 174,808 inhabitants in 1951. By 2015, the number had dropped to 56,072. That’s about 2,000 fewer residents than Venice had in the aftermath of the plague of 1348. Maybe the ancient records can’t be trusted. But you get the idea.”
I do. Better book that holiday now.
Jack Shafer is right: the moral panic about fake news on social media — especially Facebook — looks like becoming serious. But, he warns, the cure (Zuckerberg becoming the world’s censor) would be worse than the disease.
Already, otherwise intelligent and calm observers are cheering plans set forth by Facebook’s Mark Zuckerberg to censor users’ news feeds in a fashion that will eliminate fake news. Do we really want Facebook exercising this sort of top-down power to determine what is true or false? Wouldn’t we be revolted if one company owned all the newsstands and decided what was proper and improper reading fare?
Once established to crush fake news, the Facebook mechanism could be repurposed to crush other types of information that might cause moral panic. This cure for fake news is worse than the disease.
As we applaud Facebook’s decision to blue-pencil the News Feed, we need to ask why fake news exists and—as I previously wrote—why it has existed for centuries…
Good question. And the answer:
The audience for fake news resembles the crowds who pay money to attend magic shows. Magic-show patrons know going in that some of what they’re going to see is genuine. But they also know that a good portion of what they’re going to see is going to look real but be phony. Like a woman sawed in half. Or an act of levitation. Being shown something fantastical that is almost true brings delight to almost everybody. People like to be fooled…
Spot on. That’s why millions of people in the UK pay good money every day to buy the Sun and (worse) the Daily Express. It also partly explains why they liked Trump. Sad but true fact about human nature. Or, as Shafer puts it, “Deep in the brain exists a hungry lobe that loves to be deceived.” Sigh.
When it dawned on me in August that Trump could conceivably pull it off, a book came to mind — Philip Roth’s The Plot Against America, which is an imaginative speculation on what would have happened if Charles Lindbergh, aviator hero and Nazi sympathiser, had beaten FDR to the Presidency in 1940. So I downloaded and read it. The novel chronicles the fortunes of the (Jewish) Roth family as anti-semitism becomes more mainstream during Lindbergh’s tenure of office. It’s imaginative and clever and persuasive. But then I forgot about it as the election campaign proceeded.
A few days before the election, I noticed a colleague smugly brandishing another book — this time Sinclair Lewis’s It Can’t Happen Here — a cautionary tale about the fragility of democracy which describes how easily fascism could take hold in America. When I checked on Amazon I discovered that my colleague was clearly not the only person with a premonition — Amazon had run out of stocks of the volume.
And now I find that another book — this time by a famous philosopher — seems eerily prescient. It’s Richard Rorty’s Achieving Our Country: Leftist Thought in Twentieth-Century America, excerpts from which have been going viral across the Net. For example:
The nonsuburban electorate will decide that the system has failed and start looking around for a strongman to vote for – someone willing to assure them that, once he is elected, the smug bureaucrats, the tricky lawyers, overpaid bond salesmen, and postmodernist professors will no longer be calling the shots.
One thing that is very likely to happen is that the gains made in the past forty years by black and brown Americans, and by homosexuals, will be wiped out. Jocular contempt for women will come back into fashion. The words “n—-r” and “kike” will once again be heard in the workplace. All the resentment which badly educated Americans feel about having their manners dictated to them by college graduates will find an outlet.
Rorty wrote that in 1998.
As the world becomes increasingly driven by algorithms that are, effectively, ‘black boxes’, issues of responsibility, liability and accountability are becoming acute. Two researchers — Nicholas Diakopoulos of the University of Maryland, College Park and Sorelle Friedler from Data & Society are proposing five principles that might be helpful. They are:
Responsibility. “For any algorithmic system, there needs to be a person with the authority to deal with its adverse individual or societal effects in a timely fashion. This is not a statement about legal responsibility but, rather, a focus on avenues for redress, public dialogue, and internal authority for change”.
Explainability. “Any decisions produced by an algorithmic system should be explainable to the people affected by those decisions. These explanations must be accessible and understandable to the target audience; purely technical descriptions are not appropriate for the general public.”
Accuracy. “The principle of accuracy suggests that sources of error and uncertainty throughout an algorithm and its data sources need to be identified, logged, and benchmarked.”
Auditability. “The principle of auditability states that algorithms should be developed to enable third parties to probe and review the behavior of an algorithm… While there may be technical challenges in allowing public auditing while protecting proprietary information, private auditing (as in accounting) could provide some public assurance.”
Fairness. “All algorithms making decisions about individuals should be evaluated for discriminatory effects. The results of the evaluation and the criteria used should be publicly released and explained.”
Not rocket science, but useful. What I like about this work is that it adds value. We all know by now that algorithmic decision-making is problematic. The next step is to figure out what to do about it, given that algorithms are here to stay.
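To see how several of these principles might look in practice, here is a minimal, purely illustrative Python sketch (all names and the threshold rule are hypothetical, not drawn from the researchers’ work) of a decision function instrumented for explainability, accuracy logging, auditability and a simple fairness check:

```python
from datetime import datetime, timezone

class AuditedDecision:
    """Wraps a scoring rule so that every decision is logged (auditability),
    carries a plain-language reason (explainability), and records the inputs
    used, so errors can be traced to their sources (accuracy)."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        # In practice this would be an append-only store that a
        # third-party auditor could review.
        self.audit_log = []

    def decide(self, applicant_id, score):
        approved = score >= self.threshold
        # An explanation aimed at the person affected, not a technical dump.
        if approved:
            reason = f"score {score:.2f} met the threshold of {self.threshold:.2f}"
        else:
            reason = f"score {score:.2f} fell below the threshold of {self.threshold:.2f}"
        self.audit_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "applicant_id": applicant_id,
            "inputs": {"score": score},
            "decision": "approved" if approved else "declined",
            "reason": reason,
        })
        return approved, reason

    def approval_rate(self, group_ids):
        """Fairness check: the share of a named group that was approved."""
        decisions = [r for r in self.audit_log if r["applicant_id"] in group_ids]
        if not decisions:
            return None
        return sum(r["decision"] == "approved" for r in decisions) / len(decisions)
```

Comparing `approval_rate` across demographic groups is the kind of evaluation the fairness principle asks to be publicly released; the log itself is what a private auditor would probe.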
The Investigatory Powers Act has passed through Parliament and will soon be law. It provides the UK intelligence agencies and police with what the Guardian‘s Ewen MacAskill described as “the most sweeping surveillance powers in the western world” and it passed into law with “barely a whimper, meeting only token resistance over the past 12 months from inside parliament and barely any from outside”. The Bill’s relatively serene passage through the legislature surprised many in government, and was probably partly due to the fact that the Labour party, under Jeremy Corbyn, seems largely uninterested in its responsibilities as the official opposition.
It’s not all bad news: the Act brings under explicit oversight a whole range of activities that were hitherto carried out under obscure, possibly dodgy, legal provisions and with totally inadequate oversight. So at least you could say that, at last, the activities of the secret state are all in a single piece of legislation.
On the other hand, the powers granted by the Act in relation to data retention are indeed sweeping, and include new powers to conduct what is euphemistically termed ‘Equipment Interference’ — essentially legalised hacking. Their inclusion in the Act is in effect an implicit admission that GCHQ and the security services have been doing this stuff for some time anyway.
The Act confirms that the British state’s appetite for fine-grained communications data seems insatiable and is destined to grow. Confronted with this new reality, one celebrated ex-spook once remarked that we are “a keystroke away from totalitarianism”. What he meant is that the information resources now available to states would be a godsend to an authoritarian regime that wasn’t restrained by constitutional niceties, civil liberties or human rights.
When one puts this point to spooks and government officials, however, their instinctive response is to pooh-pooh the idea. It may be technically true, they say, but — come on! — we live in a democracy and the chances of an authoritarian bully gaining power in such a polity are, well, infinitesimal.
Well, that was then and this is now. An authoritarian bully with no apparent respect for the rule of law will become president of the United States on January 20. Given that the British state has a long history of close co-operation with the US national security state, it’s possible that the new powers conferred on British agencies by the Investigatory Powers Act might mean that personal data on British subjects will be slipping noiselessly into the computerised maw of President Trump’s newly energised security services. If this country had a functioning parliamentary opposition maybe Mrs May’s Bill would have had a rougher passage, and the Act would have been less sweeping. But the opportunity to rein in the surveillance state has been missed for a generation.
Bruno Latour has an interesting essay on the implications of Brexit and Trump. His main point is that both Trump and Clinton were selling utopian fantasies: one about reversing globalisation, the other about keeping it going but with a better bedside manner. Neither is now plausible because the earth cannot take it any more. “Our incapacity to foresee”, Latour writes,
has been the main lesson of this cataclysm: how could we have been so wrong? All the polls, all the newspapers, all the commentators, the entire intelligentsia. It is as if we had completely lacked any means of encountering those whom we struggled even to name: the “uneducated white men,” the ones that “globalization left behind”; some even tried calling them “deplorables.”
There’s no question that those people are out there, but we have utterly failed to hear their voices, let alone represent them. Despite having spent the past six weeks at American universities, I have yet to hear a single account of those “other people” that is realistic enough to truly unsettle us. They are, it seems, just as invisible, inaudible, and incomprehensible as the Barbarians outside the gates of Athens. We, the “intellectuals,” live in a bubble — or, perhaps better, on an archipelago amid a sea of discontents.
The real tragedy, though, is that the others live in a bubble, too: a world of the past completely undisturbed by climate change, a world that no fact, study, or science can shake. After all, they swallowed all the lies of the calls to restore an old order with perfect enthusiasm, while the alarm bells of the fact-checkers went on ringing unheard. A Trump goes on lying and cheating without remorse, and what a pleasure it is to be misled. We can’t expect them to play the roles of good, common-sense people, with their feet planted firmly on the ground. Their ideals are even more illusory than ours.
We thus find ourselves with our countries split in two, each half becoming ever less capable of grasping its own reality, let alone the other side’s. The first half — let us call them the globalized — believe that the horizon of emancipation and modernity (often confused with the reign of finance) can still expand to embrace the whole planet. Meanwhile, the second half has decided to retreat to the Aventine Hill, dreaming of a return to a past world…
Nice essay, worth reading in full.