This morning’s Observer column:
When Edward Snowden first revealed the extent of government surveillance of our online lives, the then foreign secretary, William (now Lord) Hague, immediately trotted out the old chestnut: “If you have nothing to hide, then you have nothing to fear.” This prompted replies along the lines of: “Well then, foreign secretary, can we have that photograph of you shaving while naked?”, which made us laugh, perhaps, but rather diverted us from pondering the absurdity of Hague’s remark. Most people have nothing to hide, but that doesn’t give the state the right to see them as fair game for intrusive surveillance.
During the hoo-ha, one of the spooks with whom I discussed Snowden’s revelations waxed indignant about our coverage of the story. What bugged him (pardon the pun) was the unfairness of having state agencies pilloried, while firms such as Google and Facebook, which, in his opinion, conducted much more intensive surveillance than the NSA or GCHQ, got off scot free. His argument was that he and his colleagues were at least subject to some degree of democratic oversight, but the companies, whose business model is essentially “surveillance capitalism”, were entirely unregulated.
He was right…
Lovely comment from the inestimable Dave Pell:
It’s long been described as the sharing economy. But, of course, there is little real sharing going on. The gig economy is just another way to pay people to give you a ride or rent you a room or bring you a meal. Even if the sharing economy is really the on-demand economy, does it represent a new, more worker-friendly, more altruistic version of the working life? The New Yorker’s Nathan Heller wonders: Is The Gig Economy Working? “The American workplace is both a seat of national identity and a site of chronic upheaval and shame. The industry that drove America’s rise in the nineteenth century was often inhumane. The twentieth-century corrective—a corporate workplace of rules, hierarchies, collective bargaining, triplicate forms—brought its own unfairnesses. Gigging reflects the endlessly personalizable values of our own era, but its social effects, untried by time, remain uncertain.” In a perfect version of the sharing economy, I would summarize Heller’s findings and deliver them to you in easily digestible, bite-sized chunks. But once you see the rates I charge, I have a feeling you’ll want to try TaskRabbit.
Here’s a telling excerpt from a fine piece about Facebook by Farhad Manjoo:
The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth. And it is a particular kind of truth: The News Feed team’s ultimate mission is to figure out what users want — what they find “meaningful,” to use Cox and Zuckerberg’s preferred term — and to give them more of that.
This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?” But it is precisely this ideal that conflicts with attempts to wrangle the feed in the way press critics have called for. The whole purpose of editorial guidelines and ethics is often to suppress individual instincts in favor of some larger social goal. Facebook finds it very hard to suppress anything that its users’ actions say they want. In some cases, it has been easier for the company to seek out evidence that, in fact, users don’t want these things at all.
Facebook’s two-year-long battle against “clickbait” is a telling example. Early this decade, the internet’s headline writers discovered the power of stories that trick you into clicking on them, like those that teasingly withhold information from their headlines: “Dustin Hoffman Breaks Down Crying Explaining Something That Every Woman Sadly Already Experienced.” By the fall of 2013, clickbait had overrun News Feed. Upworthy, a progressive activism site co-founded by Eli Pariser, the author of “The Filter Bubble,” that relied heavily on teasing headlines, was attracting 90 million readers a month to its feel-good viral posts.
If a human editor ran News Feed, she would look at the clickbait scourge and make simple, intuitive fixes: Turn down the Upworthy knob. But Facebook approaches the feed as an engineering project rather than an editorial one. When it makes alterations in the code that powers News Feed, it’s often only because it has found some clear signal in its data that users are demanding the change. In this sense, clickbait was a riddle. In surveys, people kept telling Facebook that they hated teasing headlines. But if that was true, why were they clicking on them? Was there something Facebook’s algorithm was missing, some signal that would show that despite the clicks, clickbait was really sickening users?
If you want to understand why fake news will be a hard problem to crack, this is a good place to start.
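Manjoo’s point about data signals can be made concrete with a toy sketch. The function below is purely a hypothetical illustration, not anything Facebook has published: it scores headlines by “bounce-back”, i.e. clicks followed by almost no reading time, which would suggest the click was regretted. The threshold and the numbers are invented for the example.

```python
# Hypothetical illustration (not Facebook's actual method): one signal that
# could reveal clickbait despite a high click-through rate is "bounce-back" --
# users click a headline but return to the feed almost immediately,
# suggesting the article did not deliver on its promise.

def clickbait_score(clicks, dwell_times, dwell_threshold=10.0):
    """Fraction of clicks whose reading time fell below a threshold (seconds).

    A high score means many clicks were 'regretted': the headline attracted
    the click, but the content was abandoned quickly.
    """
    if clicks == 0:
        return 0.0
    bounces = sum(1 for t in dwell_times if t < dwell_threshold)
    return bounces / clicks

# A teasing headline: many clicks, but most readers bail out within seconds.
teaser = clickbait_score(clicks=1000, dwell_times=[2.0] * 900 + [60.0] * 100)

# A plain headline: fewer clicks, but readers who click stay to read.
plain = clickbait_score(clicks=200, dwell_times=[45.0] * 180 + [3.0] * 20)

assert teaser > plain  # the teaser scores worse despite attracting more clicks
```

On a metric like this, the teasing headline looks worse than the plain one even though it gets five times the clicks, which is roughly the kind of evidence Manjoo describes Facebook seeking out: a quantifiable trace of the dislike that users reported in surveys.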
My Observer review of Jonathan Taplin’s Move Fast and Break Things:
Much has been made in previous histories of Silicon Valley’s counter-cultural origins. Taplin finds other, less agreeable roots, notably in the writings of Ayn Rand, a flake of Cadbury proportions who had an astonishing impact on many otherwise intelligent individuals. These include Alan Greenspan, the Federal Reserve chairman who presided over events leading to the banking collapse of 2008, and [Peter] Thiel, who made an early fortune out of PayPal and was the first investor in Facebook. Rand believed that “achievement of your happiness is the only moral purpose of your life”. She had no time for altruism, government or anything else that might interfere with capitalism red in tooth and claw.
Neither does Thiel. For him, “competition is for losers”. He believes in investing only in companies that have the potential to become monopolies and he thinks monopolies are good for society. “Americans mythologise competition and credit it with saving us from socialist bread lines,” he once wrote. “Actually, capitalism and competition are opposites. Capitalism is premised on the accumulation of capital, but under perfect competition, all profits get competed away.”
The three great monopolies of the digital world have followed the Thiel playbook and Taplin does a good job of explaining how each of them works and how, strangely, their vast profits are never “competed away”. He also punctures the public image so assiduously fostered by Google and Facebook – that they are basically cool tech companies run by good chaps (and they are still mainly chaps, btw) who are hellbent on making the world a better place – whereas, in fact, they are increasingly hard to distinguish from the older brutes of the capitalist jungle…
Jia Tolentino has a very good piece in the New Yorker about the ideology that underpins the gig economy. The piece opens with the story of Mary, a Lyft driver in Chicago who kept accepting rides even though she was nine months pregnant – and even kept going when her contractions began!
In the event, all ended well. Mary had a customer who only needed a short ride, so she was able to drive herself to hospital after dropping him off. Once there, she gave birth to a baby girl — who appears on the company blog wearing a “Little Miss Lyft” onesie.
The point of the company blog post is to laud the spirit of workers like Mary. But, writes Tolentino,
It does require a fairly dystopian strain of doublethink for a company to celebrate how hard and how constantly its employees must work to make a living, given that these companies are themselves setting the terms. And yet this type of faux-inspirational tale has been appearing more lately, both in corporate advertising and in the news. Fiverr, an online freelance marketplace that promotes itself as being for “the lean entrepreneur”—as its name suggests, services advertised on Fiverr can be purchased for as low as five dollars—recently attracted ire for an ad campaign called “In Doers We Trust.” One ad, prominently displayed on some New York City subway cars, features a woman staring at the camera with a look of blank determination. “You eat a coffee for lunch,” the ad proclaims. “You follow through on your follow through. Sleep deprivation is your drug of choice. You might be a doer.”
Quite so. Lyft drivers in Chicago earn about $11 per trip.
Perhaps, as Lyft suggests, Mary kept accepting riders while experiencing contractions because “she was still a week away from her due date,” or “she didn’t believe she was going into labor yet.” Or maybe Mary kept accepting riders because the gig economy has further normalized the circumstances in which earning an extra eleven dollars can feel more important than seeking out the urgent medical care that these quasi-employers do not sponsor. In the other version of Mary’s story, she’s an unprotected worker in precarious circumstances.
This morning’s Observer column:
And so the advertisers’ money, diverted from print and TV, cascaded into the coffers of Google and co. In 2012, Procter & Gamble announced that it would make $1bn in savings by targeting consumers through digital and social media. It has got to the point where, according to last week’s Financial Times, 2017 will be the year when advertisers spend more online than they do on TV.
Trebles all round, then? Not quite. It turns out that the advertising industry is beginning to smell a rat in this hi-tech nirvana. In a speech to the annual conference of the Internet Advertising Bureau in January, the Procter & Gamble boss, Marc Pritchard, said this: “We have seen an exponential increase in, well… crap. Craft or crap? Technology enables both and all too often the outcome has been more crappy advertising accompanied by even crappier viewing experiences… is it any wonder ad blockers are growing 40%?”
But the exponential growth in crap is not the biggest problem, he said. Much more worrying was the return of the Wanamaker problem: how many people are actually seeing these ads?
This neat formulation from a 2014 essay by Shoshana Zuboff:
We often hear that our privacy rights have been eroded and secrecy has grown. But that way of framing things obscures what’s really at stake. Privacy hasn’t been eroded. It’s been expropriated. The difference in framing provides new ways to define the problem and consider solutions.
In the conventional telling, privacy and secrecy are treated as opposites. In fact, one is a cause and the other is an effect. Exercising our right to privacy leads to choice. We can choose to keep something secret or to share it, but we only have that choice when we first have privacy. Privacy rights confer decision rights. Privacy lets us decide where we want to be on the spectrum between secrecy and transparency in each situation. Secrecy is the effect; privacy is the cause.
I suggest that privacy rights have not been eroded, if anything they’ve multiplied. The difference now is how these rights are distributed. Instead of many people having some privacy rights, nearly all the rights have been concentrated in the hands of a few. On the one hand, we have lost the ability to choose what we keep secret, and what we share. On the other, Google, the NSA, and others in the new zone have accumulated privacy rights. How? Most of their rights have come from taking ours without asking. But they also manufactured new rights for themselves, the way a forger might print currency. They assert a right to privacy with respect to their surveillance tactics and then exercise their choice to keep those tactics secret.
We need more writing like this. On the phony ‘privacy vs security’ question, for example.
As George Lakoff pointed out many years ago (but only right-wingers listened), creative framing is the way to win both arguments and votes.
My Observer piece on Yuval Noah Harari’s Homo Deus: A Brief History of Tomorrow.
Also: David Runciman’s review of the book is here.