I believe what we’re seeing here is a convergence of two separate but highly charged news streams and political moments. On the one hand, you have the Russia probe, with everything tied to that investigation. On the other, you have the rising public backlash against Big Tech, the various threats it arguably poses and its outsized power in the American economy and American public life. A couple of weeks ago, I wrote that after working with Google in various capacities for more than a decade, I’d observed that Google is, institutionally, so accustomed to its customers actually being its products that when it gets into lines of business where its customers really are customers, it doesn’t know how to deal with them. There’s something comparable with Facebook.
Facebook is so accustomed to treating its ‘internal policies’ as though they were something like laws that it appears to have a blind spot that prevents it from seeing how ridiculous its resistance sounds. To use the cliche, it feels like a real jumping-the-shark moment. As someone recently observed, Facebook’s ‘internal policies’ are crafted to create the appearance of civic concern for privacy, free speech, and similar values. But they’re actually just a business model. Facebook’s ‘internal policies’ amount to a kind of Stepford Wives version of civic liberalism and speech and privacy rights: the outward form of these things is preserved while the innards have been gutted and replaced by something entirely different, an aggressive and totalizing business model which in many ways turns these norms and values on their heads. More to the point, most people find that Facebook’s ‘internal policies’ are meaningless in protecting their speech or privacy as soon as they bump up against Facebook’s business model.
Here’s a telling excerpt from a fine piece about Facebook by Farhad Manjoo:
The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth. And it is a particular kind of truth: The News Feed team’s ultimate mission is to figure out what users want — what they find “meaningful,” to use Cox and Zuckerberg’s preferred term — and to give them more of that.
This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?” But it is precisely this ideal that conflicts with attempts to wrangle the feed in the way press critics have called for. The whole purpose of editorial guidelines and ethics is often to suppress individual instincts in favor of some larger social goal. Facebook finds it very hard to suppress anything that its users’ actions say they want. In some cases, it has been easier for the company to seek out evidence that, in fact, users don’t want these things at all.
Facebook’s two-year-long battle against “clickbait” is a telling example. Early this decade, the internet’s headline writers discovered the power of stories that trick you into clicking on them, like those that teasingly withhold information from their headlines: “Dustin Hoffman Breaks Down Crying Explaining Something That Every Woman Sadly Already Experienced.” By the fall of 2013, clickbait had overrun News Feed. Upworthy, a progressive activism site co-founded by [Eli] Pariser, the author of “The Filter Bubble,” that relied heavily on teasing headlines, was attracting 90 million readers a month to its feel-good viral posts.
If a human editor ran News Feed, she would look at the clickbait scourge and make simple, intuitive fixes: Turn down the Upworthy knob. But Facebook approaches the feed as an engineering project rather than an editorial one. When it makes alterations in the code that powers News Feed, it’s often only because it has found some clear signal in its data that users are demanding the change. In this sense, clickbait was a riddle. In surveys, people kept telling Facebook that they hated teasing headlines. But if that was true, why were they clicking on them? Was there something Facebook’s algorithm was missing, some signal that would show that despite the clicks, clickbait was really sickening users?
If you want to understand why fake news will be a hard problem to crack, this is a good place to start.
The social network on Wednesday reached the latest milestones in its quest to dominate the world, topping 1.79 billion monthly visitors as of the end of September, up 16 percent from a year ago. Facebook also added a record number of new daily users and said for the first time that more than one billion people regularly used its network exclusively on their mobile device every month.
And those numbers do not even include Facebook’s other properties, such as the photo-sharing service Instagram and the messaging service WhatsApp.
Facebook’s user growth defies the usual trajectories for social media companies, which often start strong out of the gate and then sharply slow down. Twitter, which added four million new visitors last quarter, now serves a user base roughly one-sixth the size of Facebook’s. Snapchat, while popular among young users, has about 150 million daily users, about half as many as Twitter…
Many years ago, the political theorist Steven Lukes published a seminal book – Power: A Radical View. In it, he argued that power essentially comes in three varieties: the ability to compel people to do what they don’t want to do; the capability to stop them doing what they want to do; and the power to shape the way they think. This last is the kind of power exercised by our mass media. They can shape the public (and therefore the political) agenda by choosing the news that people read, hear or watch; and they can shape the ways in which that news is presented. Lukes’s “third dimension” of power is what’s wielded in this country by outlets like Radio 4’s Today programme, the Sun and the Daily Mail. And this power is real: it’s why all British governments in recent years have been so frightened of the Mail.
But as our media ecosystem has changed under the impact of the internet, new power brokers have appeared….
“Data is the new oil,” declared Clive Humby, a mathematician who was the genius behind the Tesco Clubcard. This insight was later elaborated by Michael Palmer of the Association of National Advertisers. “Data is just like crude [oil],” said Palmer. “It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analysed for it to have value.”
There was just one thing wrong with the metaphor. Oil is a natural resource; it has to be found, drilled for and pumped from the bowels of the Earth. Data, in contrast, is a highly unnatural resource. It has to be created before it can be extracted and refined. Which raises the question of who, exactly, creates this magical resource? Answer: you and me…
Power and money are the two great aphrodisiacs, and few people or institutions are immune to their attractions. Not even the Economist, a posh magazine which resolutely sees itself as floating above the vulgar ruckus of journalistic hackery. Last week, like an elderly dowager seduced by Justin Bieber, the venerable publication checked its collective brains at the door and swooned over Mark Zuckerberg, the infant prodigy who now presides over Facebook, and so possesses both power and money.
For the cover illustration, the magazine photoshopped a picture of a celebrated statue of Emperor Constantine the Great (272-337). Young Zuckerberg’s head, adorned with a wreath of gold laurel leaves, replaced Constantine’s. The sword in his left hand was replaced by a Facebook logo, and the emperor’s languidly drooping right hand was rotated 180 degrees so that it now gave the thumbs-up that is Facebook’s “like” symbol. (The gesture had a rather different interpretation in Roman times.) On the plinth of the statue were the words “MARCVS ZVCKERBERGVS” and “CONIVNGE ET IMPERA”, which is the nearest the photoshopper could get to “connect and rule”.
On inside pages one finds an editorial and a long article explaining why Marcvs Z is the greatest thing since Constantine.
On Tuesday, the European court of justice, Europe’s supreme court, lobbed a grenade into the cosy, quasi-monopolistic world of the giant American internet companies.
It did so by declaring invalid a decision made by the European commission in 2000 that US companies complying with its “safe harbour privacy principles” would be allowed to transfer personal data from the EU to the US.
This judgment may not strike you as a big deal. You may also think that it has nothing to do with you.
Wrong on both counts, but to see why, some background might be useful….
LATER This is a truly extraordinary moment. Lots of interesting and informative stuff about it on the Web, including this piece by Julia Powles and this NYT piece by Robert Levine.
So what happens next? My colleague Nóra ní Loideain has passed me this reassuring note:
Christopher Graham, UK Information Commissioner, said on 8 October at a meeting at Dentons [a law firm]: “Don’t panic. Safe Harbor is not the only route for international transfers. We are coordinating our thinking with other DPAs across the European Union.” The 28 DPAs which form the EU Art. 29 Data Protection Working Party met in their International Transfers sub-group on 8 October, and this group’s plenary will discuss the issue on Thursday this week, on 15 October.