Here’s a telling excerpt from a fine piece about Facebook by Farhad Manjoo:
The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth. And it is a particular kind of truth: The News Feed team’s ultimate mission is to figure out what users want — what they find “meaningful,” to use Cox and Zuckerberg’s preferred term — and to give them more of that.
This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?” But it is precisely this ideal that conflicts with attempts to wrangle the feed in the way press critics have called for. The whole purpose of editorial guidelines and ethics is often to suppress individual instincts in favor of some larger social goal. Facebook finds it very hard to suppress anything that its users’ actions say they want. In some cases, it has been easier for the company to seek out evidence that, in fact, users don’t want these things at all.
Facebook’s two-year-long battle against “clickbait” is a telling example. Early this decade, the internet’s headline writers discovered the power of stories that trick you into clicking on them, like those that teasingly withhold information from their headlines: “Dustin Hoffman Breaks Down Crying Explaining Something That Every Woman Sadly Already Experienced.” By the fall of 2013, clickbait had overrun News Feed. Upworthy, a progressive activism site co-founded by Eli Pariser, the author of “The Filter Bubble”, that relied heavily on teasing headlines, was attracting 90 million readers a month to its feel-good viral posts.
If a human editor ran News Feed, she would look at the clickbait scourge and make simple, intuitive fixes: Turn down the Upworthy knob. But Facebook approaches the feed as an engineering project rather than an editorial one. When it makes alterations in the code that powers News Feed, it’s often only because it has found some clear signal in its data that users are demanding the change. In this sense, clickbait was a riddle. In surveys, people kept telling Facebook that they hated teasing headlines. But if that was true, why were they clicking on them? Was there something Facebook’s algorithm was missing, some signal that would show that despite the clicks, clickbait was really sickening users?
If you want to understand why fake news will be a hard problem to crack, this is a good place to start.
Google’s Chrome browser is popular worldwide. And it turns out that many of its users don’t like ads — which is very naughty of them in an ad-based universe. But now there are rumours that Google plans to build some kind of blocking of “unacceptable” ads into the browser. Which might, of course, be welcome to many users. But it would also make Google the arbiter of what counts as “unacceptable”.
Now here’s something you couldn’t make up — unless you have plumbed the depths of surveillance capitalism. Unroll.me is a ‘service’ that promises to help you clean up your inbox. You give it permission to access your Gmail, for example, and: “Instantly see a list of all your subscription emails. Unsubscribe easily from whatever you don’t want.”
Unroll.me is owned by an analytics outfit called Slice Intelligence. And last week the New York Times (in a profile of Uber’s controversial boss, Travis Kalanick) revealed that Unroll was collecting its subscribers’ emailed Lyft receipts from their inboxes and selling the anonymized data to Uber — which used the data as a proxy for the health of its competitor’s business.
Embarrassing, eh? Not at all. Unroll’s boss, Jojo Hedaya, has published a post on the company blog under the headline “We Can Do Better”. “Our users are the heart of our company and service”, it begins,
So it was heartbreaking to see that some of our users were upset to learn about how we monetize our free service.
And while we try our best to be open about our business model, recent customer feedback tells me we weren’t explicit enough.
Note (i) “heartbreaking” and (ii) “recent customer feedback”. Translation: (i) disastrous; (ii) good investigative journalism by the New York Times.
Crocodile tears having been duly shed, Jojo continues:
So we need to do better for our users, and will from this point forward, with clearer messaging on our website, in our app, and in our FAQs. We will also be more clear about our data usage in our on-boarding process. The rest will remain the same: providing a killer service that gives you hours back in your day while protecting your privacy and security above all else.
I can’t stress enough the importance of your privacy. We never, ever release personal data about you. All data is completely anonymous and related to purchases only. To get a sense of what this data looks like and how it is used, check out the Slice Intelligence blog.
Thank you for being such an important part of our company. If there’s more we can be doing better, please let me know.
George Orwell would have really enjoyed this. Schmucks are “such an important part of our company”, for example. And he “can’t stress enough” the importance of said schmucks’ privacy.
But — as Charles Arthur points out — there’s nothing in Jojo’s FAQs about selling the data.
Yesterday’s Observer column:
The old adage “be careful what you wish for” comes to mind. A while back, Facebook launched Facebook Live, a service that enables its users to broadcast live video to the world. Shortly after the service was activated, the company’s founder and CEO, Mark Zuckerberg, said that the service would support all the “personal and emotional and raw and visceral” ways that people communicate. Users were encouraged to “go live” in casual settings – waiting for baggage at the airport, for example, or eating at a restaurant.
Note the phrase “raw and visceral”. Facebook Live has already broadcast a live stream of a young disabled man being tied up, gagged and attacked with a knife. In March, two Chicago teenage boys live-streamed themselves gang-raping a teenage girl. And around 40 Facebook users watched the video without reporting it either to Facebook or the police.
That’s pretty raw and visceral, you might think. But it turns out that it was just a prelude…
From an interesting NYT piece on how Google is coining money by allowing firms to put product information in the space immediately below the search bar.
Product ads that appeal to shoppers are also strategically important because consumers are starting their online shopping at Amazon.com. Last year, a survey of 2,000 American shoppers found that 55 percent turn to Amazon first when searching for a product, while only 28 percent start with a web search.
Much has been made in previous histories of Silicon Valley’s counter-cultural origins. Taplin finds other, less agreeable roots, notably in the writings of Ayn Rand, a flake of Cadbury proportions who had an astonishing impact on many otherwise intelligent individuals. These include Alan Greenspan, the Federal Reserve chairman who presided over events leading to the banking collapse of 2008, and [Peter] Thiel, who made an early fortune out of PayPal and was the first investor in Facebook. Rand believed that “achievement of your happiness is the only moral purpose of your life”. She had no time for altruism, government or anything else that might interfere with capitalism red in tooth and claw.
Neither does Thiel. For him, “competition is for losers”. He believes in investing only in companies that have the potential to become monopolies and he thinks monopolies are good for society. “Americans mythologise competition and credit it with saving us from socialist bread lines,” he once wrote. “Actually, capitalism and competition are opposites. Capitalism is premised on the accumulation of capital, but under perfect competition, all profits get competed away.”
The three great monopolies of the digital world have followed the Thiel playbook and Taplin does a good job of explaining how each of them works and how, strangely, their vast profits are never “competed away”. He also punctures the public image so assiduously fostered by Google and Facebook – that they are basically cool tech companies run by good chaps (and they are still mainly chaps, btw) who are hellbent on making the world a better place – whereas, in fact, they are increasingly hard to distinguish from the older brutes of the capitalist jungle…
The madness that afflicted Paris had also reached Venice. Fortunately, few of Venice’s bridges have suitable railings.