Facebook’s sudden attack of modesty

One of the most illuminating things you can do as a researcher is to go into Facebook not as a schmuck (i.e. user) but as an advertiser — just like your average Russian agent. Upon entering, you quickly begin to appreciate the amazing ingenuity and comprehensiveness of the machine that Zuckerberg & Co have constructed. It’s utterly brilliant, with a great user interface and lots of automated advice and help for choosing your targeted audience.

When doing this a while back — a few months after Trump’s election — I noticed that there was a list of case studies of different industries showing how effective a given targeting strategy could be in a particular application. One of those ‘industries’ was “Government and Politics” and among the case studies was a story of how a Facebook campaign had proved instrumental in helping a congressional candidate to win against considerable odds. I meant to grab some screenshots of this uplifting tale, but of course forgot to do so. When I went back later, the case study had, well, disappeared.

Luckily, someone else had the presence of mind to grab a screenshot. The Intercept, bless it, has the before-and-after comparison shown in the image above. They are Facebook screenshots from (left) June 2017 and (right) March 2018.

Interesting, n’est-ce pas?

What Facebook is for

From the Columbia Journalism Review:

Digital-journalism veteran David Cohn has argued that the network’s main purpose is not information so much as it is identity, and the construction by users of a public identity that matches the group they wish to belong to. This is why fake news is so powerful.

“The headline isn’t meant to inform somebody about the world,” wrote Cohn, a senior director at Advance Publications, which owns Condé Nast and Reddit. “The headline is a tool to be used by a person to inform others about who they are. ‘This is me,’ they say when they share that headline. ‘This is what I believe. This shows what tribe I belong to.’ It is virtue signaling.”

Twitter suffers from a similar problem, in the sense that many users seem to see their posts as a way of displaying (or arguing for) their beliefs rather than a way of exchanging verifiable news. But Facebook’s role in the spread of misinformation is orders of magnitude larger than Twitter’s: 2 billion monthly users versus 330 million.

Theresa May’s pious hopes for Facebook

This morning’s Observer column:

It has taken an age, but at last politicians seem to be waking up to the societal problems posed by the dominance of certain tech firms – notably Facebook, Twitter and Google – and in particular the way they are allowing their users to pollute the public sphere with extremist rhetoric, hate speech, trolling and multipurpose abusiveness.

The latest occupant of the “techlash” bandwagon is Theresa May, who at the time of writing was still the UK’s prime minister…

Read on

President Zuck

There’s a fascinating article in The Verge based on an interview with Travis McGinn, an opinion pollster who was hired by Facebook to lead an ongoing polling operation to track minute changes in public perceptions of the company’s founder and CEO, Mark Zuckerberg.

“It was a very unusual role,” McGinn says. “It was my job to do surveys and focus groups globally to understand why people like Mark Zuckerberg, whether they think they can trust him, and whether they’ve even heard of him. That’s especially important outside of the United States.”

McGinn tracked a wide range of questions related to Zuckerberg’s public perception. “Not just him in the abstract, but do people like Mark’s speeches? Do they like his interviews with the press? Do people like his posts on Facebook? It’s a bit like a political campaign, in the sense that you’re constantly measuring how every piece of communication lands. If Mark’s doing a barbecue in his backyard and he hops on Facebook Live, how do people respond to that?”

Facebook worked to develop an understanding of Zuckerberg’s perception that went beyond simple “thumbs-up” or “thumbs-down” metrics, McGinn says. “If Mark gives a speech and he’s talking about immigration and universal health care and access to equal education, it’s looking at all the different topics that Mark mentions and seeing what resonates with different audiences in the United States,” he says. “It’s very advanced research.”

Well, well. When was the last time a corporation devoted that kind of resource to determining how the great unwashed perceive its CEO? And, since nothing strategic happens at Facebook without the boss’s say-so, what does it tell us about Zuckerberg’s delusions about himself?

“Facebook is Mark, and Mark is Facebook,” McGinn says.

“Mark has 60 percent voting rights for Facebook. So you have one individual, 33 years old, who has basically full control of the experience of 2 billion people around the world. That’s unprecedented. Even the president of the United States has checks and balances. At Facebook, it’s really this one person.”

McGinn claimed that he joined Facebook “hoping to have an impact from the inside.”

“I thought, here’s this huge machine that has a tremendous influence on society, and there’s nothing I can do as an outsider. But if I join the company, and I’m regularly taking the pulse of Americans to Mark, maybe, just maybe that could change the way the company does business. I worked there for six months and I realized that even on the inside, I was not going to be able to change the way that the company does business. I couldn’t change the values. I couldn’t change the culture. I was probably far too optimistic.”

This sounds extraordinarily naive of McGinn. Didn’t he understand the business model on which the company is based?

Why Facebook has abandoned news for the important business of trivia

Today’s Observer column:

Connoisseurs of corporate cant have a new collector’s item: Mark Zuckerberg’s latest Epistle to his Disciples. “We built Facebook,” it begins, “to help people stay connected and bring us closer together with the people that matter to us. That’s why we’ve always put friends and family at the core of the experience. Research shows that strengthening our relationships improves our wellbeing and happiness.”

Quite so. But all is not well, it seems. “Recently,” continues Zuck, sorrowfully, “we’ve gotten feedback from our community that public content – posts from businesses, brands and media – is crowding out the personal moments that lead us to connect more with each other.”

Well, well. How did this happen? Simple: it turns out that “video and other public content have exploded on Facebook in the past couple of years. Since there’s more public content than posts from your friends and family, the balance of what’s in news feed has shifted away from the most important thing Facebook can do – help us connect with each other.”

Note the impersonality of all this. Somehow, this pestilential content has “exploded” on Facebook. Which is odd, is it not, given that nothing appears in a user’s news feed that isn’t decided by Facebook?

Read on

Facebook’s new gateway drug for kids

This morning’s Observer column:

In one of those coincidences that give irony a bad name, Facebook launched a new service for children at the same time that a moral panic was sweeping the UK about the dangers of children using live-streaming apps that enable anyone to broadcast video directly from a smartphone or a tablet. The BBC showed a scary example of what can happen. A young woman who works as an internet safety campaigner posed as a 14-year-old girl to find out what occurs when a young female goes online using one of these streaming services…

Read on

Facebook’s biggest ethical dilemma: unwillingness to acknowledge that it has one

There are really only two possible explanations for the crisis now beginning to engulf Facebook. One is that the company’s founder was — and perhaps still is — a smart but profoundly naive individual who knows little about the world or about human behaviour. The other is that he is — how shall I put it? — a sociopath, indifferent to what happens to people so long as his empire continues to grow.

I prefer the former explanation, but sometimes one wonders…

Consider Free Basics — the program to bring Internet access to millions of people in poor countries. It works by having Facebook pre-installed on cheap smartphones, together with deals with local mobile networks under which traffic to the Facebook app incurs no data charges.

The cynical interpretation of this is that it’s a way of furthering Zuckerberg’s goal of replacing the Internet with Facebook, creating the ultimate global walled garden. The charitable spin is the one Zuckerberg himself put on it — that Free Basics provides a way to connect people who would otherwise never go online.

Either way, the effects were predictable: new users in these countries think that Facebook is the Internet; and Facebook becomes the major channel for news. The NYT has a sobering report on what happened in Myanmar, where Facebook now has millions of users.

“Facebook has become sort of the de facto internet for Myanmar,” said Jes Kaliebe Petersen, chief executive of Phandeeyar, Myanmar’s leading technology hub that helped Facebook create its Burmese-language community standards page. “When people buy their first smartphone, it just comes preinstalled.”

But since the company took no editorial responsibility for what people used its service for, when it transpired that it was being used to stir up ethnic hatred and worse, it seemed unable to spot what was happening. “Facebook”, reports the Times,

has become a breeding ground for hate speech and virulent posts about the Rohingya. And because of Facebook’s design, posts that are shared and liked more frequently get more prominent placement in feeds, favoring highly partisan content in timelines.

Ashin Wirathu, the monk, has hundreds of thousands of followers on Facebook accounts in Burmese and English. His posts include graphic photos and videos of decaying bodies that Ashin Wirathu says are Buddhist victims of Rohingya attacks, or posts denouncing the minority ethnic group or updates that identify them falsely as “Bengali” foreigners.

It’s the same story as everywhere else that Facebook has touched. A company that built a money-making advertising machine which gets its revenues from monetising user activity finds that sometimes that activity is very unsavoury and inhumane. And when this is finally realised, it finds itself caught between a rock and a hard place, unwilling to accept responsibility for the unintended consequences of its wealth-generating machine.