Back to the Future

“When steam power will be perfected, when, together with telegraphy and railways, it will have made distances disappear, it will not only be commodities which travel, but also ideas which will have wings. When fiscal and commercial barriers have been abolished between different states, as they have already been between the provinces of the same state; when different countries, in daily relations, tend toward the unity of peoples, how will you be able to revive the old mode of separation?”

François-René de Chateaubriand, 1841

Voodoo economics 2.0

Well, well. Here we go again. From the Boston Globe:

WASHINGTON — A white cloth napkin, now displayed in the National Museum of American History, helped change the course of modern economics. On it, the economist Arthur Laffer in 1974 sketched a curve meant to illustrate his theory that cutting taxes would spur enough economic growth to generate new tax revenue.

More than 40 years after those scribblings, President Donald Trump is reviving the so-called Laffer curve as he is set to announce the broad outlines of a tax overhaul on Wednesday. What the first President George Bush once called “voodoo economics” is back, as Trump’s advisers argue that deep cuts in corporate taxes will ultimately pay for themselves with an explosion of new business and job creation.

Wikipedia says:

The Laffer curve postulates that no tax revenue will be raised at the extreme tax rates of 0% and 100% and that there must be at least one rate which maximizes government taxation revenue. The Laffer curve is typically represented as a graph which starts at 0% tax with zero revenue, rises to a maximum rate of revenue at an intermediate rate of taxation, and then falls again to zero revenue at a 100% tax rate. The shape of the curve is uncertain and disputed.

One implication of the Laffer curve is that increasing tax rates beyond a certain point will be counter-productive for raising further tax revenue…
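To see the bare logic of the claim, here is a toy sketch (the functional form, the base figure and the 50% peak are purely illustrative assumptions, not the real and much-disputed curve):

```python
# A purely illustrative toy Laffer curve: revenue is zero at 0% and 100%
# tax rates and peaks somewhere in between. The functional form and the
# numbers are made up for illustration; the real shape is disputed.

def toy_revenue(rate, taxable_base=100.0):
    """Revenue raised at a given tax rate (0.0 to 1.0), in arbitrary units.

    The taxable base is assumed to shrink as the rate rises (people earn,
    report or invest less), modelled here as a crude linear decline.
    """
    shrinking_base = taxable_base * (1.0 - rate)   # activity falls as rates rise
    return rate * shrinking_base                   # revenue = rate x remaining base

if __name__ == "__main__":
    for pct in range(0, 101, 10):
        rate = pct / 100.0
        print(f"{pct:3d}% tax -> revenue {toy_revenue(rate):6.1f}")
    # With this toy form the peak happens to sit at 50%; the whole argument
    # over the Laffer curve is about where (and whether) such a peak exists.
```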

As the Globe observes:

what the president has called a tax reform plan is looking more like a tax cut plan, showering taxpayers with rate reductions without offsetting the full cost by closing loopholes or raising taxes elsewhere. In the short run, such a plan would add many billions of dollars to the national deficit. Trump contends that it will be worth it in the long run.

“The tax plan will pay for itself with economic growth,” Steven Mnuchin, the Treasury secretary and main architect of the plan, told reporters this week.

Questions: does any serious economist believe this? And isn’t it interesting that the proposed tax cuts will — coincidentally — benefit the Trump family and its subsidiaries?

Why fake news will be hard to fix — it’s the users, stoopid

Here’s a telling excerpt from a fine piece about Facebook by Farhad Manjoo:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth. And it is a particular kind of truth: The News Feed team’s ultimate mission is to figure out what users want — what they find “meaningful,” to use Cox and Zuckerberg’s preferred term — and to give them more of that.

This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?” But it is precisely this ideal that conflicts with attempts to wrangle the feed in the way press critics have called for. The whole purpose of editorial guidelines and ethics is often to suppress individual instincts in favor of some larger social goal. Facebook finds it very hard to suppress anything that its users’ actions say they want. In some cases, it has been easier for the company to seek out evidence that, in fact, users don’t want these things at all.

Facebook’s two-year-long battle against “clickbait” is a telling example. Early this decade, the internet’s headline writers discovered the power of stories that trick you into clicking on them, like those that teasingly withhold information from their headlines: “Dustin Hoffman Breaks Down Crying Explaining Something That Every Woman Sadly Already Experienced.” By the fall of 2013, clickbait had overrun News Feed. Upworthy, a progressive activism site co-founded by Eli Pariser, the author of “The Filter Bubble,” that relied heavily on teasing headlines, was attracting 90 million readers a month to its feel-good viral posts.

If a human editor ran News Feed, she would look at the clickbait scourge and make simple, intuitive fixes: Turn down the Upworthy knob. But Facebook approaches the feed as an engineering project rather than an editorial one. When it makes alterations in the code that powers News Feed, it’s often only because it has found some clear signal in its data that users are demanding the change. In this sense, clickbait was a riddle. In surveys, people kept telling Facebook that they hated teasing headlines. But if that was true, why were they clicking on them? Was there something Facebook’s algorithm was missing, some signal that would show that despite the clicks, clickbait was really sickening users?

If you want to understand why fake news will be a hard problem to crack, this is a good place to start.

Google’s new power-grab

Google’s Chrome browser is popular worldwide. And it turns out that many of its users don’t like ads — which is very naughty of them in an ad-based universe. But now there are rumours that Google plans to incorporate some kind of blocking of “unacceptable” ads into its browser. Which of course might be welcome to many users. But it would also make Google the arbiter of what is “unacceptable”.

Source

Hypocrisy on stilts

Now here’s something you couldn’t make up — unless you have plumbed the depths of surveillance capitalism. Unroll.me is a ‘service’ that promises to help you clean up your inbox. You give it permission to access your Gmail, for example, and: “Instantly see a list of all your subscription emails. Unsubscribe easily from whatever you don’t want.”

Unroll.me is owned by an analytics outfit called Slice Intelligence. And last week the New York Times (in a profile of Uber’s controversial boss, Travis Kalanick) revealed that Unroll was collecting its subscribers’ emailed Lyft receipts from their inboxes and selling the anonymized data to Uber — which used the data as a proxy for the health of its competitor’s business.

Embarrassing, eh? Not at all. Unroll’s boss, Jojo Hedaya, has published a post on the company blog under the headline “We Can Do Better”. “Our users are the heart of our company and service”, it begins,

So it was heartbreaking to see that some of our users were upset to learn about how we monetize our free service.

And while we try our best to be open about our business model, recent customer feedback tells me we weren’t explicit enough.

Note (i) “heartbreaking” and (ii) “recent customer feedback”. Translation: (i) disastrous; (ii) good investigative journalism by the New York Times.

Crocodile tears having been duly shed, Jojo continues:

So we need to do better for our users, and will from this point forward, with clearer messaging on our website, in our app, and in our FAQs. We will also be more clear about our data usage in our on-boarding process. The rest will remain the same: providing a killer service that gives you hours back in your day while protecting your privacy and security above all else.

I can’t stress enough the importance of your privacy. We never, ever release personal data about you. All data is completely anonymous and related to purchases only. To get a sense of what this data looks like and how it is used, check out the Slice Intelligence blog.

Thank you for being such an important part of our company. If there’s more we can be doing better, please let me know.

George Orwell would have really enjoyed this. Schmucks are “such an important part of our company”, for example. And he “can’t stress enough” the importance of said schmucks’ privacy.

But — as Charles Arthur points out — there’s nothing in Jojo’s FAQs about selling the data.

Facebook: the Psychopaths ‘R Us channel

Yesterday’s Observer column:

The old adage “be careful what you wish for” comes to mind. A while back, Facebook launched Facebook Live, a service that enables its users to broadcast live video to the world. Shortly after the service was activated, the company’s founder and CEO, Mark Zuckerberg, said that the service would support all the “personal and emotional and raw and visceral” ways that people communicate. Users were encouraged to “go live” in casual settings – waiting for baggage at the airport, for example, or eating at a restaurant.

Note the phrase “raw and visceral”. Facebook Live has already broadcast a live stream of a young disabled man being tied up, gagged and attacked with a knife. In March, two Chicago teenage boys live-streamed themselves gang-raping a teenage girl. And around 40 Facebook users watched the video without reporting it either to Facebook or the police.

That’s pretty raw and visceral, you might think. But it turns out that it was just a prelude…

Read on