What happens when algorithms decide what should be passed on?

One of the things we’re interested in on our research project is how rumours, news, information (and misinformation) can spread with astonishing speed across the world as a result of the Internet. Up to now I had been mostly working on the assumption that the fundamental mechanism involved is always something like the ‘retweet’ in Twitter — i.e. people coming across something that they wanted to pass on to others for whatever reason. So human agency was the key factor in the viral retransmission of memes.

But I’ve just seen an interesting article in the Boston Globe which suggests that we need to think of the ‘retweeting’ effect in wider terms.

A surprise awaited Facebook users who recently clicked on a link to read a story about Michelle Obama’s encounter with a 10-year-old girl whose father was jobless.

Facebook responded to the click by offering what it called “related articles.” These included one that alleged a Secret Service officer had found the president and his wife having “S*X in Oval Office,” and another that said “Barack has lost all control of Michelle” and was considering divorce.

A Facebook spokeswoman did not try to defend the content, much of which was clearly false, but instead said there was a simple explanation for why such stories are pushed on readers. In a word: algorithms.

The stories, in other words, apparently are selected by Facebook based on mathematical calculations that rely on word association and the popularity of an article. No effort is made to vet or verify the content.
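The Globe's description suggests a purely mechanical ranking: word association with the clicked story, weighted by raw popularity, with no check on truthfulness. As a rough illustration only (my own sketch, not Facebook's actual system, whose details are unpublished), here is how such an unvetted scorer might behave:

```python
# Illustrative sketch of a "related articles" scorer that relies only
# on word association and popularity, with no vetting of content.
# This is an assumption about the general approach, not Facebook's code.

def related_score(clicked_title, candidate_title, popularity):
    """Score a candidate: word overlap with the clicked story,
    weighted by the candidate's raw popularity."""
    clicked = set(clicked_title.lower().split())
    candidate = set(candidate_title.lower().split())
    overlap = len(clicked & candidate)
    return overlap * popularity

clicked = "Michelle Obama's encounter with a 10-year-old girl"
candidates = [
    ("Michelle Obama meets schoolchildren", 120),
    ("Barack has lost all control of Michelle", 5000),
]
ranked = sorted(
    candidates,
    key=lambda c: related_score(clicked, c[0], c[1]),
    reverse=True,
)
# The sensational (and false) story ranks first purely on popularity.
```

The point of the sketch is that once popularity dominates the score, a widely shared falsehood will always outrank an accurate but duller story; nothing in the pipeline asks whether the content is true.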

This prompted a comment from my former Observer colleague, Emily Bell, who now runs the Tow Center at Columbia. “They have really screwed up,” she told the Globe. “If you are spreading false information, you have a serious problem on your hands. They shouldn’t be recommending stories until they have got it figured out.”

She’s right, of course. A world in which algorithms decided what was ‘newsworthy’ would be a very strange place. But we might find ourselves living in such a world, because Facebook won’t take responsibility for its algorithms, any more than Google will take responsibility for YouTube videos. These companies want the status of common carriers because otherwise they have to assume legal responsibility for the messages they circulate. And having to check everything that goes through their servers is simply not feasible.

Research-led teaching and pathological paradigms

The economist Diane Coyle has an interesting article on the need to reform the way undergraduate economics is taught. This is a theme that she has been writing about ever since the 2008 crash. In her article she points to some cracks that are appearing in the hitherto impenetrable facade of the profession’s establishment — as evidenced by some interesting new initiatives by university teachers. Wendy Carlin’s intriguing project, for example, carries the subtitle: “teaching economics as if the last three decades had happened”.

“Even a relatively minimal interpretation implies a substantial amount of change in many undergraduate economics programmes,” writes Coyle.

In many universities, the core curriculum settled into a predictable rut. This interacted with two factors: (i) incentives for academic research to focus on technical increments to knowledge – contributions aimed solely at professional peers, and (ii) rising teaching loads and student numbers stemming from pressures on university finances.

Despite the great interest in reform among economists teaching undergraduate courses, change will take some time as these various barriers are overcome.

There is probably the widest agreement about changes such as:

* Re-introducing elements of economic history into core modules;
* Incorporating some issues on the frontiers of research into undergraduate teaching;
* Encouraging inter-disciplinary interest; and
* Ensuring students are taught key skills such as data handling and good communication.

I was particularly struck by her point about the factors explaining the “rut” into which undergraduate economics teaching had fallen. For several decades (perhaps longer) the ‘mathematisation’ of economics had led to the fossilisation of the profession round a Kuhnian paradigm which yielded lots of interesting intellectual puzzles but was effectively detached from the real world of finance, globalisation, computerised trading, neoliberal ideology and other phenomena. The result was the evolution of a profession that has effectively coalesced round a pathological paradigm — i.e. one that has little to do with the real-world domain to which it purportedly applies (see “The Dismal (and dangerous) Science” and Richard Posner’s strictures).

And therein lies an interesting unintended consequence — the way in which successive generations of undergraduates have been lured astray by something that all elite universities sell as their USP – the promise that kids will be taught by academics who are research leaders in their fields.

Those universities have been as good as their word. The result is that generations of kids in elite institutions have been stuffed with the fantasies emanating from the dominant, research-led economics paradigm. As my friend Geoff Harcourt pointed out in his letter to the Queen (see here again), apologists for the paradigm do not consider

“how the preference for mathematical technique over real-world substance diverted many economists from looking at the vital whole. It fails to reflect upon the drive to specialise in narrow areas of inquiry, to the detriment of any synthetic vision. For example, it does not consider the typical omission of psychology, philosophy or economic history from the current education of economists in prestigious institutions. It mentions neither the highly questionable belief in universal ‘rationality’ nor the ‘efficient markets hypothesis’ — both widely promoted by mainstream economists. It also fails to consider how economists have also been ‘charmed by the market’ and how simplistic and reckless market solutions have been widely and vigorously promoted by many economists.

What has been scarce is a professional wisdom informed by a rich knowledge of psychology, institutional structures and historic precedents. This insufficiency has been apparent among those economists giving advice to governments, banks, businesses and policy institutes. Non-quantified warnings about the potential instability of the global financial system should have been given much more attention.

We believe that the narrow training of economists — which concentrates on mathematical techniques and the building of empirically uncontrolled formal models — has been a major reason for this failure in our profession. This defect is enhanced by the pursuit of mathematical technique for its own sake in many leading academic journals and departments of economics.”

The big question, of course, is whether the arguments advanced by Diane Coyle, Geoff Harcourt and other perceptive critics will lead to any substantive change in the way mainstream economics is taught. Anyone familiar with Thomas Kuhn’s analysis, who has seen the intellectual and organisational grip that paradigms exert on academic disciplines, or read John Cassidy’s account of denial in the profession, is bound to be sceptical. Some people would sooner die than admit that they have been wrong. And they include many academics.

Metcalfe’s Law Rules OK

This morning’s Observer column:

There are two paradoxical things about Twitter. The first is how so many people apparently can’t get their heads around what seems like a blindingly simple idea – free expression, 140 characters at a time. I long ago lost count of the number of people who would come up to me on social occasions saying that they just couldn’t see the point of Twitter. Why would anyone be interested in knowing what they had for breakfast? I would patiently explain that while some twitterers might indeed be broadcasting details of their eating habits, the significance of the medium was that it enabled one to tap into the “thought-stream” of interesting individuals. The key to it, in other words, lay in choosing whom to “follow”. In that way, Twitter functions as a human-mediated RSS feed, which is why, IMHO, it continues to be one of the most useful services available on the internet.

The second paradox about Twitter is how a service that has become ubiquitous – and enjoys nearly 100% name recognition, at least in industrialised countries – could become the stuff of analysts’ nightmares because they fear it lacks a business model that will one day produce the revenues to justify investors’ hopes for it.

They may be right about the business model – in which case Twitter becomes a perfect case study in the economics of information goods. The key to success in cyberspace is to harness the power of Metcalfe’s Law, which says that the value of a network is proportional to the square of the number of its users…

Read on
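For readers who want the arithmetic behind the column's closing point: Metcalfe's Law holds because the number of possible pairwise connections among n users is n(n−1)/2, which grows as n². A minimal sketch (my illustration, not from the column):

```python
# Minimal sketch of Metcalfe's Law: network value is proportional to
# the square of the number of users, because the count of possible
# pairwise connections among n users is n*(n-1)/2, which grows as n**2.

def possible_links(n):
    """Number of distinct pairwise connections among n users."""
    return n * (n - 1) // 2

def metcalfe_value(n, k=1.0):
    """Network value under Metcalfe's Law, for an arbitrary constant k."""
    return k * n ** 2

# Doubling the user base quadruples the value:
print(metcalfe_value(200) / metcalfe_value(100))  # 4.0
print(possible_links(5))  # 10
```

The constant k is arbitrary here; the law is about how value scales with users, not its absolute level, which is exactly why a network can be ubiquitous yet still lack a business model that monetises that value.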