Uncomfortable thoughts about Leveson

I’ve been thinking. Always a bad idea (as my mother used to say). And four uncomfortable thoughts come to mind.

  • The first is the sad fact that I mentioned in an earlier post: as long as the great British public continues to buy newspapers that behave disgracefully, newspapers will go on behaving disgracefully, no matter how much self-regulation they undergo.
  • The second is that Leveson’s prescription is going to be applied to what may be a sunset industry (printed newspapers), while doing little or nothing about the Internet and what’s published on it — at least by organisations and websites based in other jurisdictions.
  • To date, the only really original thinking I’ve seen on this last point is Lord McAlpine’s idea of — and method for — going after defamatory tweeters. This seems a smart way of using existing statutes to ensure that people think carefully before tweeting or retweeting something that may be defamatory in the UK.
  • The Guardian reports that Cameron is thinking of using a Royal Charter to regulate the press. The model for this is, of course, the BBC, which is governed by just such a charter. What could be simpler? We’re all proud of the BBC, and its journalism is (generally) a model of probity. There’s just one problem with this — as Professor Brian Winston pointed out in a letter to the Guardian:

    A far better indication of the consequential dangers of content regulation by the state is the Hutton inquiry. The content-regulated BBC was called to account for its actions in reporting on David Kelly while Paul Dacre and Lord Rothermere (who ran the same story) were not. The question of independence needs to be tested against broadcasting’s record of investigations of UK political power, and history suggests that this has been less than stellar. Over time, it has certainly been consistently more constrained than parallel probes by the printed press: witness, for example, Hackgate itself, and the Guardian’s crucial role in that.

That’s true. There are some examples where TV journalism has indeed taken on the power of the state and won. Think, for example, of the investigative reporting which led to the freeing of the Guildford Four and the Birmingham Six. But mostly the journalists who did that were working not for the BBC but for the old ITV companies.

Like I said, uncomfortable thoughts.

Philip Tetlock on political forecasting

Interesting interview with Philip Tetlock by John Brockman. Excerpt:

    Let me say something about how dangerous it is to draw strong inferences about accuracy from isolated episodes. Imagine, for example, that Silver had been wrong and that Romney had become President. And let’s say his prediction had been a 0.8 probability two weeks prior to the election that made Romney President. You can imagine what would have happened to his credibility. It would have cratered. People would have concluded that, yes, his Republican detractors were right, that he was essentially an Obama hack, and he wasn’t a real scientist. That’s, of course, nonsense. When you say there’s a .8 probability there’s 20 percent chance that something else could happen. And it should reduce your confidence somewhat in him, but you shouldn’t abandon him totally. There’s a disciplined Bayesian belief adjustment process that’s appropriate in response to mis-calibrated forecasts.

    What we see instead is overreactions. Silver would either be a fool if he’d gotten it wrong or he’s a god if he gets it right. He’s neither a fool nor a god. He’s a thoughtful data analyst who knows how to work carefully through lots of detailed data and aggregate them in sophisticated ways and get a bit of a predictive edge over many, but not all of his competitors. Because there are other aggregators out there who are doing as well or maybe even a little bit better, but their methodologies are quite strikingly similar and they’re relying on a variant of the wisdom of the crowd, which is aggregation. They’re pooling a lot of diverse bits of information and they’re trying to give more weight to those bits of information that have a good historical track record of having been accurate. It’s a weighted averaging kind of process essentially and that’s a good strategy.
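Tetlock's two points here (adjust your belief in a forecaster by Bayes' rule rather than all-or-nothing, and pool forecasts by weighted averaging) can be sketched in a few lines of Python. The specific numbers below, including the 50/50 prior and the coin-flip hit rate for an unskilled forecaster, are my illustrative assumptions, not figures from Tetlock or Silver:

```python
def update_skill_belief(prior_skilled, p_hit_if_skilled, p_hit_if_unskilled, forecast_hit):
    """Posterior probability that a forecaster is skilled, after one outcome.

    A simple two-hypothesis Bayes update: the forecaster is either
    'skilled' (calibrated, so their 0.8 calls come true 80% of the time)
    or 'unskilled' (effectively a coin flip).
    """
    if forecast_hit:
        like_skilled, like_unskilled = p_hit_if_skilled, p_hit_if_unskilled
    else:
        like_skilled, like_unskilled = 1 - p_hit_if_skilled, 1 - p_hit_if_unskilled
    numer = prior_skilled * like_skilled
    denom = numer + (1 - prior_skilled) * like_unskilled
    return numer / denom


def pool_forecasts(probs, weights):
    """Weighted average of several forecasters' probabilities, giving more
    weight to those with a better historical track record."""
    return sum(p * w for p, w in zip(probs, weights)) / sum(weights)


# Silver's 0.8 call misses (Romney wins, in the thought experiment).
# Belief in his skill falls from 0.5 to about 0.29: reduced, not abandoned.
posterior = update_skill_belief(0.5, 0.8, 0.5, forecast_hit=False)
print(round(posterior, 3))  # 0.286

# Pooling three (hypothetical) forecasters, weighting the best track record 3x.
print(round(pool_forecasts([0.8, 0.6, 0.55], [3.0, 1.0, 1.0]), 2))  # 0.71
```

The point the arithmetic makes is Tetlock's: one miss on a 0.8 call should roughly halve your confidence in the forecaster under these assumptions, not crater it to zero.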