Yesterday’s Observer column:
The political theorist David Runciman draws a useful distinction between scandals and crises. Scandals happen all the time in society; they create a good deal of noise and heat, but in the end nothing much happens. Things go back to normal. Crises, on the other hand, do eventually lead to structural change, and in that sense play an important role in democracies.
So a good question to ask whenever something bad happens is whether it heralds a scandal or a crisis. When the phone-hacking story eventually broke, for example, many people (me included) thought that it represented a crisis. Now, several years – and a judicial enquiry – later, nothing much seems to have changed. Sure, there was a lot of sound and fury, but it signified little. The tabloids are still doing their disgraceful thing, and Rebekah Brooks is back in the saddle. So it was just a scandal, after all.
When the TalkTalk hacking story broke and I heard the company’s chief executive admit in a live radio interview that she couldn’t say whether the customer data that had allegedly been stolen had been stored in encrypted form, the Runciman question sprang immediately to mind. That the boss of a communications firm should be so ignorant about something so central to her business certainly sounded like a scandal…
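(For non-technical readers: “stored in encrypted form” simply means the records are scrambled with a secret key before they ever touch the disk, so a thief who copies the database gets gibberish. Here is a minimal sketch in Python, using the cryptography library’s Fernet recipe; the customer record and the key handling are invented for illustration, and a real deployment would keep the key in a key-management service rather than next to the data.)

    # Illustrative only: encrypting one customer record at rest with the
    # 'cryptography' package's Fernet recipe (AES-128-CBC plus an HMAC).
    from cryptography.fernet import Fernet

    # In practice the key would live in a key-management service, not in code.
    key = Fernet.generate_key()
    f = Fernet(key)

    record = b"name=Jane Doe;account=12345678;sort_code=00-00-00"
    token = f.encrypt(record)           # this ciphertext is what should sit in the database
    print(token)                        # useless to a thief who doesn't hold the key

    assert f.decrypt(token) == record   # only the key holder can recover the data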
LATER: Interesting blog post by Bruce Schneier. He opens with an account of how the CIA’s Director and the software developer Grant Blakeman had their email accounts hacked. Then:
Neither of them should have been put through this. None of us should have to worry about this.
The problem is a system that makes this possible, and companies that don’t care because they don’t suffer the losses. It’s a classic market failure, and government intervention is how we have to fix the problem.
It’s only when the costs of insecurity exceed the costs of doing it right that companies will invest properly in our security. Companies need to be responsible for the personal information they store about us. They need to secure it better, and they need to suffer penalties if they improperly release it. This means regulatory security standards.
The government should not mandate how a company secures our data; that will move the responsibility to the government and stifle innovation. Instead, government should establish minimum standards for results, and let the market figure out how to do it most effectively. It should allow individuals whose information has been exposed to sue for damages. This is a model that has worked in all other aspects of public safety, and it needs to be applied here as well.
He’s right. Only when the costs of insecurity exceed the costs of doing it right will companies invest properly in security. And governments can fix that, quickly, by changing the law. For once, this is something that’s not difficult to do, even in a democracy.
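To see the arithmetic behind that claim, here is a back-of-the-envelope sketch. The figures are invented purely for illustration and are nobody’s real numbers; the point is only how liability flips the incentive.

    # Illustrative expected-cost comparison for a firm deciding whether to skimp on security.
    # All figures are made up for the sake of the example.
    cost_of_doing_it_right = 2_000_000   # annual spend on proper security
    breach_probability     = 0.05        # chance of a serious breach in a given year
    cleanup_cost           = 5_000_000   # PR, call centres, churn: costs the firm already bears

    # Without liability, skimping looks 'rational':
    expected_cost_skimping = breach_probability * cleanup_cost
    print(expected_cost_skimping < cost_of_doing_it_right)   # True (250,000 < 2,000,000), so why invest?

    # Add statutory penalties and customer damages to the bill:
    penalties_and_damages = 60_000_000
    expected_cost_skimping = breach_probability * (cleanup_cost + penalties_and_damages)
    print(expected_cost_skimping > cost_of_doing_it_right)   # True (3,250,000 > 2,000,000): now security pays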