Backdoors won’t work. Just ask the TSA. (Or the NSC)


Very nice openDemocracy piece by my colleague Julian Huppert on why putting backdoors in encryption systems is a very bad idea:

This was demonstrated recently with a security disaster involving the US Transportation Security Administration. They want to be able to search through people’s luggage, if they think there is contraband inside. But sometimes people quite reasonably want to lock their luggage, so that people cannot just take things from it. So a system was created with TSA-approved locks, so that TSA officials can unlock them using a master key. In theory, no one else can, so your luggage is safe.

You might ask: what if someone got hold of these master keys? But the TSA had an even bigger disaster to come. In a piece in the Washington Post praising their work, someone foolishly posed with a set of master keys. The photo was of a high enough resolution that people can now 3D-print copies, and use them to open any TSA-approved lock. The backdoor is wide open, and security breached.
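The suitcase story maps directly onto encryption. A software backdoor of the kind governments have asked for usually amounts to key escrow: each user encrypts with their own key, but a copy of that key is also wrapped under a single master key held by the authorities. The following toy sketch (illustration only, using a throwaway XOR stream rather than real cryptography; all names are my own invention) shows why one leaked master key exposes everyone at once:

```python
# Toy key-escrow "backdoor" sketch. NOT real cryptography -- the point
# is the architecture: one master key unwraps every user's key.
import hashlib
import secrets

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XORing data with a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

MASTER_KEY = secrets.token_bytes(32)  # held by the "TSA"

def lock(message: bytes):
    """Encrypt a message; also escrow the user key under the master key."""
    user_key = secrets.token_bytes(32)          # the user's own key
    ciphertext = xor_stream(user_key, message)
    escrow = xor_stream(MASTER_KEY, user_key)   # the backdoor copy
    return ciphertext, escrow

def backdoor_open(ciphertext: bytes, escrow: bytes, leaked_master: bytes):
    """Anyone holding the master key -- legitimate or not -- can read it."""
    user_key = xor_stream(leaked_master, escrow)
    return xor_stream(user_key, ciphertext)

ct, esc = lock(b"private diary")
assert backdoor_open(ct, esc, MASTER_KEY) == b"private diary"
```

The weakness is structural, not cryptographic: however strong the per-user encryption, the whole system is only as safe as that one master secret, and a single photograph, leak, or insider is enough to open every lock.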

This fate can befall any backdoor system, and probably will. That is why the US National Security Council has been quite clear in their draft options paper.

The relevant excerpt from the NSC ‘Options’ paper reads: “the Administration will not seek legislation that compels providers to design their products to enable government access to encrypted information”.

Two things are interesting about this. The first is how useful it is to have a mundane, everyday illustration of an important idea. We have been telling people for ages that backdoors in encryption software are a bad idea, but this gets nowhere with non-geeks because they have no personal experience to which that proposition can be related. But they know about suitcase locks.

This reminds me of all the years I wasted trying to persuade lay audiences about the importance of open source software. My argument was that software that affects our lives should never be impenetrable or unalterable ‘black boxes’ and that the “freedom to tinker” was vital. This argument got precisely nowhere.

And then, one day, I suddenly understood why: my audiences had never written a line of software. It was an entirely alien concept to them. So the next time I gave the talk I brought a copy of my favourite recipe book with me. Before starting, I asked who in the audience cooked or baked. Every hand went up. So then I turned to a particular recipe that had 300ml of double cream as one ingredient. “Now”, I said, “double cream is not good for a guy like me, so I’d like to replace it with crème fraîche. But imagine that we lived in a world where, if I wanted to do that, I would have to write to the authoress to seek her permission, and perhaps to pay a fee. What would you think of that?” And of course they all said that it would be nuts. “Well then”, was the payoff line, “now you understand why open source software is important.”

The second thought raised by Julian’s post is that while the UK government is unlikely to pay much attention to the geek view of the absurdity of backdoors in encryption systems, it’s much more likely to pay attention to the considered view of the US National Security Council.