ISC Chairman had “nothing to hide” but still got into trouble

So Sir Malcolm Rifkind has fallen on his sword after a journalistic sting operation recorded him apparently touting for work from a fake Chinese company that supposedly wanted him to join its advisory board. The other former Foreign Secretary, Jack Straw, was similarly embarrassed after he was surreptitiously recorded bragging about the access that his status as a former senior minister granted him. Both men protested vigorously that they had done nothing wrong, which may well be true, at least in the sense that they were adhering to the letter of the rules for public representatives.

What’s interesting about Rifkind’s fall is that he used to be an exponent of the standard mantra about bulk surveillance: “if you have nothing to hide then you have nothing to fear”. Both men claim that they did nothing wrong, but at the same time it’s clear that they have been grievously embarrassed by public exposure of activities that they wanted to keep private. In that sense, they are in the same boat as most citizens. We all do harmless things that we nevertheless regard as private matters which are none of the government’s business. That’s what privacy is all about.

Thinking of Googling for health information? Think again.

Interesting video by Tim Libert, summarising the results of some research he did on the way health information sites (including those run by government agencies) covertly pass information about health-related searches to a host of commercial companies. Libert is a researcher at the University of Pennsylvania. He built a program called webXray to analyze the top 50 search results for nearly 2,000 common diseases (over 80,000 pages total). He found that no fewer than 91% of the pages made third-party requests to outside companies. So if you search for “cold sores,” for instance, and click the WebMD “Cold Sores Topic Overview” link, the site is passing your request for information about the disease along to “one or more (and often many, many more) other corporations”.

According to Libert’s research (Communications of the ACM, Vol. 58 No. 3, Pages 68-77), about 70% of the time, the data transmitted “contained information exposing specific conditions, treatments, and diseases.”
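The core of what webXray measures is simple: when a page loads resources (scripts, images, stylesheets) from domains other than the one you visited, each of those third parties receives your request — including, often, the telltale URL of the page you were reading. Here is a minimal sketch of that detection idea in Python; this is an illustration only, not webXray’s actual code (which drives a real browser and logs live HTTP traffic), and the domain names are made up:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collect hostnames of resources (scripts, images, iframes,
    stylesheets) loaded from outside the page's own domain."""
    RESOURCE_ATTRS = {"script": "src", "img": "src",
                      "iframe": "src", "link": "href"}

    def __init__(self, page_domain):
        super().__init__()
        self.page_domain = page_domain
        self.third_parties = set()

    def handle_starttag(self, tag, attrs):
        attr_name = self.RESOURCE_ATTRS.get(tag)
        if not attr_name:
            return
        url = dict(attrs).get(attr_name, "")
        host = urlparse(url).hostname
        # A resource is third-party if its hostname falls outside
        # the domain of the page the user actually visited.
        if host and not host.endswith(self.page_domain):
            self.third_parties.add(host)

# Example: a health-information page that quietly loads trackers.
# (All domains here are hypothetical.)
html = """
<html><head>
  <link rel="stylesheet" href="https://www.webmd.example/style.css">
  <script src="https://ads.tracker.example/collect.js"></script>
</head><body>
  <img src="https://analytics.example/pixel.gif?q=cold+sores">
</body></html>
"""

finder = ThirdPartyFinder("webmd.example")
finder.feed(html)
print(sorted(finder.third_parties))
# ['ads.tracker.example', 'analytics.example']
```

Note the query string on the tracking pixel: it is exactly this kind of leakage — the search term or page URL travelling to the third party — that Libert found exposed “specific conditions, treatments, and diseases”.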

So think twice before consulting Dr Google. Especially if you suspect you have a condition that could affect your insurance rating.

The Telescreen is here

Thinking about getting a ‘smart’ Samsung TV? Think again.

[Image: Smart_TV]

Thanks to Hannes Sjoblad for the tweet.

Footnote: In Orwell’s 1984 there was a ‘telescreen’ in Winston’s apartment.

“Any sound that Winston made, above the level of a very low whisper, would be picked up by it, moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork.”

Due warning

“FOR PUBLIC SAFETY REASONS, THIS EMAIL HAS BEEN INTERCEPTED BY YOUR
GOVERNMENT AND WILL BE RETAINED FOR FUTURE ANALYSIS.”

Signature line on a friend’s email messages.

What David Cameron doesn’t get: the difference between privacy and secrecy

My colleague Julia Powles has a terrific essay in Wired on the implications of, and fallout from, the Charlie Hebdo massacre, in which she says this:

Cameron claims that there should be “no safe spaces for terrorists to communicate”. What he expects in technical and legal terms is unclear, but the sentiment is stark: no safe spaces for “them”, means none for us. Security is cast as the ultimate law and first priority, while privacy is something for bad people to hide bad things. In truth, privacy is fundamental to all of us, individually and collectively. It is the foundation of trust, relationships, and intellectual freedom. It is a core tenet of a free and healthy society — security’s ally, not its enemy.

It’s strange how the political establishment in most democracies now seems unable to distinguish between secrecy and privacy. Privacy — as Cory Doctorow observed last week on Radio 4’s Start the Week programme — is the ability to control what other people know about you. It’s the state of being unobserved. Secrecy is the act of keeping things hidden for various reasons, some of which may be legitimate — and some conceivably not. We are all entitled to privacy — it’s a human right. Secrecy is a different thing altogether.

She goes on to remind readers that Cameron’s political assertions

are propped up by a formidable line-up of security officials from MI5, MI6, and the ISC, who have been notably more vocal in the last two weeks than at any moment in the last two years. They echo the tone set by the GCHQ director and Metropolitan police commissioner in November. It is only if we can get at everybody’s communications data, they claim, that we can tackle the terrorist problem. But mass data collection, the necessary precursor to recent and proposed laws, can be shown mathematically to make it more difficult to catch terrorists, plus it has a very significant and irrecoverable environmental cost. It is in clear breach of human rights. It also creates unnecessary, unwanted, and costly data storage — and, with it, new vulnerabilities to malevolent actors that far outnumber plausible terrorist threats. What works, by contrast, is well-resourced, targeted intelligence, complemented by strategies directed at mitigating the causes of disaffection and social unrest.

Well worth reading in full.

So what will it take to wake people up?

At dinner last night I had a long talk with one of my Masters students who is as baffled as I am about why people seem to be so complacent about online surveillance. This morning a colleague sent me a link to this TEDx talk by Mikko Hypponen, a well-known Finnish security expert. It’s a terrific lecture, but one part of it stood out especially for me in the context of last night’s conversation. It concerned an experiment Hypponen and his colleagues ran in London, where they set up a free wi-fi hot-spot that anyone could use after they had clicked to accept the terms & conditions under which the service was offered. One of the terms was this:

[Image: First_born_child_EULA]

Every user — every user! — clicked ‘Accept’.

Why ‘cybersecurity’ is such a flawed term

In a sentence: it lumps three very different things — crime, espionage and warfare — under a single heading. And, as I tried to point out in yesterday’s Observer column, instead of making cyberspace more secure, many of the activities classified as ‘cybersecurity’ make it less so.

Bruce Schneier has a thoughtful essay on the subject.

Last week we learned about a striking piece of malware called Regin that has been infecting computer networks worldwide since 2008. It’s more sophisticated than any known criminal malware, and everyone believes a government is behind it. No country has taken credit for Regin, but there’s substantial evidence that it was built and operated by the United States.

This isn’t the first government malware discovered. GhostNet is believed to be Chinese. Red October and Turla are believed to be Russian. The Mask is probably Spanish. Stuxnet and Flame are probably from the U.S. All these were discovered in the past five years, and named by researchers who inferred their creators from clues such as who the malware targeted.

I dislike the “cyberwar” metaphor for espionage and hacking, but there is a war of sorts going on in cyberspace. Countries are using these weapons against each other. This affects all of us not just because we might be citizens of one of these countries, but because we are all potentially collateral damage. Most of the varieties of malware listed above have been used against nongovernment targets, such as national infrastructure, corporations, and NGOs. Sometimes these attacks are accidental, but often they are deliberate.

For their defense, civilian networks must rely on commercial security products and services. We largely rely on antivirus products from companies such as Symantec, Kaspersky, and F-Secure. These products continuously scan our computers, looking for malware, deleting it, and alerting us as they find it. We expect these companies to act in our interests, and never deliberately fail to protect us from a known threat.

This is why the recent disclosure of Regin is so disquieting. The first public announcement of Regin was from Symantec, on November 23. The company said that its researchers had been studying it for about a year, and announced its existence because they knew of another source that was going to announce it. That source was a news site, the Intercept, which described Regin and its U.S. connections the following day. Both Kaspersky and F-Secure soon published their own findings. Both stated that they had been tracking Regin for years. All three of the antivirus companies were able to find samples of it in their files since 2008 or 2009.

Yep. Remember that the ostensible mission of these companies is to make cyberspace more secure. By keeping quiet about the Regin threat they did exactly the opposite. So, as Schneier concludes,

Right now, antivirus companies are probably sitting on incomplete stories about a dozen more varieties of government-grade malware. But they shouldn’t. We want, and need, our antivirus companies to tell us everything they can about these threats as soon as they know them, and not wait until the release of a political story makes it impossible for them to remain silent.

RIPA, the super-elastic statute

When RIPA was going through Parliament in 2000, one of the things critics pointed out was the latitude it provided for mission creep. And so it proved — to the point where local authorities were using it to snoop on parents who were suspected of not living in the catchment area of the schools to which they wanted to send their kids.

Now, more evidence of the extent of the mission creep: documents released by the human rights organisation Reprieve show that GCHQ and MI5 staff were told they could target lawyers’ communications. This undermines the legal privilege that ensures communications between lawyers and their clients remain confidential.

The news that legal privilege is being violated comes weeks after it was revealed that the Met police had used RIPA to circumvent the journalistic privilege that protects journalists’ sources.

The only thing that remains is the (Catholic) Confessional.

After Snowden, what?

This morning’s Observer column.

Many moons ago, shortly after Edward Snowden’s revelations about the NSA first appeared, I wrote a column which began, “Repeat after me: Edward Snowden is not the story”. I was infuriated by the way the mainstream media was focusing not on the import of what he had revealed, but on the trivia: Snowden’s personality, facial hair (or absence thereof), whereabouts, family background, girlfriend, etc. The usual crap, in other words. It was like having a chap tell us that the government was poisoning the water supply and concentrating instead on whom he had friended on Facebook.

Mercifully, we have moved on a bit since then. The important thing now, it seems to me, is to consider a new question: given what we now know, what should we do about it? What could we realistically do? Will we, in fact, do anything? And if we don’t, where are we heading as democracies?

I tried to put some of these questions to Snowden at the Observer Ideas festival last Sunday via a Skype link that proved comically dysfunctional. The comedy in using a technology to which the NSA has a backdoor was not lost on the (large) audience — or on Snowden, who coped gracefully with it. But it was a bit like trying to have a philosophical discussion using smoke signals. So let’s have another go.

First, what could we do to curb comprehensive surveillance of the net?

Read on…

Bruce Schneier’s next book

Title: Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World

Publisher: WW Norton

Publication date: March 9, 2015

Table of Contents

Part 1: The World We’re Creating
Chapter 1: Data as a By-Product of Computing
Chapter 2: Data as Surveillance
Chapter 3: Analyzing our Data
Chapter 4: The Business of Surveillance
Chapter 5: Government Surveillance and Control
Chapter 6: Consolidation of Institutional Surveillance

Part 2: What’s at Stake
Chapter 7: Political Liberty and Justice
Chapter 8: Commercial Fairness and Equality
Chapter 9: Business Competitiveness
Chapter 10: Privacy
Chapter 11: Security

Part 3: What to Do About It
Chapter 12: Principles
Chapter 13: Solutions for Government
Chapter 14: Solutions for Corporations
Chapter 15: Solutions for the Rest of Us
Chapter 16: Social Norms and the Big Data Trade-Off

Something to be pre-ordered, methinks.