Archive for the 'Privacy' Category

So who still believes that collecting metadata is harmless?

[link] Friday, September 12th, 2014

Interesting snippet in the latest newsletter from the Open Rights Group:

It was revealed last week that the Met police accessed the telephone records of The Sun’s Political Editor, Tom Newton Dunn, using a RIPA request.

The case should end any discussion about whether or not metadata reveals anything personal about us: Newton Dunn’s calls, and when and where they were received, were seen as enough to identify a whistleblower who contacted him over the Plebgate scandal.

Journalistic privilege, protected by the Police and Criminal Evidence Act, was circumvented by the use of RIPA. Newton Dunn was not even aware that his records had been accessed until the Met published their report into the Plebgate affair.

When DRIP was announced, Newton Dunn wrote in The Sun, that the new powers would give MI5 and cops, “crucial access to plotters’ mobile phone records”. UK public authorities use RIPA over 500,000 times a year to access private data. The police refused to answer questions as to how many times they have accessed journalists’ data. When this is happening without our knowledge, we cannot ignore the threat to our civil liberties that data retention poses.

The interesting bit is the fact that the metadata were sufficient to identify a whistleblower. We all knew that, of course, but the official line is still that bulk collection of metadata does not infringe on privacy.
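To see just how little analysis is needed, here is a toy sketch (entirely hypothetical numbers, times and cell IDs, invented for illustration) of how raw call records of the kind a RIPA request returns can be turned into a shortlist of likely sources:

```python
# Hypothetical call-detail records: (caller, callee, timestamp, cell_id).
# No content at all -- just metadata.
records = [
    ("07700-900001", "07700-900123", "2012-09-19T21:05", "WESTMINSTER-3"),
    ("07700-900002", "07700-900123", "2012-09-19T21:40", "LAMBETH-1"),
    ("07700-900003", "07700-900456", "2012-09-19T22:10", "WESTMINSTER-3"),
]

journalist = "07700-900123"  # the reporter's number (hypothetical)
# A time window shortly after the incident of interest.
window = ("2012-09-19T20:00", "2012-09-20T00:00")

# Anyone who called the journalist inside the window is a candidate source;
# ISO-8601 timestamps compare correctly as plain strings.
candidates = [
    caller for caller, callee, ts, cell in records
    if callee == journalist and window[0] <= ts <= window[1]
]
print(candidates)  # -> ['07700-900001', '07700-900002']
```

One list comprehension over a few fields, and the pool of suspects collapses from everyone to two numbers; cross-referencing the cell IDs against a duty roster would do the rest.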

Dave Eggers has seen the future. Well, a possible future anyway…

[link] Monday, September 1st, 2014

Yesterday’s Observer column.

Fifteen months have passed since Edward Snowden began to explain to us how our networked world works. During that time there has been much outrage, shock, horror, etc expressed by the media and the tech industry. So far, so predictable. What is much more puzzling is how relatively relaxed the general public appears to be about all this. In Britain, for example, opinion polling suggests that nearly two thirds of the population think that the kind of surveillance revealed by Snowden is basically OK.

To some extent, the level of public complacency/concern is culturally determined. Citizens of Germany, for example…

Read on

Web services are ‘free’, which is why we’re all in chains

[link] Sunday, August 24th, 2014

This morning’s Observer column.

“Be careful what you wish for,” runs the adage. “You might just get it.” In the case of the internet, or, at any rate, the world wide web, this is exactly what happened. We wanted exciting services – email, blogging, social networking, image hosting – that were “free”. And we got them. What we also got, but hadn’t bargained for, was deep, intensive and persistent surveillance of everything we do online.

We ought to have known that it would happen. There’s no such thing as a free lunch, after all…

Read on

TOR, Taylor Swift and breaking the Kafkaesque spiral

[link] Friday, August 22nd, 2014

[Photo: a target. cc https://secure.flickr.com/photos/comedynose/7865159650]

Ever since the Snowden revelations began I’ve been arguing that Kafka is as good a guide to our surveillance crisis as is Orwell. The reason: one of the triggers that prompts the spooks to take an interest in someone is if that person is using serious tools to protect their privacy. It’s like painting a target on your back.

So if you use PGP to encrypt your email, or TOR for anonymous browsing, then you are likely to be seen as someone who warrants more detailed surveillance. After all, if you’ve nothing to hide… etc.

And there’s no way you would know that you had been selected for special treatment. This sounds like a situation that Kafka would recognise.

Until the other day, I couldn’t think of a way out of this vicious cycle. And then I came on reports (e.g. here) that a musician of whom I’d never heard — electronic music artist Aphex Twin — had announced the details of his new album on a site only accessible through Tor.

This resulted in the page attracting 133,000 views in little over 24 hours. This is within the limits of what TOR can currently handle, but Tor’s executive director, Andrew Lewman, worries that a more mainstream artist could break the system in its current state.

“If tomorrow, Taylor Swift said ‘to all my hundreds of millions of fans, go to this [Tor] address’, it would not work well. We’re into the millions now, and we have a few companies saying ‘we want to put Tor as a privacy mode in our premier products, can you handle the scale of 75-100m devices of users’, and right now the answer is no, we can’t. Not daily.”

This sounds like — and is — a problem. But it’s also an opportunity. Because what we need is for encrypted email and anonymous browsing to become the norm so that the spooks can’t argue that only evil people would resort to using such tools.

And here’s where Aphex Twin and Taylor Swift come in. They have the power to kickstart the mainstreaming of TOR – to make it normal. Of course, for that to be effective, TOR has to be boosted, expanded and securely funded. Just as the big Internet companies have finally realised that they have to chip in and support, for example, the OpenSSL project, so they should now chip in to help build the infrastructure that would enable TOR to become the default way we all did web browsing.

Can Google really keep our email private?

[link] Sunday, June 8th, 2014

This morning’s Observer column.

So Google has decided to provide end-to-end encryption for any of its Gmail users who wants it. One could ask “what took you so long?” but that would be churlish. (Some of us were unkind enough to suspect that the reluctance might have been due to, er, commercial considerations: after all, if Gmail messages are properly encrypted, then Google’s computers can’t read the content in order to decide what ads to display alongside them.) But let us be charitable and thankful for small mercies. The code for the service is out for testing and won’t be made freely available until it’s passed the scrutiny of the geek community, but still it’s a significant moment, for which we have Edward Snowden to thank.

The technology that Google will use is public key encryption, and it’s been around for a long time and publicly available ever since 1991, when Phil Zimmermann created PGP (which stands for Pretty Good Privacy)…

Read on
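The core idea behind public key encryption is worth seeing in miniature. Here is textbook RSA with toy primes (utterly insecure at this size, and real systems like PGP add padding and hybrid encryption on top; this is purely an illustration of the principle): anyone who holds the public key can encrypt, but only the holder of the private key can decrypt.

```python
# Textbook RSA with toy primes -- illustration only; never use keys this small.
p, q = 61, 53
n = p * q                  # 3233: the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (= 2753)

message = 65                       # a number < n standing in for the plaintext
ciphertext = pow(message, e, n)    # anyone with (e, n) can do this step
recovered = pow(ciphertext, d, n)  # only the holder of d can undo it

assert recovered == message
```

The asymmetry is the whole trick: (e, n) can be published to the world, while recovering d from them requires factoring n, which is infeasible at real key sizes.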

LATER Email from Cory Doctorow:

Wanted to say that I think it’s a misconception that Goog can’t do targeted ads alongside encrypted email. Google knows an awful lot about Gmail users: location, browsing history, clicking history, search history. It can also derive a lot of information about a given email from the metadata: sender, CC list, and subject line. All of that will give them tons of ways to target advertising to Gmail users – they’re just subtracting one signal from the overall system through which they make their ad-customization calculations.

So the cost of not being evil is even lower than I had supposed!

STILL LATER
This from Business Insider:

Inside the code for Google’s End-to-End email encryption extension for Chrome, there’s a message that should sound very familiar to the NSA: “SSL-added-and-removed-here-;-)”

Followers of this blog will recognise this as a quote from a slide leaked by Edward Snowden.

[Slide from the leaked “Google Cloud Exploitation” deck]

This comes from a slide-deck about the ‘Muscular’ program (who thinks up these daft names?), which allowed Britain’s GCHQ intelligence service and the NSA to pull data directly from Google servers outside of the U.S. The cheeky tone of the slide apparently enraged some Google engineers, which I guess explains why a reference to it resides in the Gmail encryption code.

Yay! Gmail to get end-to-end encryption

[link] Wednesday, June 4th, 2014

This has been a long time coming — properly encrypted Gmail — but it’s very welcome. Here’s the relevant extract from the Google security blog:

Today, we’re adding to that list the alpha version of a new tool. It’s called End-to-End and it’s a Chrome extension intended for users who need additional security beyond what we already provide.

“End-to-end” encryption means data leaving your browser will be encrypted until the message’s intended recipient decrypts it, and that similarly encrypted messages sent to you will remain that way until you decrypt them in your browser.

While end-to-end encryption tools like PGP and GnuPG have been around for a long time, they require a great deal of technical know-how and manual effort to use. To help make this kind of encryption a bit easier, we’re releasing code for a new Chrome extension that uses OpenPGP, an open standard supported by many existing encryption tools.

However, you won’t find the End-to-End extension in the Chrome Web Store quite yet; we’re just sharing the code today so that the community can test and evaluate it, helping us make sure that it’s as secure as it needs to be before people start relying on it. (And we mean it: our Vulnerability Reward Program offers financial awards for finding security bugs in Google code, including End-to-End.)

Once we feel that the extension is ready for primetime, we’ll make it available in the Chrome Web Store, and anyone will be able to use it to send and receive end-to-end encrypted emails through their existing web-based email provider.

We recognize that this sort of encryption will probably only be used for very sensitive messages or by those who need added protection. But we hope that the End-to-End extension will make it quicker and easier for people to get that extra layer of security should they need it.

Google privacy ruling: the thin end of a censorship wedge?

[link] Sunday, May 18th, 2014

This morning’s Observer column.

Sooner or later, every argument about regulation of the internet comes down to the same question: is this the thin end of the wedge or not? We saw a dramatic illustration last week when the European court of justice handed down a judgment on a case involving a Spanish lawyer, one Mario Costeja González, who objected that entering his name in Google’s search engine brought up embarrassing information about his past (that one of his properties had been the subject of a repossession)…

Read on

LATER

Three interesting — and usefully diverse — angles on the ECJ decision.

  • Daithi Mac Sitigh points out that the decision highlights the tensions between EU and US law. “This is particularly significant”, he says, “given that most of the major global players in social networking and e-commerce operate out of the US but also do a huge amount of business in Europe.”

Google’s first line of defence was that its activities were not subject to the Data Protection Directive. It argued that its search engine was not a business carried out within the European Union. Google Spain was clearly subject to EU law, but Google argued that it sells advertising rather than running a search engine.

The court was asked to consider whether Google might be subject to the Directive under various circumstances. A possible link was the use of equipment in the EU, through gathering information from EU-based web servers or using relevant domain names (such as google.es). Another suggestion was that a case should be brought at its “centre of gravity”, taking into account where the people making the requests to delete data have their interests.

But the court never reached these points. Instead, it found the overseas-based search engine and the Spain-based seller of advertising were “inextricably linked”. As such, Google was found to be established in Spain and subject to the directive.

The message being sent was an important one. Although this ruling is specific to the field of data protection, it suggests that if you want to do business in the EU, a corporate structure that purports to shield your activities from EU law will not necessarily protect you from having to comply with local legislation. This may explain the panicked tone of some of the reaction to the decision.

  • In an extraordinary piece, “Right to Forget a Genocide”, Zeynep Tufekci muses about how (Belgian) colonial imposition of ID cards on Rwandan citizens was instrumental in facilitating genocide.

It may seem like an extreme jump, from drunken adolescent photos to genocide and ethnic cleansing, but the shape, and filters, of a society’s memory is always more than just about individual embarrassment or advancement. What we know about people, and how easily we can identify or classify them, is consequential far beyond jobs and dates, and in some contexts may make the difference between life and death.

“Practical obscurity”—the legal term for information that was available, but not easily—has died in most rich countries within just about a decade. Court records and criminal histories, which were only accessible to the highly-motivated, are now there at the click of a mouse. Further, what is “less obscure” has greatly expanded: using our online data, algorithms can identify information about a person, such as sexual orientation and political affiliation, even if that person never disclosed them.

In that context, take Rwanda, a country many think about in conjunction with the horrific genocide 20 years ago during which more than 800,000 people were killed—in just about one hundred days. Often, stories of ethnic cleansing and genocide get told in a context of “ancient hatreds,” but the truth of it is often much uglier, and much less ancient. It was the brutal colonizer of Rwanda, Belgium, that imposed strict ethnicity-based divisions in a place where identity tended to be more fluid and mixed. Worse, it imposed a national ID system that identified each person as belonging to Hutu, Tutsi or Twa, forever freezing them in that place. [For a detailed history of the construction of identity in Rwanda read this book, and for the conduct of colonial Belgium, Rwanda’s colonizer, read this one.]

A few years before the genocide, some NGOs had urged that Rwanda “forget” ethnicity, erasing it from ID cards.

They were not listened to.

During the genocide, it was those ID cards that were asked for at each checkpoint, and it was those ID cards that identified the Tutsis, most of whom were slaughtered on the spot. The ID cards closed off any avenue of “passing” a checkpoint. Ethnicity, a concept that did not at all fit neatly into the region’s complex identity configuration, became the deadly division that underlined one of the 20th century’s worst moments. The ID cards doomed and fueled the combustion of mass murder.

  • Finally, there’s a piece in Wired by Julia Powles arguing that “The immediate reaction to the decision has been, on the whole, negative. At best, it is reckoned to be hopelessly unworkable. At worst, critics pan it as censorship. While there is much to deplore, I would argue that there are some important things we can gain from this decision before casting it roughly aside.”

What this case should ideally provoke is an unflinching reflection on our contemporary digital reality of walled gardens, commercial truth engines, and silent stewards of censorship. The CJEU is painfully aware of the impact of search engines (and ‘The’ search engine, in particular). But we as a society should think about the hard sociopolitical problems that they pose. Search engines are catalogues, or maps, of human knowledge, sentiments, joys, sorrows, and venom. Silently, with economic drivers and unofficial sanction, they shape our lives and our interactions.

The fact of the matter here is that if there is anyone that is up to the challenge of respecting this ruling creatively, Google is. But if early indications are anything to go by, there’s a danger that we’ll unwittingly save Google from having to do so, either through rejecting the decision in practical or legal terms; through allowing Google to retreat “within the framework of their responsibilities, powers and capabilities” (which could have other unwanted effects and unchecked power, by contrast with transparent legal mechanisms); or through working the “right to be forgotten” out of law through the revised Data Protection Regulation, all under the appealing but ultimately misguided banner of preventing censorship.

There is, Powles argues, a possible technical fix for this — implementation of a ‘right to reply’ in search engine results.

An all-round better solution than “forgetting”, “erasure”, or “take-down”, with all of the attendant issues with free speech and the rights of other internet users, is a “right to reply” within the notion of “rectification”. This would be a tech-enabled solution: a capacity to associate metadata, perhaps in the form of another link, to any data that is inaccurate, out of date, or incomplete, so that the individual concerned can tell the “other side” of the story.

We have the technology to implement such solutions right now. In fact, we’ve done a mock-up envisaging how such an approach could be implemented.

Search results could be tagged to indicate that a reply has been lodged, much as we see with sponsored content on social media platforms. Something like this, for example:

[Mock-up: a search result tagged “Forgotten”]
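Powles’s proposal is essentially a metadata problem, and a minimal sketch makes it concrete. The field names below are invented for illustration (no real search API is implied): instead of erasing a result, the system attaches a reply the data subject has lodged, and the renderer flags it much as platforms label sponsored content.

```python
# Hypothetical "right to reply" attached to a search result.
# All field names are invented; this implies no real search engine's API.
result = {
    "url": "https://example.com/old-news/repossession",
    "title": "Property repossession notice",
    "snippet": "...",
}

# Rather than taking the result down, associate the subject's reply with it.
result["reply"] = {
    "lodged_by": "data subject",
    "date": "2014-05-18",
    "link": "https://example.com/reply/123",
    "note": "The debt in question was settled in full years ago.",
}

# A renderer can then tag the result, as platforms do with sponsored content.
label = "Reply lodged" if "reply" in result else ""
print(label)  # -> Reply lodged
```

Nothing is forgotten and nothing is censored; the original record and the subject’s side of the story travel together.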

(Thanks to Charles Arthur for the Tufekci and Powles links.)

Our Kafkaesque world

[link] Sunday, May 11th, 2014

This morning’s Observer column.

When searching for an adjective to describe our comprehensively surveilled networked world – the one bookmarked by the NSA at one end and by Google, Facebook, Yahoo and co at the other – “Orwellian” is the word that people generally reach for.

But “Kafkaesque” seems more appropriate. The term is conventionally defined as “having a nightmarishly complex, bizarre, or illogical quality”, but Frederick Karl, Franz Kafka’s most assiduous biographer, regarded that as missing the point. “What’s Kafkaesque,” he once told the New York Times, “is when you enter a surreal world in which all your control patterns, all your plans, the whole way in which you have configured your own behaviour, begins to fall to pieces, when you find yourself against a force that does not lend itself to the way you perceive the world.”

A vivid description of this was provided recently by Janet Vertesi, a sociologist at Princeton University. She gave a talk at a conference describing her experience of trying to keep her pregnancy secret from marketers…

Read on

The NSA’s overseas franchise

[link] Wednesday, April 30th, 2014

Just spotted this from Glenn Greenwald.

Britain’s electronic surveillance agency, Government Communications Headquarters, has long presented its collaboration with the National Security Agency’s massive electronic spying efforts as proportionate, carefully monitored, and well within the bounds of privacy laws. But according to a top-secret document in the archive of material provided to The Intercept by NSA whistleblower Edward Snowden, GCHQ secretly coveted the NSA’s vast troves of private communications and sought “unsupervised access” to its data as recently as last year – essentially begging to feast at the NSA’s table while insisting that it only nibbles on the occasional crumb.

The document, dated April 2013, reveals that GCHQ requested broad new authority to tap into data collected under a law that authorizes a variety of controversial NSA surveillance initiatives, including the PRISM program.

PRISM is a system used by the NSA and the FBI to obtain the content of personal emails, chats, photos, videos, and other data processed by nine of the world’s largest internet companies, including Google, Yahoo!, Microsoft, Apple, Facebook, and Skype. The arrangement GCHQ proposed would also have provided the British agency with greater access to millions of international phone calls and emails that the NSA siphons directly from phone networks and the internet.

The Snowden files do not indicate whether NSA granted GCHQ’s request, but they do show that the NSA was “supportive” of the idea, and that GCHQ was permitted extensive access to PRISM during the London Olympics in 2012. The request for the broad access was communicated at “leadership” level, according to the documents. Neither agency would comment on the proposed arrangement or whether it was approved.

This is hard to square with the report by the UK’s communications interception commissioner, which found GCHQ’s arrangements with the NSA to have been within the law and said that the agency was not engaged in “indiscriminate random mass intrusion.”

Greenwald thinks that the newly revealed documents raise questions about the full extent of the clandestine cooperation and about whether information about it has been withheld from lawmakers.

He interviewed Julian Huppert, the Lib-Dem MP for Cambridge who served on a committee that reviewed – and recommended against – the Communications Data Bill that the spooks have been pushing.

At no point during that process, Huppert says, did GCHQ disclose the extent of its access to PRISM and other then-secret NSA programs. Nor did it indicate that it was seeking wider access to NSA data – even during closed sessions held to allow security officials to discuss sensitive information. Huppert says these facts were relevant to the review and could have had a bearing on its outcome.

“It is now obvious that they were trying to deliberately mislead the committee,” Huppert told The Intercept. “They very clearly did not give us all the information that we needed.”

Surprise, surprise.

Making sense of Snowden

[link] Saturday, April 19th, 2014

This is a fantastic example of how to conduct an academic discussion of a really contentious subject. It brings together academics and NSA people to talk calmly about what’s happened and what it means. The participants are Yochai Benkler, Bruce Schneier, and Jonathan Zittrain of the Berkman Center and John DeLong and Anne Neuberger of the National Security Agency. The conversation is expertly moderated by the Berkman Faculty Director Terry Fisher.

It runs for 90 minutes, but is really worth it. So book some time off and watch.

Some thoughts triggered by it, in no particular order…

  1. Tempting though it might be, I see little point in demonising the agencies (NSA/GCHQ). Most of the people who work in them are conscientious officials engaged on a mission which they believe to be important and necessary. One interesting aspect of the Snowden revelations is that they contain few, if any, horror stories of “bad apples” or corrupt officials abusing their powers. This doesn’t mean that such scandals don’t exist, but my hunch is that this is very different from, say, what went on in MI5 and the CIA during the Kennedy/Nixon/Reagan eras.
  2. The discussion so far has focussed too much on the details of the surveillance programs, and not enough on what the existence of such programs means for society and democracy.
  3. ‘Oversight’ has been interpreted as checking that the agencies strictly adhere to the rules that have been set for them by legislation and executive order. It seems clear already that much of this oversight has been inadequate and flawed. But there has been very little discussion of democratic oversight of the rule-making process itself. It is important, of course, to ensure that rules set by Parliament or Congress are being obeyed at the execution level. But what is equally important – and thus far under-discussed – is whether the rules that have been created by politicians are themselves wise, effective and proportionate. There is little comfort to be derived from government assurances that everything done by NSA/GCHQ is “lawful” if the laws themselves are flawed.
  4. There is an important difference between espionage and bulk surveillance: the former is directed or targeted; the latter is generalised and indiscriminate.
  5. In a way, the agencies were set an impossible task by politicians in the aftermath of 9/11. “Never again” was both the letter and the spirit of the injunction. Societies must never again be vulnerable to the terrible things that terrorists might dream up and conspire about. Charged with this terrible responsibility, the agencies attempted to forewarn against any conceivable threat, and the only way they could invent to do that involved the kind of comprehensive surveillance that Snowden reveals. What we don’t know – yet – is whether the agencies were actually doing this kind of surveillance before 9/11, in which case there would be some further awkward questions to be asked.
  6. The “war on terror” proved to be a really pernicious ploy. A state of war implies an existential threat to the nation, which justifies and legitimates very drastic measures. Between 1939 and 1945, for example, Britain was effectively a totalitarian state, and all kinds of civil liberties were drastically curtailed and infringed; but the citizenry grudgingly or willingly accepted these conditions because they understood the existential threat. But the “war on terror” is not a war in that sense; it’s merely a rhetorical device. It did, however, provide ideological – and in some cases legal – cover for massive extensions of intrusive surveillance.
  7. Secrecy is always a tricky concept for democracies because, on the one hand, democracy requires openness and publicity (to ensure that citizens can give their consent to what is being done in their name by state actors); but at the same time, democracies may legitimately need to engage in some activities which have to be kept secret. In some cases, secrecy is legitimate: in 1962, for example, the Cuban missile crisis was resolved by President Kennedy’s decision to offer the prospect of withdrawal of American missiles based in Turkey in return for a Soviet decision to withdraw their missiles from Cuba. This offer was kept secret from the American public for the very good reason that if it had been made public it might have undermined congressional and public support for the president’s handling of the crisis.
  8. Democracies therefore are always trying to strike a balance between openness and secrecy. This can be a very hard balance to strike, so not surprisingly democracies tend to fudge the issue by offering to lift the veil of secrecy just far enough to provide a semblance of accountability. One of the things we have learned from the Snowden affair is how threadbare this semblance is. What we have, as one shrewd commentator observed, is not real oversight but “oversight theatre”.
  9. A useful way to conceptualise the problem is to imagine a horizontal line. Activities above the line – for example legislative rule-making – take place in public. This is where policy is formed. Below the line is the area of policy execution by the agencies, and is hidden from the public.
  10. It would be naive to assume that the agencies confine themselves just to execution. They may sometimes be active above the line – for example in framing legislation which meets their needs but which is couched in terms that conceal from an ignorant public and a complacent or incompetent legislature the real import of the legislation. This process has been especially visible since 9/11. In that context, it’s interesting that the legislator who co-authored the Patriot Act has publicly declared his dismay at discovering (pace Snowden) what his statute has supposedly authorised. And in Britain it’s clear that directors of security organisations can play an important role in framing legislation.
  11. In Britain there is a deeply-ingrained tradition of political deference to the security services. This could be because Britain is a society that is more hierarchical and deferential than most. Or it could be that sentiment rules: GCHQ, for example, is seen as the spiritual heir of the wartime Bletchley Park codebreakers, and thus rides on their heroic coat-tails. Whatever the explanation, there are suspicions that budgetary and other proposals from senior security officials receive more favourable treatment in Whitehall than do comparable demands from “civilian” departments. One former senior member of the Blair government told me that in all his time in the Cabinet he could not recall a single instance in which a request from MI5/MI6/GCHQ was turned down by Tony Blair.
  12. Politicians in most Western democracies – including the United States and United Kingdom – are astonishingly ignorant about the capabilities and potential of computing and communication technologies. The proposition that such politicians might be capable of maintaining effective ‘oversight’ of technologically-adept agencies is implausible.
  13. Allied to politicians’ technological ignorance is the fact that “hacker culture” is an entirely alien world to them. This is important in considering the possibility of “mission creep” by surveillance agencies which are staffed by large numbers of talented software engineers. The Snowden revelations include a few examples of what programmers call “cool hacks” which are indicative of technological exuberance and associated mission creep.
  14. Even if we accept that the NSA has strictly adhered to the rules laid down by Congress, there is the problem that some of the activities revealed by Snowden are nowhere mentioned in the rules. Congress, for example, did not mandate that the RSA encryption which supposedly secures the bulk of commercial transactions on the open Internet should be covertly compromised by the agency. Nor did Congress mandate that the NSA should approach Microsoft after it acquired Skype with the demand/request that the technology be modified in order to facilitate surveillance of VoIP communications.
  15. One of the most perplexing aspects of the whole surveillance question is why citizens of some of the most-surveilled societies seem relatively relaxed about it. There are, of course, cultural differences at work here – Germans, for example, seem to be much more concerned about the Snowden revelations than are Britons.
  16. The Snowden revelations demonstrate the extent to which what one might call the National Surveillance State is a public-private enterprise. In a sense the state has covertly outsourced some of the surveillance to major Internet companies and telecommunications organisations. This is hardly surprising given that the core business of both the NSA/GCHQ and the Internet giants (Google, Yahoo, Facebook, Microsoft) is intensive, detailed, comprehensive surveillance. The only real difference is that the companies claim it is being done with the consent of their users – as registered by their acceptance of the terms and conditions imposed by corporate EULAs (End User Licence Agreements).
  17. One strange aspect of the whole business is the way the US government appeared unaware of the threat that exposure of NSA activities would pose to the country’s big technology companies. It’s inconceivable that policy makers would not have considered the damage that exposure would do. Or is it? Was it simply (see the earlier comment about the cluelessness of politicians in this area) that the risk never crossed what might loosely be called their minds?
  18. The biggest question of all — and the one least discussed – is whether the kind of comprehensive surveillance revealed by Snowden and other whistleblowers is compatible with any meaningful conception of democracy.