Archive for the 'Privacy' Category

Making sense of Snowden

[link] Saturday, April 19th, 2014

This is a fantastic example of how to conduct an academic discussion of a really contentious subject. It brings together academics and NSA people to talk calmly about what’s happened and what it means. The participants are Yochai Benkler, Bruce Schneier, and Jonathan Zittrain of the Berkman Center and John DeLong and Anne Neuberger of the National Security Agency. The conversation is expertly moderated by the Berkman Faculty Director Terry Fisher.

It runs for 90 minutes, but is really worth it. So book some time off and watch.

Some thoughts triggered by it, in no particular order…

  1. Tempting though it might be, I see little point in demonising the agencies (NSA/GCHQ). Most of the people who work in them are conscientious officials engaged on a mission which they believe to be important and necessary. One interesting aspect of the Snowden revelations is that they contain few, if any, horror stories of “bad apples” or corrupt officials abusing their powers. This doesn’t mean that such scandals don’t exist, but my hunch is that this is very different from, say, what went on in MI5 and the CIA during the Kennedy/Nixon/Reagan eras.
  2. The discussion so far has focussed too much on the details of the surveillance programs, and not enough on what the existence of such programs means for society and democracy.
  3. ‘Oversight’ has been interpreted as checking that the agencies strictly adhere to the rules that have been set for them by legislation and executive order. It seems clear already that much of this oversight has been inadequate and flawed. But there has been very little discussion of democratic oversight of the rule-making process itself. It is important, of course, to ensure that rules set by Parliament or Congress are being obeyed at the execution level. But what is equally important – and thus far under-discussed – is whether the rules that have been created by politicians are themselves wise, effective and proportionate. There is little comfort to be derived from government assurances that everything done by NSA/GCHQ is “lawful” if the laws themselves are flawed.
  4. There is an important difference between espionage and bulk surveillance: the former is directed or targeted; the latter is generalised and indiscriminate.
  5. In a way, the agencies were set an impossible task by politicians in the aftermath of 9/11. “Never again” was both the letter and the spirit of the injunction. Societies must never again be vulnerable to the terrible things that terrorists might dream up and conspire about. Charged with this terrible responsibility, the agencies attempted to forewarn against any conceivable threat, and the only way they could devise to do that involved the kind of comprehensive surveillance that Snowden reveals. What we don’t know – yet – is whether the agencies were actually doing this kind of surveillance before 9/11, in which case there would be some further awkward questions to be asked.
  6. The “war on terror” proved to be a really pernicious ploy. A state of war implies an existential threat to the nation, which justifies and legitimates very drastic measures. Between 1939 and 1945, for example, Britain was effectively a totalitarian state, and all kinds of civil liberties were drastically curtailed and infringed; but the citizenry grudgingly or willingly accepted these conditions because they understood the existential threat. But the “war on terror” is not a war in that sense; it’s merely a rhetorical device. It did, however, provide ideological – and in some cases legal – cover for massive extensions of intrusive surveillance.
  7. Secrecy is always a tricky concept for democracies because, on the one hand, democracy requires openness and publicity (to ensure that citizens can give their consent to what is being done in their name by state actors); but at the same time, democracies may legitimately need to engage in some activities which have to be kept secret. In some cases, secrecy is legitimate: in 1962, for example, the Cuban missile crisis was resolved by President Kennedy’s decision to offer the prospect of withdrawal of American missiles based in Turkey in return for a Soviet decision to withdraw their missiles from Cuba. This offer was kept secret from the American public for the very good reason that if it had been made public then it might have undermined congressional and public support for the president’s handling of the crisis.
  8. Democracies therefore are always trying to strike a balance between openness and secrecy. This can be a very hard balance to strike, so not surprisingly democracies tend to fudge the issue by offering to lift the veil of secrecy just far enough to provide a semblance of accountability. One of the things we have learned from the Snowden affair is how threadbare this semblance is. What we have, as one shrewd commentator observed, is not real oversight but “oversight theatre”.
  9. A useful way to conceptualise the problem is to imagine a horizontal line. Activities above the line – for example legislative rule-making – take place in public. This is where policy is formed. Below the line is the area of policy execution by the agencies, and is hidden from the public.
  10. It would be naive to assume that the agencies confine themselves just to execution. They may sometimes be active above the line – for example in framing legislation which meets their needs but which is couched in terms that conceal from an ignorant public and a complacent or incompetent legislature the real import of the legislation. This process has been especially visible since 9/11. In that context, it’s interesting that the legislator who co-authored the Patriot Act has publicly declared his dismay at discovering (pace Snowden) what his statute has supposedly authorised. And in Britain it’s clear that directors of security organisations can play an important role in framing legislation.
  11. In Britain there is a deeply-ingrained tradition of political deference to the security services. This could be because Britain is a society that is more hierarchical and deferential than most. Or it could be that sentiment rules: GCHQ, for example, is seen as the spiritual heir of the wartime Bletchley Park codebreakers, and thus rides on their heroic coat-tails. Whatever the explanation, there are suspicions that budgetary and other proposals from senior security officials receive more favourable treatment in Whitehall than do comparable demands from “civilian” departments. One former senior member of the Blair government told me that in all his time in the Cabinet he could not recall a single instance in which a request from MI5/MI6/GCHQ was turned down by Tony Blair.
  12. Politicians in most Western democracies – including the United States and United Kingdom – are astonishingly ignorant about the capabilities and potential of computing and communication technologies. The proposition that such politicians might be capable of maintaining effective ‘oversight’ of technologically-adept agencies is implausible.
  13. Allied to politicians’ technological ignorance is the fact that “hacker culture” is an entirely alien world to them. This is important in considering the possibility of “mission creep” by surveillance agencies which are staffed by large numbers of talented software engineers. The Snowden revelations include a few examples of what programmers call “cool hacks” which are indicative of technological exuberance and associated mission creep.
  14. Even if we accept that the NSA has strictly adhered to the rules laid down by Congress, there is the problem that some of the activities revealed by Snowden are nowhere mentioned in the rules. Congress, for example, did not mandate that the RSA encryption which supposedly secures the bulk of commercial transactions on the open Internet should be covertly compromised by the agency. Nor did Congress mandate that the NSA should approach Microsoft after it acquired Skype with the demand/request that the technology should be modified in order to facilitate surveillance of VoIP communications.
  15. One of the most perplexing aspects of the whole surveillance question is why citizens of some of the most-surveilled societies seem relatively relaxed about it. There are, of course, cultural differences at work here – Germans, for example, seem to be much more concerned about the Snowden revelations than are Britons.
  16. The Snowden revelations demonstrate the extent to which what one might call the National Surveillance State is a public-private enterprise. In a sense the state has covertly outsourced some of the surveillance to major Internet companies and telecommunications organisations. This is hardly surprising given that the core business of both the NSA/GCHQ and the Internet giants (Google, Yahoo, Facebook, Microsoft) is intensive, detailed, comprehensive surveillance. The only real difference is that the companies claim it is being done with the consent of their users – as registered by their acceptance of the terms and conditions imposed by corporate EULAs (End User Licence Agreements).
  17. One strange aspect of the whole business is the way the US government appeared unaware of the threat that exposure of NSA activities would pose to the country’s big technology companies. It’s inconceivable that policy makers would not have considered the damage that exposure would do. Or is it? Was it just (see the earlier comment about the cluelessness of politicians in this area) that the risk never crossed what might loosely be called their minds?
  18. The biggest question of all – and the one least discussed – is whether the kind of comprehensive surveillance revealed by Snowden and other whistleblowers is compatible with any meaningful conception of democracy.

Military-Industrial Complex 2.0

[link] Sunday, March 23rd, 2014

This morning’s Observer column.

As they burgeoned, the big internet companies looked with disdain on the leviathans of the military-industrial complex. Kinetic warfare seemed so yesterday to those whose corporate mantras were about “not being evil” and adhering to “the hacker’s way”. So when Snowden revealed NSA claims that the spooks had untrammelled access to their servers the companies reacted like nuns accused of running a webcam porn site. It wasn’t true, they protested, and even if it was they knew nothing about it. Of course they did comply with government requests approved by a secret court, but that was the extent of it. As the months rolled by, however, this reassuring narrative has unravelled. We discovered that the NSA and GCHQ had indeed covertly tapped the data-traffic that flows between the companies’ server farms. But since Google and co were – they claimed – unaware of this, perhaps their protestations of innocence seemed justified. More embarrassing were the revelations about the astonishing lengths to which one company (Microsoft) went to facilitate NSA access to its users’ private communications.

Last Wednesday, another piece of the jigsaw slotted into place. The NSA’s top lawyer stated unequivocally that the technology firms were fully aware of the agency’s widespread collection of data. Rajesh De, the NSA general counsel, said that all communications content and associated metadata harvested by the NSA occurred with the knowledge of the companies – both for the Prism system and the covert tapping of communications moving across the internet.

Here we go again: another messaging app, more illusions of privacy and security

[link] Sunday, March 9th, 2014

Post updated — see below.

Simon Davies has an interesting take on the fallout from Facebook’s acquisition of WhatsApp.

In one of the most persuasive displays ever of the market power of consumer privacy, Facebook’s recent $19BN acquisition of the popular messaging app WhatsApp appears to have been given the thumbs-down by millions of users.

While it may be too early to produce a conclusive analysis, there are solid indications that the trend of new sign-ups to messaging apps over the past two weeks has overwhelmingly favoured the privacy-friendly Telegram app and has shifted decisively away from WhatsApp. Telegram has reportedly picked up between two and three million new users a day since the purchase was announced just over two weeks ago.

Davies says that “Telegram has built a range of attractive privacy features, including heavy end-to-end encryption and a message destruct function. As a result, many privacy professionals regard the app as the market leader for privacy.”

Hmmm… Davies points out that a German product test group recently criticised Telegram, on the grounds that

Telegram ist als einzige der getesteten Apps zumindest teilweise quelloffen. Eine vollständige Analyse der verschlüsselten Datenübertragung war jedoch aufgrund der nur partiell einsehbaren Software-Programmierung nicht möglich…

…which I interpret as a view that judgement has to be withheld because the Telegram code is not fully open source — and therefore not open to independent scrutiny.

Anyway, intrigued, I downloaded the iOS version of the Telegram App to see what the fuss was about. The download was quick and efficient. The interface is clean. To get started you enter your mobile number and Telegram sends you a code which you then enter to confirm that it is indeed your phone. It then asks for access to your phone contacts which, it tells you, will be stored in the Cloud in heavily encrypted form…

Oh yeah? Can’t you just imagine the hoots of laughter in Fort Meade!

LATER: A colleague who is less linguistically-challenged than me writes:

I’m not sure that Simon Davies or you got the right angle on that test.de report on WhatsApp and alternatives. It’s true that test.de didn’t like it much, but their point about open source in the part you quoted is actually quite positive – it’s saying that it’s the only one of the apps they looked at that was even partly open source. A translation of the bit you quoted would be something like: “Telegram is, at least, the only one of the apps we tested that is partly open source. However, because the programming is only partly transparent, a complete analysis of its encrypted data transmission was not possible.” And the next sentence goes on to say, “But the testers can rule out the possibility that it transmits data unencrypted.”

That’s actually more positive than what they say in the corresponding section about any of the other apps, where they generally say they aren’t open source so that the testers can’t be sure that some data are not transmitted in unencrypted form.

Obviously that’s not a killer point for the German testers, however, because the only app they didn’t regard as having important problems is Threema, which isn’t open source.

What they didn’t like about Telegram is that:
* You have to choose explicitly to use encrypted transmission by choosing the “Secret Chat” option.
* The app automatically stores all your address book (contact) entries without asking you or asking the other people in the address book.
* In their conditions of use, users agree that the software house can store the user’s address book entries. No official address details (‘Impressum’) are given for the software house and there’s no contact address where you can ask questions about data protection.

He’s put his finger on the biggest problem, in a way, which is not just that the App’s owners require you to upload your contact information to the Cloud, but that by accepting this requirement you compromise all those contacts without their knowledge or consent. This is the point that Eben Moglen was making in his wonderful Snowden lectures when he pointed out that acceptance of Gmail’s Terms and Conditions allows Google not only to read your own mail, but also that of your correspondents, none of whom have consented to that. (Though no doubt a slick lawyer will try on the argument that anyone who emails someone with a Gmail address implicitly gives his/her consent.)

Why your health secrets may no longer be safe with your GP

[link] Tuesday, January 28th, 2014

Last Sunday’s Observer column about the NHS plan to create a national database of health records.

Those planning this healthcare data-grab are clearly hoping that citizen inertia will enable them to achieve their aim, which is to make our most intimate personal details available for data-mining by “approved researchers”. If they succeed, then, starting in March, the medical data of everyone who has not opted out will be uploaded to the repository controlled by the NHS information centre. And for the first time the medical history of the entire nation will have been stored in one place.

What’s wrong with this?

How long have you got?

More fallout from the NSA revelations

[link] Friday, January 24th, 2014

From today’s New York Times

For years, Microsoft has let its customers in Europe, including businesses and organizations, keep their online data close to them. The company operates big data centers in Amsterdam and Dublin for that very purpose.

It now looks as if the company will deepen its commitment to letting those customers decide where their information is stored, at least partly because of concern about spying by the National Security Agency.

In an interview with The Financial Times, Brad Smith, Microsoft’s general counsel, said the company’s customers should be able to “make an informed choice of where their data resides.”

“Technology today requires that people have a high degree of trust in the services they are using,” he told the paper. “The events of the last year undermine some of that trust,” he said. “That is one of the reasons new steps are needed to address it.”

Interesting. In some ways, Microsoft is closer to the business community than are Google & Co. They may also be sensitive to the fact that some big European companies (e.g. Siemens) are offering European-based cloud services.

Even our grunts could be monetised by Facebook

[link] Sunday, December 22nd, 2013

This morning’s Observer column.

As Mark Twain observed: “A lie can travel halfway around the world while the truth is putting on its shoes.” And that was a long time before the web. Which brings us to a meme that was propagating last week through social media. Its essence was an assertion that Facebook monitored – and stored – not only the stuff that its subscribers post on their Facebook pages, but even stuff that they started to type and then deleted! Shock, horror!

Read on…

The US fears back-door routes into the net because it’s building them too

[link] Sunday, October 13th, 2013

This morning’s Observer column.

At a remarkable conference held at the Aspen Institute in 2011, General Michael Hayden, a former head of both the NSA and the CIA, said something very interesting. In a discussion of how to secure the “critical infrastructure” of the United States he described the phenomenon of compromised computer hardware – namely, chips that have hidden “back doors” inserted into them at the design or manufacturing stage – as “the problem from hell”. And, he went on, “frankly, it’s not a problem that can be solved”.

Now General Hayden is an engaging, voluble, likable fellow. He’s popular with the hacking crowd because he doesn’t talk like a government suit. But sometimes one wonders if his agreeable persona is actually a front for something a bit more disingenuous. Earlier in the Aspen discussion, for example, he talked about the Stuxnet worm – which was used to destroy centrifuges in the Iranian nuclear programme – as something that was obviously created by a nation-state, but affected not to know that the US was one of the nation-states involved.

Given Hayden’s background and level of security clearance, it seems inconceivable that he didn’t know who built Stuxnet. So already one had begun to take his contributions with a pinch of salt. Nevertheless, his observation about the intractability of the problem of compromised hardware seemed incontrovertible…

Read on.

LATER: I came across this amazing piece of detective work which uncovers a backdoor installed in some D-Link routers.

Why big data has made your privacy a thing of the past

[link] Sunday, October 6th, 2013

This morning’s Observer column.

Watching the legal system deal with the internet is like watching somebody trying to drive a car by looking only in the rear-view mirror. The results are amusing and predictable but not really interesting. On the other hand, watching the efforts of regulators – whether national ones such as Ofcom, or multinational, such as the European Commission – is more instructive.

At the moment, the commission is wrestling with the problem of how to protect the data of European citizens in a world dominated by Google, Facebook and co. The windscreen of the metaphorical car that the commission is trying to drive has been cracked so extensively that it’s difficult to see anything clearly through it.

So in her desperation, the driver (Viviane Reding, the commission’s vice-president) oscillates between consulting the rear-view mirror and asking passers-by (who may or may not be impartial) for tips about what lies ahead. And just to make matters worse, she also has to deal with outbreaks of fighting between the other occupants of the car, who just happen to be sovereign states and are a quarrelsome bunch at the best of times…

More.

American ‘justice’

[link] Sunday, August 4th, 2013

This morning’s Observer column.

Do you think that, as a society, the United States has become a basket case? Well, join the club. I’m not just thinking of the country’s dysfunctional Congress, pathological infatuation with firearms, addiction to litigation, crazy healthcare arrangements, engorged prison system, chronic inequality, 50-year-old military-industrial complex and out-of-control security services. There is also its strange irrationality about the use and abuse of computers.

Two events last week provided case studies of this…

There are lies, damned lies and… official statements about NSA surveillance

[link] Saturday, August 3rd, 2013