Want to network your Jeep Cherokee? Try smoke signals: they’re safer

This morning’s Observer column:

“Jeep Cherokee hacked in demo; Chrysler owners urged to download patch”, was the heading on an interesting story last week. “Just imagine,” burbled the report, “one moment you’re listening to some pleasant pop hits on the radio, and the next moment the hip-hop station is blasting at full volume – and you can’t change it back! This is just one of the exploits of Charlie Miller and Chris Valasek … when they hacked into a Jeep Cherokee. They were able to change the temperature of the air conditioning, turn on the windshield wipers and blast the wiper fluid to blur the glass, and even disable the brakes, turn off the transmission, take control of the steering, and display their faces onto the dashboard’s screen.”

In some ways, this was an old story: cars have been largely governed by electronics since the 1980s, and anyone who controls the electronics controls the car. But up to now, the electronics have not been connected to the internet. What makes the Jeep Cherokee story interesting is that its electronics were hacked via the internet. And that was possible because internet connectivity now comes as a consumer option – Uconnect – from Chrysler.

If at this point you experience a sinking feeling, then join the club. So let us return to first principles for a moment…

Read on

LATER: Chrysler has issued a recall for 1.4 million vehicles as a result of the hacking revelations.

Common sense about hacking

From the Economist blog:

FOR companies, there are two strategies for dealing with people who uncover flaws in their IT security: a right way and a wrong way. Our leader on hacking this week tells of the approach that Volkswagen took when a group of academics informed it that they had uncovered a vulnerability in a remote-car-key system: the firm slapped a court injunction on them. It is difficult to conceive of an approach more likely to be counter-productive.

United Airlines, it seems, has a far more enlightened attitude. It has just awarded two hackers 1m air miles each after they managed to spot security weak spots in its website. The move is part of a scheme called “bug bounty”, in which hackers are incentivised to contact the company with security flaws, rather than post them online. This approach is common at Silicon Valley firms, and makes just as much sense for old-fashioned industries too. Pound to a penny, there are nefarious types out there trying to break into most big companies’ IT systems. Encouraging “white-hat” hackers to uncover flaws, and then rewarding them for not revealing them to the wider world, may sit uncomfortably with people’s sense of fairness. However, if it gives firms time to fix the problem, in pragmatic terms the benefit is obvious.


Old cryptopanic in new iBottles

Repeat after me:

A ‘backdoor’ for law enforcement is a deliberately introduced security vulnerability, a form of architected breach.

Or, if you’d like the more sophisticated version:

It requires a system to be designed to permit access to a user’s data against the user’s wishes, and such a system is necessarily less secure than one designed without such a feature. As computer scientist Matthew Green explains in a recent Slate column (and, with several eminent colleagues, in a longer 2013 paper) it is damn near impossible to create a security vulnerability that can only be exploited by “the good guys.” Activist Eva Galperin puts the point pithily: “Once you build a back door, you rarely get to decide who walks through it.” Even if your noble intention is only to make criminals more vulnerable to police, the unavoidable cost of doing so in practice is making the overwhelming majority of law-abiding users more vulnerable to criminals.
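The structural point in that passage can be made concrete with a toy sketch. Suppose a system is required to wrap every message key under the user's key *and* under a single law-enforcement "escrow" key. The code below is purely illustrative (a throwaway XOR construction, not real cryptography, and all names are invented for this example): it shows that whoever obtains the escrow key, by whatever means, can decrypt every user's data without their keys and against their wishes.

```python
# Toy model of an escrowed ("back-doored") encryption scheme.
# NOT real cryptography -- a hash-based XOR keystream stands in for a cipher.
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from key (toy stream cipher).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    # XOR data against the keystream; applying it twice decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

ESCROW_KEY = secrets.token_bytes(32)  # the mandated back door

def encrypt_for_user(user_key: bytes, plaintext: bytes) -> dict:
    # The per-message key is wrapped under BOTH the user's key and the
    # escrow key -- the "architected breach" in the design.
    msg_key = secrets.token_bytes(32)
    return {
        "ciphertext": xor(plaintext, msg_key),
        "wrapped_for_user": xor(msg_key, user_key),
        "wrapped_for_escrow": xor(msg_key, ESCROW_KEY),
    }

alice_key = secrets.token_bytes(32)
record = encrypt_for_user(alice_key, b"private message")

# Alice decrypts with her own key, as intended:
msg_key = xor(record["wrapped_for_user"], alice_key)
assert xor(record["ciphertext"], msg_key) == b"private message"

# But ANY holder of the escrow key -- insider, thief, hostile state --
# decrypts the same record without Alice's key:
stolen_key = xor(record["wrapped_for_escrow"], ESCROW_KEY)
assert xor(record["ciphertext"], stolen_key) == b"private message"
```

The weakness is architectural, not an implementation bug: the escrow wrap is a second, system-wide door into every record, and nothing in the design distinguishes a "good guy" holding the escrow key from anyone else.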

Bruce Schneier’s next book

Title: Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World

Publisher: W. W. Norton

Publication date: March 9, 2015

Table of Contents

Part 1: The World We’re Creating
Chapter 1: Data as a By-Product of Computing
Chapter 2: Data as Surveillance
Chapter 3: Analyzing Our Data
Chapter 4: The Business of Surveillance
Chapter 5: Government Surveillance and Control
Chapter 6: Consolidation of Institutional Surveillance

Part 2: What’s at Stake
Chapter 7: Political Liberty and Justice
Chapter 8: Commercial Fairness and Equality
Chapter 9: Business Competitiveness
Chapter 10: Privacy
Chapter 11: Security

Part 3: What to Do About It
Chapter 12: Principles
Chapter 13: Solutions for Government
Chapter 14: Solutions for Corporations
Chapter 15: Solutions for the Rest of Us
Chapter 16: Social Norms and the Big Data Trade-Off

Something to be pre-ordered, methinks.

Neoliberalism’s revolving door

Well, guess what? The former Head of the NSA has found a lucrative retirement deal.

As the four-star general in charge of U.S. digital defenses, Keith Alexander warned repeatedly that the financial industry was among the likely targets of a major attack. Now he’s selling the message directly to the banks.

Joining a crowded field of cyber-consultants, the former National Security Agency chief is pitching his services for as much as $1 million a month. The audience is receptive: Under pressure from regulators, lawmakers and their customers, financial firms are pouring hundreds of millions of dollars into barriers against digital assaults.

Alexander, who retired in March from his dual role as head of the NSA and the U.S. Cyber Command, has since met with the largest banking trade groups, stressing the threat from state-sponsored attacks bent on data destruction as well as hackers interested in stealing information or money.

“It would be devastating if one of our major banks was hit, because they’re so interconnected,” Alexander said in an interview.

Nice work if you can get it. First of all, you use your position in the state bureaucracy to scare the shit out of banks. Then you pitch your services as the guy who can help them escape Nemesis.

The US fears back-door routes into the net because it’s building them too

This morning’s Observer column.

At a remarkable conference held at the Aspen Institute in 2011, General Michael Hayden, a former head of both the NSA and the CIA, said something very interesting. In a discussion of how to secure the “critical infrastructure” of the United States he described the phenomenon of compromised computer hardware – namely, chips that have hidden “back doors” inserted into them at the design or manufacturing stage – as “the problem from hell”. And, he went on, “frankly, it’s not a problem that can be solved”.

Now General Hayden is an engaging, voluble, likable fellow. He’s popular with the hacking crowd because he doesn’t talk like a government suit. But sometimes one wonders if his agreeable persona is actually a front for something a bit more disingenuous. Earlier in the Aspen discussion, for example, he talked about the Stuxnet worm – which was used to destroy centrifuges in the Iranian nuclear programme – as something that was obviously created by a nation-state, but affected not to know that the US was one of the nation-states involved.

Given Hayden’s background and level of security clearance, it seems inconceivable that he didn’t know who built Stuxnet. So already one had begun to take his contributions with a modicum of salt. Nevertheless, his observation about the intractability of the problem of compromised hardware seemed incontrovertible…

Read on.

LATER: I came across this amazing piece of detective work, which uncovers a backdoor installed in some D-Link routers.

Why having a passcode might not protect your iPhone 5s from unauthorised use

Well, well. Alongside the discovery that the iPhone 5s fingerprint system isn’t quite as secure as advertised comes this.

If you have an iPhone 5 or older and have updated your operating system to Apple’s new iOS 7 version, you should be aware that the password (or “passcode”) required on your phone’s lock screen no longer prevents strangers from accessing your phone.

They can use Siri, the voice-command software, to bypass the password screen and access your phone, instead.

The good news is that distressed iPhone 5S owners can apparently foil this workaround by controlling access to Siri in the phone’s settings menu. The trail is: Settings → General → Passcode Lock [enter passcode] → Allow Access When Locked → Siri → switch from green to white.

Smart meters might not be so clever after all

This morning’s Observer column.

Those whom the gods wish to destroy, they first make credulous. In the case of technology, especially technology involving computers, that’s pretty easy to do. Quite why people are so overawed by computers when they are blasé about, say, truly miraculous technologies such as high-speed trains, is a mystery that we will have to leave for another day. The only thing we need to remember is that when important people, for example government ministers, are confronted with what a sceptical friend of mine calls “computery” then they check in their brains at the door of the meeting room. From then on, credulity is their default setting.

In which state, they are easy meat for technological visionaries, evangelists and purveyors of snake oil. This would be touching if it weren’t serious. Exhibit A in this regard is the government’s plan for “smart meters”…

American ‘justice’

This morning’s Observer column.

Do you think that, as a society, the United States has become a basket case? Well, join the club. I’m not just thinking of the country’s dysfunctional Congress, pathological infatuation with firearms, addiction to litigation, crazy healthcare arrangements, engorged prison system, chronic inequality, 50-year-old military-industrial complex and out-of-control security services. There is also its strange irrationality about the use and abuse of computers.

Two events last week provided case studies of this…