The benefits of assuming the worst

From Technology Review. What should banks and other ‘secure’ services do when dealing with customers who are incapable of keeping their machines free of malware?

“Our premise,” Ledingham says, “is that, rather than trying to clean up the machines, assume the machine is already infected and focus on protecting the transaction that goes on between the consumer and the enterprise website.”

The problem of malware on users’ computers is “the number-one problem that the financial institutions are wrestling with today,” says Forrester Research senior analyst Geoffrey Turner, an expert on online fraud. Financial institutions can take steps to secure the connections between their servers and their customers’ PCs, Turner says; they can even ensure the security of the customer’s Web browser. But they’re stumped, he says, when it comes to the customer’s operating system. Most successful attempts to steal computer users’ identities, Turner says, involve using malware to capture their credentials or conduct transactions behind the scenes without their knowledge. “The challenge is, how do you secure the end-user computer?” he says. “Should you even, as a bank, be trying to do that?”

Needless to say, Ledingham's answer is "yes". But then it would be: his company, the data-security firm Verdasys, has just released SiteTrust, a tool which aims to protect users from fraud even when their computers have been compromised.
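The premise of protecting the transaction rather than the machine can be sketched in a few lines. To be clear, this is my own hypothetical illustration, not a description of how SiteTrust actually works: suppose the bank has provisioned a secret key to something the malware cannot reach (a hardware token, say), so that each transaction carries an authentication tag which an infected PC cannot forge if it tampers with the details in transit.

```python
import hmac
import hashlib

# Hypothetical: a key shared between the bank and a trusted device the
# customer holds, provisioned out of band. The possibly-infected PC
# never sees it, so browser-resident malware cannot forge tags.
SHARED_KEY = b"provisioned-out-of-band"

def sign_transaction(payee: str, amount_pence: int) -> str:
    """Tag the transaction details on the trusted device."""
    msg = f"{payee}|{amount_pence}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def bank_verifies(payee: str, amount_pence: int, tag: str) -> bool:
    """The bank recomputes the tag over the details it actually received."""
    expected = sign_transaction(payee, amount_pence)
    return hmac.compare_digest(expected, tag)

tag = sign_transaction("ACME Ltd", 5000)
assert bank_verifies("ACME Ltd", 5000, tag)
assert not bank_verifies("Mallory", 5000, tag)  # altered payee fails
```

The point of the sketch: even if malware rewrites the payee or amount inside the browser, the tag no longer matches what the bank recomputes, so the fraudulent transaction is rejected without the PC ever having been trusted.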

Social malware

From Technology Review

Ever since Facebook opened its doors to third-party applications a year and a half ago, millions of users have employed miniature applications to play games, share movie and song recommendations, and even “zombie-bite” their friends. But as the popularity of third-party applications has grown, computer-security researchers have also begun worrying about ways that social-networking applications could be misused. The same thing that makes social networking such an effective way to distribute applications (deep access to a user’s networks of friends and acquaintances) could perhaps make it an ideal way to distribute malicious code…

Interesting article. I’ve been wondering about this ever since Facebook apps arrived.

Genius? What genius?

This morning’s Observer column

In Triumph of the Nerds, Robert Cringely’s 1996 TV documentary series about the rise of the personal computer industry, Steve Jobs was asked what made Apple such an unusual company. ‘It comes down,’ he said, ‘to trying to expose yourself to the best things that humans have done and then try to bring those things into what you’re doing. Picasso had a saying, “good artists copy, great artists steal”, and we have always been shameless about stealing great ideas.’

Before we get too sanctimonious about this, it’s worth remembering that Jobs’s adoption of Picasso’s mantra is what has made Apple such an innovative force in the computer business. Its unique selling proposition is that it takes good ideas and turns them into products that ordinary human beings can use…

Ian Hibell RIP

The Economist has a lovely obit of Ian Hibell, the man who cycled the equivalent of six times round the world.

In a career of hazards, from soldier ants to real soldiers to sleet that cut his face like steel, only motorists did him real damage. The drivers came too close, and passengers sometimes pelted him with bottles (in Nigeria), or with shovelfuls of gravel (in Brazil). In China in 2006 a van drove over his arm and hand. He recovered, but wondered whether his luck would last. It ran out on the road between Salonika and Athens this August, where he was knocked out of the way by a car that appeared to be chasing another.

At bad moments on his trips he had sometimes distracted himself by thinking of Devonian scenes: green fields, thatched cottages and daffodils. He would return to a nice house, a bit of garden, the job. But that thought could never hold him long. Although his body might long for the end of cycling—a flat seat, a straight back, unclenched hands—his mind was terrified of stopping. And in his mind, he never did.

A handbag!!!

Brings back memories of Edith Evans’s “A handbag?” in The Importance of Being Earnest.

Er, I suspect that it’s really the HP Mini-Note (on which I’m writing this) in sheep’s clothing.

PA sacked by Ministry of the Interior

From The Register

The Home Office has today terminated a £1.5m contract with PA Consulting after it lost the personal details of the entire UK prison population.

In August the firm admitted to officials that it had downloaded the prisons database to an unencrypted memory stick, against the security terms of its contract to manage the JTrack prolific offender tracking system. The data included names, addresses and dates of birth, and was broken down by how frequently individuals had offended.

Following an inquiry into the gaffe, Jacqui Smith told the House of Commons today that PA Consulting’s £8m of other Home Office contracts are now also under review. She said: “The Home Office have decided to terminate this contract. My officials are currently working with PA to take this work back in-house without affecting the operation of JTrack.”

Data handling for JTrack has been taken on by the Home Office, and maintenance and training are due to move in-house by December.

The inquiry found the Home Office had transferred the data to PA Consulting securely, but that the firm then dumped it to unlabelled USB memory to transfer it between computers at its premises. The stick hasn’t been found. Smith said: “This was a clear breach of the robust terms of the contract covering security and data handling.”

What took them so long?

Google: the essence

Nick Carr has a thoughtful meditation on Google. Its vitality stems, he thinks,

from the vast number of complements to its core business. Complements are, to put it simply, any products or services that tend to be consumed together. Think hot dogs and mustard, or houses and mortgages. For Google, literally everything that happens on the Internet is a complement to its main business. The more things that people and companies do online, the more ads they see and the more money Google makes. In addition, as Internet activity increases, Google collects more data on consumers’ needs and behavior and can tailor its ads more precisely, strengthening its competitive advantage and further increasing its income. As more and more products and services are delivered digitally over computer networks — entertainment, news, software programs, financial transactions — Google’s range of complements expands into ever more industry sectors. That’s why cute little Google has morphed into The Omnigoogle.

Because the sales of complementary products rise in tandem, a company has a strong strategic interest in reducing the cost and expanding the availability of the complements to its core product. It’s not too much of an exaggeration to say that a company would like all complements to be given away. If hot dogs became freebies, mustard sales would skyrocket. It’s this natural drive to reduce the cost of complements that, more than anything else, explains Google’s strategy. Nearly everything the company does, including building big data centers, buying optical fiber, promoting free Wi-Fi access, fighting copyright restrictions, supporting open source software, launching browsers and satellites, and giving away all sorts of Web services and data, is aimed at reducing the cost and expanding the scope of Internet use. Google wants information to be free because as the cost of information falls it makes more money…

No black holes — but a data tsunami

From CERN

The Large Hadron Collider will produce roughly 15 petabytes (15 million gigabytes) of data annually – enough to fill more than 1.7 million dual-layer DVDs a year!
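CERN’s DVD figure stands up to a quick back-of-envelope check (my arithmetic, not theirs), taking a dual-layer DVD at 8.5 GB and using decimal units throughout:

```python
PB = 10**15           # petabyte in bytes (decimal, as CERN's figure uses)
DVD = 8.5 * 10**9     # dual-layer DVD capacity in bytes

annual_bytes = 15 * PB
dvds_per_year = annual_bytes / DVD                             # ~1.76 million
avg_rate_gbps = annual_bytes * 8 / (365 * 24 * 3600) / 10**9   # ~3.8 Gbit/s

print(round(dvds_per_year / 10**6, 2), "million DVDs,",
      round(avg_rate_gbps, 1), "Gbit/s sustained")
```

So “more than 1.7 million dual-layer DVDs a year” is about right, and it works out to roughly 3.8 gigabits per second of data production sustained around the clock.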

Thousands of scientists around the world want to access and analyse this data, so CERN is collaborating with institutions in 33 different countries to operate a distributed computing and data storage infrastructure: the LHC Computing Grid (LCG).

Data from the LHC experiments is distributed around the globe, with a primary backup recorded on tape at CERN. After initial processing, this data is distributed to eleven large computer centres – in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei, the UK, and two sites in the USA – with sufficient storage capacity for a large fraction of the data, and with round-the-clock support for the computing grid.

These so-called “Tier-1” centres make the data available to over 120 “Tier-2” centres for specific analysis tasks. Individual scientists can then access the LHC data from their home country, using local computer clusters or even individual PCs…

Hopefully, all of this is not orchestrated by Windows servers.