MyTrojan

Here’s something from Insecure.org to make Rupert Murdoch choke on his muesli.

Overview
========

Myspace.com provides a site navigation menu near the top of every page.

Users generally use this menu to navigate to the various areas of the website. The first link that the menu provides is called “Home”, which navigates back to the user’s personalized Myspace page, essentially the user’s “home base” when using the site. As such, this particular link is used quite frequently to return from other areas of the website, most importantly from other users’ profile pages.

A content-replacement attack coupled with a spoofed Myspace login page can be used to collect victim users’ authentication credentials. If the attacker replaces the navigation menu on his or her own Myspace profile page, an unsuspecting victim may be redirected to an external site of the attacker’s choice, such as a spoofed Myspace login page. Due to Myspace.com’s seemingly random tendency to expire user sessions or log users out, being presented with the Myspace login page is not out of the ordinary and does not raise much suspicion on the part of the victim.

Impact
======

Users are unexpectedly redirected to a website of the attacker’s choice.

Users may be tricked into revealing their authentication credentials.

Affected Systems
================

Myspace.com: http://www.myspace.com

Here’s GMSV’s account:

Some MySpace users are getting their first taste of an STD — a socially transmitted disease. Identity thieves are using a vulnerability in the popular social network’s navigation to spread a particularly virulent worm that steals log-in credentials and lures users to phishing sites. Attacks begin with a rigged QuickTime video. “Once a user’s MySpace profile is infected (by viewing a malicious embedded QuickTime video), that profile is modified in two ways,” WebSense explains. “The links in the user’s page are replaced with links to a phishing site, and a copy of the malicious QuickTime video is embedded into the user’s site. Any other users who visit this newly-infected profile may have their own profile infected as well.” MySpace hasn’t revealed the extent of the infection, but an informal scan of 150 user profiles by FaceTime Communications found that close to a third were infected. That same ratio probably doesn’t translate to MySpace’s 73 million registered users — if it did we’d have a Black Death-style Web pestilence on our hands. So in the end this mostly serves as a reminder that everyone needs to pay more attention to security. “We’re continuing to make the same mistakes by putting security last,” Billy Hoffman, lead engineer at Web security specialist SPI Dynamics, recently told News.com. “People are buying into this hype and throwing together ideas for Web applications, but they are not thinking about security, and they are not realizing how badly they are exposing their users.”
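To make the mechanism concrete, here is a minimal TypeScript sketch, mine rather than anything from the advisory or Websense’s analysis, of how one might scan a profile’s navigation links for the kind of content replacement described above. The host names, link data and function names are invented for illustration.

```typescript
// Minimal sketch (not MySpace's or Websense's code): flag profile nav links
// whose targets have been replaced with off-site destinations.

interface NavLink {
  label: string;
  href: string;
}

const TRUSTED_HOSTS = new Set(["www.myspace.com", "myspace.com"]);

// Return the links that claim to be site navigation but point elsewhere,
// e.g. a "Home" link rewritten to aim at a spoofed login page.
function findReplacedNavLinks(links: NavLink[]): NavLink[] {
  return links.filter((link) => {
    try {
      const host = new URL(link.href).hostname.toLowerCase();
      return !TRUSTED_HOSTS.has(host);
    } catch {
      return true; // malformed URLs are suspicious too
    }
  });
}

// Example: the second entry mimics the content-replacement attack described above.
const navMenu: NavLink[] = [
  { label: "Browse", href: "http://www.myspace.com/browse" },
  { label: "Home", href: "http://login-myspace.example.net/index.cfm" },
];

console.log(findReplacedNavLinks(navMenu)); // -> the spoofed "Home" link
```

Nothing here is specific to Myspace; the point is simply that a “Home” link which leads off the site is a red flag.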

Carbon footprints

My friend and OneWorld colleague, Peter Armstrong, never does anything by halves. About two years ago he decided that he wanted to reduce his family’s carbon footprint (which was high because he and his partner Anuradha have to do a lot of air-travel). He started with their house in Oxfordshire and installed a heat-pump as well as doing a lot of insulation etc. He also blogged the entire process in a fascinatingly open way. Here is his assessment of where they’ve got to after the first year of the new regime.

October marked the end of the first year with the heat pump and the other energy saving measures we have put in place. The results are very interesting and to some extent surprising. We can look at them in a number of different ways.

Our baseline was 2004, when our heating oil cost £2,431 and our electricity £2,292, giving a total energy cost for the house of £4,723.

Now in 2006 (Oct 2005-Oct 2006) we have only electricity to consider. This breaks down as non-heat pump £1,481 and heat pump £1,663, giving a total energy cost of £3,144.

So we may conclude that we have a crude saving of £1,579 on the year, about half from using less general electricity and half from using the heat pump instead of oil.

Perhaps more interestingly, the cost of oil in 2006 would have been £3,403, which would have made us another £1,000 worse off.

So we could say that the heat pump (cost £13,000) will pay for itself in seven years at 2006 oil prices…
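For what it’s worth, here is the arithmetic behind that estimate laid out as a small TypeScript sketch, using only the figures quoted above (the rounding is mine):

```typescript
// Reproducing Peter's figures (all in GBP) to show how the payback estimate falls out.

const baseline2004 = { oil: 2431, electricity: 2292 };             // total: 4723
const year2006 = { nonHeatPumpElectricity: 1481, heatPump: 1663 }; // total: 3144

const baselineTotal = baseline2004.oil + baseline2004.electricity;
const total2006 = year2006.nonHeatPumpElectricity + year2006.heatPump;
const crudeSaving = baselineTotal - total2006;                     // 1579 on the year

// What heating with oil would have cost at 2006 prices, per the post:
const oilAt2006Prices = 3403;
const heatingSaving = oilAt2006Prices - year2006.heatPump;         // ~1740 per year

const heatPumpCost = 13000;
const paybackYears = heatPumpCost / heatingSaving;                 // ~7.5 years

console.log({ baselineTotal, total2006, crudeSaving, heatingSaving, paybackYears });
```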

The economics of abundance

This morning’s Observer column

MIPS is to computer geeks what BHP (brake horse-power) is to Jeremy Clarkson. It is an acronym for ‘Millions of Instructions Per Second’, a measure of the speed of a central processing unit (CPU). Mips measures raw CPU performance, but not overall system performance, which is determined by lots of factors (such as disk speed and how quickly data moves in and out of RAM), so it would be foolish to use it as the only measure of how powerful your computer is. But Mips is an interesting indicator none the less….
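By way of illustration only, here is a crude TypeScript (Node.js) sketch that times a tight loop and reports millions of operations per second. It is not a real MIPS benchmark, since it counts high-level loop iterations rather than machine instructions, but it gives a feel for the kind of number the metric produces.

```typescript
// A very crude illustration (not a real MIPS benchmark): time a tight loop and
// report millions of high-level operations per second. Real MIPS figures count
// machine instructions, which this deliberately does not attempt.

function roughMopsEstimate(iterations: number): number {
  const start = process.hrtime.bigint();
  let acc = 0;
  for (let i = 0; i < iterations; i++) {
    acc += i & 7; // trivial work so the loop body isn't empty
  }
  const elapsedNs = Number(process.hrtime.bigint() - start);
  const opsPerSecond = iterations / (elapsedNs / 1e9);
  return opsPerSecond / 1e6; // millions of (high-level) operations per second
}

console.log(`~${roughMopsEstimate(50_000_000).toFixed(0)} million loop iterations/sec`);
```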

The problem with programming

Interesting Technology Review interview with Bjarne Stroustrup, the guy who dreamed up C++. Excerpt:

Technology Review: Why is most software so bad?
Bjarne Stroustrup: Some software is actually pretty good by any standards. Think of the Mars Rovers, Google, and the Human Genome Project. That’s quality software! Fifteen years ago, most people, and especially most experts, would have said each of those examples was impossible. Our technological civilization depends on software, so if software had been as bad as its worst reputation, most of us would have been dead by now.

On the other hand, looking at “average” pieces of code can make me cry. The structure is appalling, and the programmers clearly didn’t think deeply about correctness, algorithms, data structures, or maintainability. Most people don’t actually read code; they just see Internet Explorer or Windows “freeze,” have their cell phone drop a call, read the latest newspaper story about viruses, and they shudder.

I think the real problem is that “we” (that is, we software developers) are in a permanent state of emergency, grasping at straws to get our work done. We perform many minor miracles through trial and error, excessive use of brute force, and lots and lots of testing, but–so often–it’s not enough.

Software developers have become adept at the difficult art of building reasonably reliable systems out of unreliable parts. The snag is that often we do not know exactly how we did it: a system just “sort of evolved” into something minimally acceptable. Personally, I prefer to know when a system will work, and why it will.

TR: How can we fix the mess we are in?
BS: In theory, the answer is simple: educate our software developers better, use more-appropriate design methods, and design for flexibility and for the long haul. Reward correct, solid, and safe systems. Punish sloppiness.

In reality, that’s impossible. People reward developers who deliver software that is cheap, buggy, and first. That’s because people want fancy new gadgets now. They don’t want inconvenience, don’t want to learn new ways of interacting with their computers, don’t want delays in delivery, and don’t want to pay extra for quality (unless it’s obvious up front–and often not even then). And without real changes in user behavior, software suppliers are unlikely to change.

We can’t just stop the world for a decade while we reprogram everything from our coffee machines to our financial systems. On the other hand, just muddling along is expensive, dangerous, and depressing. Significant improvements are needed, and they can only come gradually. They must come on a broad front; no single change is sufficient…

It’s a good interview, worth reading in full. There’s a lovely exchange towards the end:

TR: How do you account for the fact that C++ is both widely criticized and resented by many programmers but at the same time very broadly used? Why is it so successful?
BS: The glib answer is, There are just two kinds of languages: the ones everybody complains about and the ones nobody uses.

This email address will self-destruct in ten minutes…

Here’s a neat idea for dealing with sites which won’t let you use them unless you provide a valid email address that they can then use to spam you: 10 Minute Mail. The blurb reads:

Welcome to 10 Minute Mail. By clicking on the link below, you will be given a temporary e-mail address. Any e-mails sent to that address will show up automatically on the web page. You can read them, click on links, and even reply to them. The e-mail address will expire after 10 minutes. Why would you use this? Maybe you want to sign up for a site which requires that you provide an e-mail address to send a validation e-mail to. And maybe you don’t want to give up your real e-mail address and end up on a bunch of spam lists. This is nice and disposable. And it’s free.
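The idea is simple enough to fit in a few lines. Here is a minimal TypeScript sketch of the expiring-mailbox concept; it is my own illustration, not 10 Minute Mail’s implementation, and the address format and function names are invented.

```typescript
// Minimal sketch of the disposable-address idea: hand out a random address,
// keep its messages in memory, and refuse delivery once ten minutes have passed.

interface Mailbox {
  address: string;
  expiresAt: number;   // epoch milliseconds
  messages: string[];
}

const TTL_MS = 10 * 60 * 1000;
const mailboxes = new Map<string, Mailbox>();

function createTemporaryAddress(): Mailbox {
  const address = `${Math.random().toString(36).slice(2, 10)}@example.invalid`;
  const box: Mailbox = { address, expiresAt: Date.now() + TTL_MS, messages: [] };
  mailboxes.set(address, box);
  return box;
}

function deliver(address: string, message: string): boolean {
  const box = mailboxes.get(address);
  if (!box || Date.now() > box.expiresAt) {
    mailboxes.delete(address); // expired or unknown: drop the mail
    return false;
  }
  box.messages.push(message);
  return true;
}

// Usage: sign up somewhere with box.address, read box.messages within ten minutes.
const box = createTemporaryAddress();
deliver(box.address, "Please click this link to validate your account...");
console.log(box.address, box.messages);
```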

The dictatorship of the presentation layer

Bill Thompson is eloquently sceptical about Web 2.0. (I prefer the term techBubble 2.0 btw.) Here’s a sample of his Register blast:

If Web 2.0 is the answer then we are clearly asking the wrong question, and we must not be fooled by the cool sites and apparently open APIs. Most of the effort is – literally – window dressing, designed to attract venture capitalists to poorly-considered startups and get hold of enough first-round funding to build either a respectable user base or enough barely runnable alpha code to provide Google or Yahoo! with yet another tasty snack. We need to take a wider view of what is going on.

Back in the 1870s Karl Marx outlined the steps through which he believed a capitalist society needed to pass before it could reach socialism. After the revolution came the dictatorship of the proletariat, a painful but necessary stage of oppression and correction, during which the organs of the state would wither away as humanity achieved its true potential and coercion became unnecessary.

Web 2.0 marks the dictatorship of the presentation layer, a triumph of appearance over architecture that any good computer scientist should immediately dismiss as unsustainable.

Ajax is touted as the answer for developers who want to offer users a richer client experience without having to go to the trouble of writing a real application, but if the long-term goal is to turn the network from a series of tubes connecting clients and servers into a distributed computing environment then we cannot rely on Javascript and XML since they do not offer the stability, scalability or effective resource discovery that we need.

There is a massive difference between rewriting Web pages on the fly with Javascript and reengineering the network to support message passing between distributed objects, a difference that too many Web 2.0 advocates seem willing to ignore. It may have been twenty years since Sun Microsystems trademarked the phrase ‘the network is the computer’ but we’re still a decade off delivering, and if we stick with Ajax there is a real danger that we will never get there…
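For readers who haven’t met the pattern Thompson is criticising, here is a toy TypeScript example of “rewriting Web pages on the fly with Javascript”: fetch some JSON and patch the current page in place. The endpoint and element id are hypothetical, and this is only a sketch of the Ajax pattern, not anyone’s production code.

```typescript
// A toy of the Ajax pattern: fetch a fragment of data and rewrite part of the
// current page in place. Runs in a browser; the endpoint and element id are
// made up for illustration.

async function refreshHeadlines(): Promise<void> {
  const response = await fetch("/api/headlines.json"); // hypothetical endpoint
  const headlines: string[] = await response.json();

  const target = document.getElementById("headlines");
  if (!target) return;

  // "Rewriting the page on the fly": no distributed objects, no resource
  // discovery, just client-side DOM surgery on whatever JSON came back.
  target.innerHTML = headlines.map((h) => `<li>${h}</li>`).join("");
}

refreshHeadlines().catch((err) => console.error("update failed", err));
```

Whatever one makes of Thompson’s conclusion, the contrast he draws is visible even in a toy like this: it is client-side presentation logic, not the distributed message passing that “the network is the computer” implies.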

Vista: the torture begins

This morning’s Observer column

Next Thursday, 30 November, is the feast day of St Andrew, the patron saint of Scotland. Pity he’s not also the patron saint of computer users, because soon they are going to need all the divine help they can get.

How come? Well, 30 November is also the day that Microsoft releases Vista, the new version of Windows, to its corporate customers. Because companies don’t squeal, we may expect the occasion to pass off reasonably peacefully. The screaming proper will only start on 30 January next year, when the system is released to consumers.

Vista, you see, is a new kind of beast. It’s not enough just to install it on your computer; you must also ‘activate’ it…

Oh no — not another article about Wikipedia’s failings

Yet another tired article on the subject of “Can Wikipedia Ever Make the Grade?” Wonder why people continue to publish this stuff — especially a supposedly high-class site like The Chronicle (of Higher Education)? The article starts in the predictable way of such guff — with a good-news story:

Two years ago, when he was teaching at the State University of New York at Buffalo, the professor hatched a plan designed to undermine the site’s veracity — which, at that time, had gone largely unchallenged by scholars. Adopting the pseudonym “Dr. al-Halawi” and billing himself as a “visiting lecturer in law, Jesus College, Oxford University,” Mr. Halavais snuck onto Wikipedia and slipped 13 errors into its various articles. He knew that no one would check his persona’s credentials: Anyone can add material to the encyclopedia’s entries without having to show any proof of expertise.

Some of the errata he inserted — like a claim that Frederick Douglass, the abolitionist, had made Syracuse, N.Y., his home for four years — seemed entirely credible. Some — like an Oscar for film editing that Mr. Halavais awarded to The Rescuers Down Under, an animated Disney film — were more obviously false, and easier to fact-check. And others were downright odd: In an obscure article on a short-lived political party in New Brunswick, Canada, the professor wrote of a politician felled by “a very public scandal relating to an official Party event at which cocaine and prostitutes were made available.”

Mr. Halavais expected some of his fabrications to languish online for some time. Like many academics, he was skeptical about a mob-edited publication that called itself an authoritative encyclopedia. But less than three hours after he posted them, all of his false facts had been deleted, thanks to the vigilance of Wikipedia editors who regularly check a page on the Web site that displays recently updated entries. On Dr. al-Halawi’s “user talk” page, one Wikipedian pleaded with him to “refrain from writing nonsense articles and falsifying information.”

Mr. Halavais realized that the jig was up.

Writing about the experiment on his blog (http://alex.halavais.net), Mr. Halavais argued that a more determined “troll” — in Web-forum parlance, a poster who contributes only inflammatory or disruptive content — could have done a better job of slipping mistakes into the encyclopedia. But he said he was “impressed” by Wikipedia participants’ ability to root out his fabrications. Since then several other high-profile studies have confirmed that the site does a fairly good job at getting its facts straight — particularly in articles on science, an area where Wikipedia excels.

Experienced readers will know what follows next — the “but” clause. And, lo!, here it is:

Among academics, however, Wikipedia continues to receive mixed — and often failing — grades. Wikipedia’s supporters often portray the site as a brave new world in which scholars can rub elbows with the general public. But doubters of the approach — and in academe, there are many — say Wikipedia devalues the notion of expertise itself.

The rest of the piece then rehashes a lot of old stuff that anyone with access to an RSS feed has read a thousand times. What I’d really like to see is something that moves on the discussion about user-generated reference material.

The Wikipedia cycle

Fascinating post by LeeAnn Prescott based on Hitwise data about how people access Wikipedia. The chart shows

the steadily increasing market share of visits to Wikipedia. What you’ll notice upon closer examination is that Wikipedia’s traffic is tied to the academic school year. That bump in December 2005? Finals and term paper time. The subsequent dip? Christmas vacation. The larger bump in May 06? Finals again. Another dip in traffic during the summer months, and another surge in September as school starts.