MySpace: the story continues

From this morning’s New York Times

The Universal Music Group, the world’s largest music company, filed a copyright infringement lawsuit yesterday against MySpace, the popular social networking Web site, for allowing users to upload and download songs and music videos.

The suit, which also names MySpace’s corporate parent, the News Corporation, comes as the recording industry contends with how to exploit its copyrighted material online. The issue has taken on more importance as services built around user-generated content become popular and generate advertising revenue.

The lawsuit, filed in federal court in Los Angeles, is seen as part of a strategy by Universal to test provisions of a federal law that provides a “safe harbor” to Internet companies that follow certain procedures to filter out copyrighted works. The law requires sites to remove such content after being notified by the copyright holder…

ACAP, hypocrisy and Orwell

In his column this morning, my colleague Peter Preston mentions ACAP — the initiative launched by the World Association of Newspapers to control access to newspaper sites by search engines. Here’s a useful summary of the proposal:

The World Association of Newspapers, European Publishers Council (EPC), International Publishers Association and European Newspaper Publishers’ Association will pilot an Automated Content Access Protocol (ACAP) beginning 6 October at the Frankfurt Book Fair, said Kaye, who is advising on the project.

ACAP will allow content providers to systematically grant permissions information relating to access and use of content in a form that can be read by ‘crawlers’ so search engine operators and any other users can automatically comply with applicable licenses or policies, the EPC said. There are already existing protocols to help website owners tell search engine ‘spiders’ which areas of a site can be indexed. ACAP will not replace them, but will try to overcome problems such as the simplistic nature of the permissions they control, basically, ‘yes, please spider this page’ or ‘no, please do not spider this page.’

During the 12-month pilot, publishers will develop terms and conditions for the search engines to whom they have given the authority to automatically search and index their works. If successful, the standard will allow all publishers to take a tailored approach to search engines, ultimately enriching users’ experiences, the EPC said. While the project will focus first on the needs of print publishers, it will be usable for every type of online content, including video and audio.

To an Orwellian analyst of language, the interesting phrase is “enriching users’ experiences”. What form will this “enrichment” take? Why, this:

ACAP is supposed to tell a search engine something like this: ALLOW, but only for two weeks, then delete from cache and redirect to payment gateway instead.

I can see why newspapers would want to do this, but the only “enrichment” that would follow from it is theirs.
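To make the “allow, then expire” idea above concrete: ACAP’s actual syntax had not been published at the time of writing, so what follows is a purely hypothetical sketch in Python of how a crawler might interpret a time-limited directive of that kind. The directive format, field names (`max-age`, `on-expiry`) and the payment-gateway URL are all invented for illustration.

```python
from datetime import datetime, timedelta

def interpret_directive(directive: str, fetched_at: datetime, now: datetime) -> str:
    """Interpret a hypothetical ACAP-style directive such as:
       'ALLOW max-age=14d on-expiry=redirect:https://example.com/pay'
    and return what the crawler should do with its cached copy."""
    parts = directive.split()
    if not parts or parts[0] != "ALLOW":
        return "do-not-index"
    # Parse key=value fields after the ALLOW keyword.
    fields = dict(p.split("=", 1) for p in parts[1:] if "=" in p)
    max_age = fields.get("max-age", "")
    if max_age.endswith("d"):
        ttl = timedelta(days=int(max_age[:-1]))
        if now - fetched_at > ttl:
            # The permitted period has elapsed: drop the cached copy and
            # honour the publisher's on-expiry action (e.g. a payment gateway).
            return fields.get("on-expiry", "delete-from-cache")
    return "serve-from-cache"
```

So a page fetched twenty days ago under a fourteen-day `ALLOW` would no longer be served from the cache; the crawler would instead follow the publisher’s redirect, which is exactly the behaviour the quotation describes.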

More seriously: do newspapers really think this is going to help them in the long run? In a way, we’ve been here before — with those arguments years ago about ‘deep linking’. I seem to remember that the New York Times got it and negotiated a deal with Dave Winer which gave blogs access to deep-linked pages while diverting ‘ordinary’ visitors to the paywall gateway. In a networked world, the only way you’re going to have any influence (or be read) is to have properly-linkable content. End of story. And if that doesn’t fit with your existing business model, then maybe you need a new one.

Our cognitive bias against openness

Lovely FT column by James Boyle. Sample:

Studying intellectual property and the internet has convinced me that we have another cognitive bias. Call it the openness aversion. We are likely to undervalue the importance, viability and productive power of open systems, open networks and non-proprietary production.

Test yourself on the following questions. In each case, it is 1991 and I have removed from you all knowledge of the past 15 years.

You have to design a global computer network. One group of scientists describes a system that is fundamentally open – open protocols and systems so anyone could connect to it and offer information or products to the world. Another group – scholars, businessmen, bureaucrats – points out the problems. Anyone could connect to it. They could do anything. There would be porn, piracy, viruses and spam. Terrorists could put up videos glorifying themselves. Your activist neighbour could compete with The New York Times in documenting the Iraq war. Better to have a well-managed system, in which official approval is required to put up a site; where only a few actions are permitted; where most of us are merely recipients of information; where spam, viruses, piracy (and innovation and anonymous speech) are impossible. Which would you have picked?

Set yourself the task of producing the greatest reference work the world has ever seen. It must cover everything from the best Thai food in Raleigh to the annual rice production of Thailand, the best places to see blue whales to the history of the Blue Dog Coalition. Would you create a massive organisation of paid experts with layers of editors producing tomes that are controlled by copyright and trademark? Or would you wait for hobbyists, scientists and volunteer encyclopedists to produce, and search engines to organise, a cornucopia of information? I know which way I would have bet in 1991. But I also know that the last time I consulted an encyclopedia was in 1998….

Billy Bragg and MySpace

Interesting piece in today’s NYT…

When he is not writing or performing protest songs, the British folk-rocker Billy Bragg is apparently reading the fine print.

In May, Mr. Bragg removed his songs from the MySpace.com Web site, complaining that the terms and conditions that MySpace set forth gave the social networking site far too much control over music that people uploaded to it. In media interviews and on his MySpace blog, he said that the MySpace terms of service made it seem as though any content posted on the site, including music, automatically became the site’s property.

Although MySpace had not claimed ownership of his music or any other content, Mr. Bragg said the site’s legal agreement — which included the phrase “a nonexclusive, fully paid and royalty-free worldwide license” — gave him cause for concern, as did the fact that the formerly independent site was now owned by a big company (the News Corporation, which is controlled by Rupert Murdoch).

Mr. Bragg said that he himself had kept most of the copyrights to his recordings, licensing them out to the various record companies that have released his albums over the years. “My concern,” he said in a telephone interview, “is the generation of people who are coming to the industry, literally, from their bedrooms.”

About a month later, without referencing Mr. Bragg’s concerns, MySpace.com clarified its terms of service, which now explain who retains what rights. A sample line: “The license you grant to MySpace.com is nonexclusive (meaning you are free to license your content to anyone else in addition to MySpace.com).”

Jenny Toomey, executive director of the Future of Music Coalition, an advocacy group for musicians that focuses on intellectual property rights, said the Internet could help musicians warn one another about potential contractual problems. “Information is now shared in a different way,” she said, “and artists who are getting a bad deal can connect with each other.”

Mr. Bragg, who said he never had any direct communication with executives from MySpace, has put some of his music back on the site. And he offered some praise for the site’s effectiveness in spreading his message. “That’s the amazing thing about MySpace,” he said. “If you say something, word gets out.”

Net Neutrality: Rules vs. Principles

I found this post from Tim O’Reilly very helpful in thinking about the Net Neutrality debate.

Tim focusses on a helpful distinction made by Chris Savage — between rules and principles. The gist is:

A lot of confusion in the Net Neutrality debate has to do with the hoary distinction in jurisprudence between “rules” and “principles.”

A first approximation for the non-lawyers here: the tax code is full of RULES: Take this number, divide it by that number, place the result on line 17 if it’s greater than $57,206 and on line 19 if it’s less. Etc. RULES are intended to direct or forbid very specific behaviors.

PRINCIPLES, on the other hand, are more general. When driving you are required to use “reasonable care.” If you don’t, then you are negligent and can be held liable, in a tort case, for the damages you cause. And though there are plenty of rules about driving, tort liability is based on the PRINCIPLE of reasonable care, and is assessed on a case-by-case basis.

“Net Neutrality” is a principle, not a rule…

It seems to me that this distinction is useful in all kinds of areas. For example, in relation to IP legislation, an important principle is that monopolies are at best a necessary evil and should be avoided or limited wherever possible. This means that any proposal to extend an IP right (which is, remember, a legislative grant of monopoly rights) should always be viewed with extreme scepticism.

Taking on copyright abusers

This morning’s Observer column

This year, Bloomsday was marked in somewhat different ways. In Dublin, the festivities were cancelled because of the state funeral of Charlie Haughey, the former Taoiseach. Such restraint was entirely out of character with the spirit of Bloomsday, for Haughey was as colourful a rogue as any encountered by Leopold Bloom on his perambulations on the day in 1904 on which the novel is set. The correct thing to do would have been to infiltrate the obsequies, thereby highlighting the absurdity of a political establishment seeking to pretend that Haughey had been somehow a statesman of note.

The failure of Joycean nerve in Dublin was, however, offset by a laudable display of spunk in California, where Professor Lawrence Lessig of Stanford University filed a legal suit against James Joyce’s grandson, Stephen Joyce, in a US district court, accusing the administrator of the writer’s estate of ‘copyright misuse’.

Given that the entire publishing world has been legally intimidated by Stephen Joyce for decades, this is a landmark action. And the case will be followed with interest in every jurisdiction in which works on James Joyce are published….

35% of software in the world is pirated

From ZDNet.com

35% of the packaged software installed on personal computers (PCs) worldwide in 2005 was illegal, amounting to $34 billion in global losses due to software piracy. Piracy rates decreased moderately in more than half (51) of the 97 countries, and increased in only 19. The global rate was unchanged from 2004 to 2005 as large developed markets like the United States, Western Europe, Japan and a handful of Asian countries continue to dominate the software market while their combined piracy rate hardly moved….

Scan This Book!

Kevin Kelly published an interesting paean to Google Books in the New York Times recently. Sample:

When Google announced in December 2004 that it would digitally scan the books of five major research libraries to make their contents searchable, the promise of a universal library was resurrected. Indeed, the explosive rise of the Web, going from nothing to everything in one decade, has encouraged us to believe in the impossible again. Might the long-heralded great library of all knowledge really be within our grasp?

Brewster Kahle, an archivist overseeing another scanning project, says that the universal library is now within reach. “This is our chance to one-up the Greeks!” he shouts. “It is really possible with the technology of today, not tomorrow. We can provide all the works of humankind to all the people of the world. It will be an achievement remembered for all time, like putting a man on the moon.”

And unlike the libraries of old, which were restricted to the elite, this library would be truly democratic, offering every book to every person.

But the technology that will bring us a planetary source of all written material will also, in the same gesture, transform the nature of what we now call the book and the libraries that hold them. The universal library and its “books” will be unlike any library or books we have known. Pushing us rapidly toward that Eden of everything, and away from the paradigm of the physical paper tome, is the hot technology of the search engine….

This is typical Kelly hyperbole, and it attracted a lot of attention in the blogosphere, including some perceptive criticism. Sample:

Of course, the difference between now and then is that doing so gives a single company – Google – enormous market power.

And if history is any guide, not once has a firm with absolute power – Standard Oil, Microsoft, you know the score – been anything less than evil.

Google is, in a very real sense, profiting enormously from the utopian naivete of the Valley. And though Kevin’s article is a great read – and I’m a huge fan of his new work – this flaw makes his conclusion – a utopian vision of ubiquitous, “free” information – totally invalid.

Has Kevin used Google Scholar? If you haven’t, try a simple query like this.

That screen is the polar opposite of ubiquitous, free information – it is a set of links which send you to walled gardens built by academic publishers who want to charge $20, $50, or $100 or more for a single article.

But it is the future the Googleverse leads to. It’s the inevitable result of handing informational market power over to Google – just like physical distribution economies (and price hypersensitive consumers) inevitably lead to Wal-Mart. Either one is just as evil as far as consumers are concerned.

Kevin argues that we should scan books because there is a “moral imperative to scan” – a moral imperative to make information free, essentially.

Are you kidding? That’s like saying there’s a moral imperative to buy gas, or to buy the cheapest goods possible – because this so-called moral imperative has a single economic effect: to line Google’s pockets, handing market power over to it.

Take books – what we’re talking about here. The so-called moral imperative is only valid if there’s a level playing field for scanning; if the scanning market can be made competitive.

Of course, it can’t – it’s a natural monopoly; who scans the most wins, because the average cost is always falling.

And this – profiting from the natural monopoly dynamics of information – is, make no mistake about it, exactly Google’s game – not creating some kind of Gutopia.

The Google Scholar example is very compelling. This guy is sharp.
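His natural-monopoly point about falling average cost is simple arithmetic: with a large fixed cost for scanning infrastructure and a small constant marginal cost per book, whoever scans the most volumes has the lowest average cost per book. The figures below are invented purely for illustration.

```python
def average_cost(fixed_cost: float, marginal_cost: float, books: int) -> float:
    """Average cost per scanned book: the fixed cost spread over the
    volume scanned, plus a constant per-book marginal cost."""
    return fixed_cost / books + marginal_cost

# Illustrative (invented) figures: $100m of scanning infrastructure,
# $10 marginal cost per book. Average cost falls as volume grows:
for n in (1_000_000, 5_000_000, 10_000_000):
    print(f"{n:>10} books: ${average_cost(100_000_000, 10, n):.2f} per book")
```

At one million books the average cost is $110 per book; at ten million it is $20. The biggest scanner can always undercut any entrant, which is the definition of a natural monopoly.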

Trolling for business

This morning’s Observer column

Question: Six months ago you set up a technology company in your garage. You’ve got your first round of serious funding and can hire people. Which of the following do you employ first? A software engineer? An office manager? A book-keeper? A salesman?

Answer: none of the above. What you may need most of all is a patent lawyer. Otherwise in two years’ time – just when you’ve had the first really big order for 200,000 units of your new gizmo – you may find yourself opening an unpleasantly worded letter from a company based in Virginia or Delaware claiming that the aforementioned gizmo infringes one of their patents and threatening legal action unless you pay them whopping royalties. You have no idea whether this claim would be upheld by a court. But it will cost at least $100,000 in legal fees to find out, and even the hint of litigation will scare off the venture capitalists you desperately need to provide the second round of funding needed to fulfil that first big order.

Welcome to the world of high technology…

The Devil’s new tune

This morning’s Observer column

The devil, famously, has the best tunes – ‘Honky Tonk Women’, ‘I Can’t Get No Satisfaction’, etc. But what do you do when he suddenly starts singing ‘Lead Kindly Light’? This is the kind of puzzle set last week when Warner Brothers announced plans to make over 200 films available for downloading. That’s not the funny bit, though: the real scream is that they propose to use BitTorrent to do it…