The House of Commons Select Committee on Culture, Media and Sport has been pondering the ‘problem’ of ‘unsuitable’ online content and, having deliberated, has brought forth a report which is a flake of Cadbury proportions. Here’s an excerpt from the Summary:
Sites which host user-generated content—typically photos and videos uploaded by members of the public—have taken some steps to set minimum standards for that content. They could and should do more. We recommend that terms and conditions which guide consumers on the types of content which are acceptable on a site should be prominent. It should be made more difficult for users to avoid seeing and reading the conditions of use: it would then become more difficult for users to claim ignorance of terms and conditions if they upload inappropriate content.
It is not standard practice for staff employed by social networking sites or video-sharing sites to preview content before it can be viewed by consumers. Some firms do not even undertake routine review of material uploaded, claiming that the volumes involved make it impractical. We were not persuaded by this argument, and we recommend that proactive review of content should be standard practice for sites hosting user-generated content. We look to the proposed UK Council to give a high priority to reconciling the conflicting claims about the practicality and effectiveness of using staff and technological tools to screen and take down material. We also invite the Council to help develop agreed standards across the Internet industry on take-down times—to be widely publicised—in order to increase consumer confidence.
It is common for social networking sites and sites hosting user-generated content to provide facilities to report abuse or unwelcome behaviour; but few provide a direct reporting facility to law enforcement agencies. We believe that high profile facilities with simple, preferably one-click mechanisms for reporting directly to law enforcement and support organisations are an essential feature of a safe networking site. We would expect providers of all Internet services based upon user participation to move towards these standards without delay…
One wonders if any of the boobies who sit on the Committee have ever actually used the Internet. I’ve just checked Flickr (one of the user-generated content sites which exercise these Tribunes of the People). A total of 4,219 images were uploaded to it in the last minute.
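To put that figure in context, here’s a back-of-envelope sketch in Python. The review speed and shift length are my assumptions (the report offers no numbers of its own), and the upload rate is just the one minute I happened to sample:

```python
# Rough arithmetic on Flickr's upload rate vs. "proactive review".
# Assumptions (mine, not the Committee's): the 4,219 images/minute I
# observed holds around the clock, and a human reviewer can vet one
# image every 10 seconds for a full 8-hour shift, with no breaks.

uploads_per_minute = 4219
uploads_per_day = uploads_per_minute * 60 * 24           # ~6.1 million images/day

seconds_per_review = 10
reviews_per_shift = (8 * 60 * 60) // seconds_per_review  # 2,880 images per reviewer/day

staff_needed = uploads_per_day / reviews_per_shift
print(f"{uploads_per_day:,} uploads/day needs ~{staff_needed:,.0f} full-time reviewers")
# -> 6,075,360 uploads/day needs ~2,110 full-time reviewers
```

And that’s one site, photos only, with no backlog, weekends or video; the Committee wants this to be ‘standard practice’ across the industry.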
Charles Arthur has the measure of these crazies.
And then they drop the big idea: “We recommend that proactive review of content should be standard practice for sites hosting user-generated content.” Not just that, but there should be a hotline to the police: “Few [social sites] provide a direct reporting facility to law enforcement agencies. We believe that high profile facilities with simple, preferably one-click mechanisms for reporting directly to law enforcement and support organisations are an essential feature of a safe networking site.”
And as if that weren’t enough, their final, big, razzle-dazzle is a call for, yes, a centralised body, a fabulous new self-regulatory quango:
“Under which the industry would speedily establish a self-regulatory body to draw up agreed minimum standards based upon the recommendations of the UK council for child internet safety, monitor their effectiveness, publish performance statistics, and adjudicate on complaints. In time, the new body might also take on the task of setting rules governing practice in other areas such as online piracy and peer to peer file-sharing, and targeted or so-called ‘behavioural’ advertising.”
Oh, my aching neurons. Let’s start at the top. Proactive review? That means checking before putting up. That means one pair of eyes per pair of eyes uploading stuff. Unfeasible, unless we demand Facebook employ, say, 50,000 new staff to look over all the content being uploaded by Facebook’s 8 million-plus UK users. Hey, I’m sure Mark Zuckerberg would be delighted.
A hotline to the police? Have you noticed how uninterested the police are when you call them to say that your bank card has been cloned and hundreds taken from your account? And how will they deal with a zillion people clicking “report to police” each time someone says, “I’m going to kill you!” on some user forum? The problem with this is that it doesn’t – to use the net phrase – “scale”.
Bill Thompson also has a go at this in Index on Censorship.
A bunch of MPs has decided the best way to get some publicity at the start of the summer recess, when newspaper editors are starved of ‘serious’ stories, is to announce that the Internet is like the Wild West, and children are constantly exposed to unsuitable material on YouTube, reveal intimate personal details on Bebo and surf the web looking for pro-anorexia or suicide support sites.
Sadly, it seems that John Whittingdale and his committee members have not been poring over the technical details of IPv6 and OpenID, so what we’ve got in their report is yet more condemnation of the dark side of today’s Internet and a few poorly-grounded suggestions as to what might be done, most of which seem to comprise a call for Internet service providers and web hosts to become the net’s new morality police.