This morning’s Observer column.
Wikipedia is a typical product of the open internet, in that it started with a few simple principles and evolved a fascinating governance structure to deal with problems as they arose. It recognised early on that there would be legitimate disagreements about some subjects and that eventually corporations and other powerful entities would try to subvert or corrupt it.
As these challenges arose, Wikipedia’s editors and volunteers developed procedures, norms and rules for addressing them. These include software tools for detecting and reverting vandalism, and processes such as the “three-revert” rule, which says that an editor should not undo someone else’s edits to a page more than three times in one day; beyond that, the disagreement goes to formal or informal mediation, or a warning is placed on the page alerting readers that the topic is controversial. Some perennially disputed pages, such as the one on George W Bush, are locked down altogether. And so on.
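For the curious, the three-revert rule is simple enough to sketch in a few lines of code. This is a minimal illustration, not Wikipedia’s actual software: the function name, the data shape (a list of timestamps for one editor on one page) and the sliding 24-hour window are all my assumptions about how such a check might work.

```python
from datetime import datetime, timedelta

def violates_three_revert_rule(revert_times, window=timedelta(hours=24), limit=3):
    """Return True if more than `limit` reverts fall within any `window`-long span.

    `revert_times` is a list of datetimes recording one editor's reverts
    on a single page. (Hypothetical sketch, not Wikipedia's implementation.)
    """
    times = sorted(revert_times)
    for i, start in enumerate(times):
        # Count reverts in the window beginning at this revert.
        in_window = [t for t in times[i:] if t - start <= window]
        if len(in_window) > limit:
            return True
    return False
```

Three reverts in a day would pass; a fourth within the same 24 hours would trip the check and, in Wikipedia’s world, send the dispute off to mediation.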
In trying to figure out how to run itself, Wikipedia has therefore been grappling with the problems that will increasingly bug us in the future. In a comprehensively networked world, opinions and information will be super-abundant, the authority of older, print-based quality control and verification systems will be eroded and information resources will be intrinsically malleable. In such a cacophonous world, how will we know what is reliable and true? How will we deal with disagreements and disputes about knowledge? How will we sort out digital wheat from digital chaff? Wikipedia may be imperfect (what isn’t?) but at the moment it’s the only model we have for addressing these problems.