This morning’s Observer column:
[In the 1960s] the thought that we would one day live in an “information society” that was comprehensively dependent on computers would have seemed fanciful to most people.
But that society has come to pass, and suddenly the algorithms that are the building blocks of this world have taken on a new significance because they have begun to acquire power over our everyday lives. They determine, for example, whether we can get a bank loan or a mortgage, and on what terms; whether our names go on no-fly lists; and whether the local cops regard us as potential criminals or not.
To take just one example, judges, police forces and parole officers across the US are now using a computer program to decide whether a criminal defendant is likely to reoffend or not. The basic idea is that an algorithm is likely to be more “objective” and consistent than the subjective judgment of human officials. The algorithm in question is called Compas (Correctional Offender Management Profiling for Alternative Sanctions). When defendants are booked into jail, they respond to a Compas questionnaire and their answers are fed into the software to generate predictions of “risk of recidivism” and “risk of violent recidivism”.
It turns out that the algorithm is fairly good at predicting recidivism and less good at predicting the violent variety. So far, so good. But guess what? The algorithm is not colour blind…
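Compas itself is proprietary, so nobody outside the company can inspect the model. But a toy sketch in Python shows the shape of the thing: questionnaire answers weighted into a risk score, and then the kind of group-by-group error check that lies behind the “not colour blind” claim. Every weight, field name and data point below is invented purely for illustration; none of it comes from Compas or from any real study.

```python
import math
from collections import defaultdict

# Everything here is hypothetical: Compas is proprietary, so the questionnaire
# items, weights, threshold and records are invented for illustration only.

def risk_score(answers, weights, bias=-1.5):
    """Combine questionnaire answers into a 0-1 'risk of recidivism' score
    using a simple logistic model -- one plausible shape for such a tool."""
    z = bias + sum(w * answers.get(item, 0) for item, w in weights.items())
    return 1.0 / (1.0 + math.exp(-z))

WEIGHTS = {"prior_arrests": 0.4, "age_under_25": 0.8, "unemployed": 0.3}

score = risk_score({"prior_arrests": 2, "age_under_25": 1}, WEIGHTS)
print(f"predicted risk of recidivism: {score:.2f}")   # roughly 0.52 here

# The 'not colour blind' question: among defendants who did NOT go on to
# reoffend, how often was each group wrongly labelled high risk?
records = [
    # (group, labelled_high_risk, actually_reoffended) -- invented data
    ("A", True, False), ("A", True, False), ("A", False, False), ("A", True, True),
    ("B", True, False), ("B", False, False), ("B", False, False), ("B", False, True),
]

false_positives = defaultdict(int)
non_reoffenders = defaultdict(int)
for group, labelled_high, reoffended in records:
    if not reoffended:
        non_reoffenders[group] += 1
        false_positives[group] += labelled_high   # True counts as 1

for group in sorted(non_reoffenders):
    rate = false_positives[group] / non_reoffenders[group]
    print(f"group {group}: false positive rate = {rate:.0%}")
```

The disparity the sketch prints comes from the invented numbers above, not from Compas; it is there only to make concrete the distinction between being “fairly good at predicting” overall and being even-handed across groups.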