By Christoph Heitz (ZHAW)
Translated from the original German-language version published at Inside IT
Can a prisoner be released early, or released on bail? A judge who decides this should also consider the risk that the person to be released will reoffend. Wouldn't it be an advantage to be able to assess this risk objectively and reliably? This was the idea behind the COMPAS system, developed by the US company Northpointe.
The system makes an individual prediction of the probability of recidivism for imprisoned offenders, based on a wide range of personal data. The result is a risk score between 1 and 10, where 10 corresponds to a very high risk of recidivism. The system has been used for many years in various U.S. states to support judges' decision making – more than one million prisoners have already been evaluated with COMPAS. The advantages are obvious: the system produces an objective risk prediction that has been developed and validated on the basis of thousands of cases.
In May 2016, however, the investigative journalism organization ProPublica published research suggesting that this software systematically discriminates against black people by overestimating their risk (Angwin et al. 2016): 45 percent of black offenders who did not reoffend after their release had been classified as high-risk. In the corresponding group of white offenders, only 23 percent were assigned a high risk by the algorithm. This means that the probability of being falsely assigned a high risk of recidivism is roughly twice as high for a black person as for a white person.
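The 45 and 23 percent figures are group-specific false positive rates: among the people who did not reoffend, the share that the algorithm nevertheless labelled high-risk. The following minimal sketch shows how such rates could be computed from case data. The column names and toy values are illustrative assumptions, not ProPublica's actual dataset or analysis code.

```python
import pandas as pd

# Hypothetical data: one row per released offender.
# Assumed columns (not ProPublica's actual schema):
#   race       - group label, e.g. "black" or "white"
#   high_risk  - True if the COMPAS score fell in the high-risk band
#   reoffended - True if the person actually reoffended after release
df = pd.DataFrame({
    "race":       ["black", "black", "black", "black",
                   "white", "white", "white", "white"],
    "high_risk":  [True, True, False, False,
                   True, False, False, False],
    "reoffended": [False, True, False, False,
                   False, False, True, False],
})

# False positive rate per group: restrict to people who did NOT
# reoffend, then take the share labelled high-risk in each group.
non_reoffenders = df[~df["reoffended"]]
fpr_by_race = non_reoffenders.groupby("race")["high_risk"].mean()
print(fpr_by_race)
```

On real data, an imbalance in these per-group rates is exactly the disparity ProPublica reported: the classifier errs in the unfavorable direction far more often for one group than for the other.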