A research center of the
UCR School of Public Policy
A computer program used for bail and sentencing decisions was labeled biased against blacks. It’s actually not that clear
The Washington Post

The article revisits earlier analyses of the algorithm, called COMPAS, and its nationwide use in pretrial decisions about release or bail based on the danger a defendant poses to the community. Building on ProPublica's claim that COMPAS is biased against black defendants, The Washington Post reanalyzed defendants' reoffense rates, which showed that black defendants were more than twice as likely as white defendants to be classified as medium or high risk (42 percent vs. 22 percent). The authors ultimately call for considering an end to bail requirements altogether, with alternatives such as electronic monitoring so that no one is unnecessarily jailed. They pair this with a call for much-needed reform and for access to COMPAS's workings, which its maker has historically denied, in order to assess the efficiency and equity of these consequential decisions.
Keywords: risk assessment, race, algorithms, sentencing, crime prevention, decision-making, prediction, violence risk assessment, recidivism