The Algorithm That Decides Your Fate in the Criminal Justice System
Florida and other states use computerized formulas for bail, sentencing, and parole
By Judy Malmon, J.D. | Reviewed by Canaan Suitt, J.D. | Last updated on May 1, 2023
Algorithms are computerized analyses of data that use statistical probabilities to predict everything from what music you’ll enjoy to when a traffic light should change. One area where algorithmic calculations have been used in recent years is the criminal justice system. Algorithms are used to predict flight risk and the likelihood of reoffending, informing decisions about bail, sentencing, and parole.

It’s understandable why judges and other decision-makers would be drawn to these tools to help determine the most appropriate approach for a given defendant. The problem is that the tools themselves have serious flaws.
The COMPAS Program
Algorithms used in many courts throughout the country have come under fire for keeping their data points secret; the companies behind them claim that what goes into the formula is proprietary.
The COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) software is based on 137 questions about prior criminal activity, family history, educational background, residential location, employment status and more.
Some answers are provided by the defendants themselves, some extracted from records. Ultimately, the results are used to predict whether the individual is likely to commit a crime in the future.
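Because the actual COMPAS formula is proprietary, no one outside the company can inspect it. As a rough illustration of how questionnaire-based risk tools generally work, here is a minimal Python sketch that combines answers into a weighted score and buckets it into a risk label. Every feature name, weight, and threshold below is invented for illustration and is not drawn from COMPAS.

```python
# Illustrative sketch only: COMPAS's real formula is proprietary.
# This shows the general shape of a questionnaire-based risk score:
# a weighted sum of answers bucketed into coarse risk levels.
# All feature names and weights here are hypothetical.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 0.30,
    "age_at_first_offense": -0.02,  # younger first offense -> higher score
    "unstable_employment": 0.25,
    "family_criminal_history": 0.20,
}

def risk_score(answers: dict) -> str:
    """Combine questionnaire answers into a coarse risk label."""
    raw = sum(HYPOTHETICAL_WEIGHTS[k] * answers.get(k, 0)
              for k in HYPOTHETICAL_WEIGHTS)
    if raw >= 1.5:
        return "high"
    if raw >= 0.75:
        return "medium"
    return "low"

print(risk_score({"prior_arrests": 4, "unstable_employment": 1,
                  "family_criminal_history": 1, "age_at_first_offense": 19}))
# -> "medium"
```

The key point of the sketch is that every design choice, which questions to ask, how to weight them, where to draw the high-risk line, is made by the vendor and hidden from the defendant being scored.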
The Problems with Automation
West Palm Beach criminal defense attorney Tama Kudman has represented defendants in Florida and New York state courts as well as in federal criminal proceedings, and has observed disparities in the use of these tools. “Especially in predicting recidivism among violent offenders, when people are identified as at risk for recidivism, they’re very highly inaccurate,” she says.
Even worse: In jurisdictions relying on computerized risk assessments, research indicates that bias built into the programs can perpetuate racial and socioeconomic disparities.
Racial Disparities
A 2016 ProPublica study examined the risk scores assigned to more than 7,000 people arrested in Broward County, Florida, between 2013 and 2014. It found that the COMPAS system correctly predicted recidivism only 61 percent of the time.
Though the algorithm doesn’t ask about race, the program was twice as likely to incorrectly predict that African-Americans would reoffend as it was for similarly situated Caucasians. Conversely, whites who went on to commit future crimes were more likely to be labeled low risk. Across all segments, the formula was correct only 20 percent of the time in predicting future violent crimes.
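The pattern ProPublica described, where similar overall accuracy masks very different error rates by group, is easiest to see with arithmetic. The sketch below uses invented confusion-matrix counts (not ProPublica’s actual data) to show how two groups can share the same accuracy while one suffers far more false “high risk” labels and the other’s actual reoffenders are more often missed.

```python
# Worked example with made-up counts (not ProPublica's data):
# two groups with identical overall accuracy but very different errors.

def error_rates(tp, fp, tn, fn):
    """False positive rate, false negative rate, and accuracy."""
    fpr = fp / (fp + tn)   # labeled high-risk but did not reoffend
    fnr = fn / (fn + tp)   # labeled low-risk but did reoffend
    acc = (tp + tn) / (tp + fp + tn + fn)
    return fpr, fnr, acc

for name, counts in [("group A", dict(tp=300, fp=200, tn=400, fn=100)),
                     ("group B", dict(tp=250, fp=70, tn=450, fn=230))]:
    fpr, fnr, acc = error_rates(**counts)
    print(f"{name}: accuracy {acc:.0%}, "
          f"false positives {fpr:.0%}, false negatives {fnr:.0%}")
# group A: accuracy 70%, false positives 33%, false negatives 25%
# group B: accuracy 70%, false positives 13%, false negatives 48%
```

Both hypothetical groups score 70 percent accuracy, yet group A is wrongly flagged as high risk at more than twice the rate of group B. A single headline accuracy number can hide exactly this kind of disparity.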
Age Disparities
“There is also an age component to these so-called predictors,” Kudman says. “As people get older, particularly in young men, when you get past that 21-year-old point, you see a significant drop in violence. So, if you’re looking at behaviors that predate that age, they’re not going to be predictive of who they are when they’re older.”
Inaccurate Criteria
Because these algorithms are built on historical datasets, it’s a foregone conclusion that they will spit out predictions as biased as the system that supplied the data. Disproportionately high African-American prosecution rates will predict disproportionately high African-American re-offense rates.
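A toy calculation makes that feedback loop concrete. Assuming, purely for illustration, that two neighborhoods have the same true re-offense rate but different levels of police presence, the arrest records alone will make one group look twice as risky:

```python
# Minimal sketch of the feedback loop, using invented numbers.
# If one neighborhood is policed twice as heavily, its *recorded*
# re-offense rate is inflated even when true behavior is identical,
# and any model fit to those records inherits the skew.

TRUE_REOFFENSE_RATE = 0.30          # identical for both neighborhoods
DETECTION_RATE = {"heavily_policed": 0.80, "lightly_policed": 0.40}

def recorded_rate(neighborhood: str) -> float:
    """Re-offense rate as it appears in arrest records."""
    return TRUE_REOFFENSE_RATE * DETECTION_RATE[neighborhood]

for hood in DETECTION_RATE:
    print(f"{hood}: recorded rate {recorded_rate(hood):.0%}")
# heavily_policed: recorded rate 24%
# lightly_policed: recorded rate 12%
```

A risk model trained on these records would score the heavily policed group as twice as likely to reoffend, despite identical underlying behavior, which is precisely the bias the algorithm’s designers never asked about race to avoid.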
“From my perspective, I understand that we have to have tools to do some kind of overview assessment,” Kudman says. “The problem is, the psychiatric studies looking at these tools show that we don’t have any good evidence substantiating that these are the right criteria. There’s more awareness that these algorithms are particularly stacked against minorities and poor people, and they’re trying to adjust, but no one’s come up with an answer yet.”
How a Lawyer Can Help
Kudman’s experience is that legal representation can help overcome the challenges posed by these algorithmic risk assessments.
“The job of a really good attorney is to recognize the shortfalls of these algorithms and to look for the data showing how skewed a particular assessment is, so they can alert the judge to the weakness of that assessment,” she says. “What I do as a criminal defense lawyer is to distinguish my client and his or her situation from the general population they’re looking at when creating these risk assessments.”
If you’re facing an algorithmic risk assessment, it’s imperative to have an experienced criminal defense attorney providing this representation. For more information on this area of law, see our overview of criminal defense.