How Are Algorithms Used in the Criminal Justice System?
Several states use algorithms for bail, sentencing, and parole

By Judy Malmon, J.D. | Reviewed by Canaan Suitt, J.D. | Last updated on November 30, 2023

Featuring practical insights from contributing attorney Tama Beth Kudman
Use these links to jump to different sections:
- The Use of Algorithms in the Criminal Justice Decision-Making Process
- Three Major Problems with Algorithmic Decision-Making
- How a Lawyer Can Help
Algorithms are computerized analyses of data that use statistical probabilities to predict many aspects of people’s lives, from what music they’ll enjoy to when a traffic light should change. One area where algorithmic calculations have been used in recent years is the criminal justice system. Algorithms are used to predict flight risk and the likelihood of reoffending, and those predictions inform decisions about bail, sentencing, and parole.
It’s understandable why judges and other decision-makers might be drawn to algorithms when determining the most appropriate approach for a given defendant. Algorithms can weigh a large number of data points and help streamline human decision-making. The problem is that outsourcing these decisions to algorithms carries serious drawbacks.
The Use of Algorithms in the Criminal Justice Decision-Making Process
Algorithms used in many courts nationwide have come under fire lately because their developers keep the underlying data points secret, claiming that what goes into the formula is proprietary.
The COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) software is based on 137 questions about prior criminal activity, family history, educational background, residential location, employment status, and more.
The defendants themselves provide some answers; others are extracted from records. Ultimately, the results are used to predict whether the individual is likely to commit a crime in the future.
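COMPAS’s actual formula is proprietary and not public, so the following is only a hypothetical sketch of how a questionnaire-based risk tool generally works: answers are coded as numeric features, each feature is weighted, and the weighted sum is bucketed into a risk tier. The question names, weights, and thresholds below are all invented for illustration.

```python
# Hypothetical sketch of a questionnaire-based risk score.
# None of these weights or thresholds come from COMPAS, whose formula is secret.

def risk_score(answers, weights):
    """Combine coded questionnaire answers into a single raw score."""
    return sum(weights[question] * value for question, value in answers.items())

def risk_tier(score):
    """Bucket a raw score into low/medium/high (thresholds are invented)."""
    if score < 3:
        return "low"
    elif score < 6:
        return "medium"
    return "high"

# Invented example: a handful of questions out of a long questionnaire,
# with answers coded numerically (counts or yes=1/no=0).
weights = {"prior_arrests": 0.5, "age_under_25": 2.0, "employed": -1.0}
answers = {"prior_arrests": 4, "age_under_25": 1, "employed": 1}

score = risk_score(answers, weights)  # 0.5*4 + 2.0*1 - 1.0*1 = 3.0
print(risk_tier(score))               # prints "medium"
```

The key point for defendants is that every step here involves a design choice — which questions to ask, how to weight them, where to draw the tier cutoffs — and none of those choices are visible when the formula is kept secret.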
Three Major Problems with Algorithmic Decision-Making
West Palm Beach criminal defense attorney Tama Kudman, who has represented defendants in Florida, New York, and federal criminal proceedings, has observed disparities in automation tools. “Especially in predicting recidivism among violent offenders, when people are identified as at risk for recidivism, they’re very highly inaccurate,” she says.
Even worse, in jurisdictions relying on computerized risk assessments, research indicates that bias built into the programs can perpetuate racial and socioeconomic disparities.
1. Racial Disparities
A 2016 study by ProPublica examined the algorithmic risk scores assigned to more than 7,000 people arrested in Broward County, Florida, between 2013 and 2014. It found that the COMPAS system correctly predicted recidivism only 61 percent of the time.
Though the algorithm doesn’t ask about race, the program was twice as likely to incorrectly predict that African-American defendants would re-offend as it was for similarly situated Caucasian defendants. Conversely, white defendants who went on to commit future crimes were more likely to be labeled low risk. Across all groups, the formula was correct only 20 percent of the time in predicting the future commission of violent crimes.
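The disparity ProPublica measured can be made concrete with the false-positive rate: the share of people who did *not* re-offend but were nonetheless labeled high risk, computed separately for each group. The sketch below uses invented toy records, not ProPublica’s data, purely to show the calculation.

```python
# Illustrative calculation of a per-group false-positive rate.
# All records below are invented; they are not ProPublica's Broward County data.

def false_positive_rate(records):
    """Share of non-re-offenders who were nonetheless labeled high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["predicted_high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Toy data: identical outcomes (nobody re-offended), different labels by group.
group_a = [
    {"predicted_high_risk": True,  "reoffended": False},
    {"predicted_high_risk": True,  "reoffended": False},
    {"predicted_high_risk": False, "reoffended": False},
    {"predicted_high_risk": False, "reoffended": False},
]
group_b = [
    {"predicted_high_risk": True,  "reoffended": False},
    {"predicted_high_risk": False, "reoffended": False},
    {"predicted_high_risk": False, "reoffended": False},
    {"predicted_high_risk": False, "reoffended": False},
]

print(false_positive_rate(group_a))  # prints 0.5
print(false_positive_rate(group_b))  # prints 0.25
```

A gap like this — one group wrongly flagged at twice the rate of another with the same actual behavior — is exactly the kind of disparity the ProPublica analysis reported.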
2. Age Disparities
“There is also an age component to these so-called predictors,” Kudman says. “As people get older, particularly in young men, when you get past that 21-year-old point, you see a significant drop in violence. So, if you’re looking at behaviors that predate that age, they’re not going to be predictive of who they are when they’re older.”
3. Inaccurate Criteria
When an algorithm is trained on historical datasets, it’s a foregone conclusion that its predictions will be as biased as the system that produced the data. Disproportionately high African-American prosecution rates will yield disproportionately high predicted African-American re-offense rates.
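The feedback loop described above can be sketched in a few lines. The numbers are invented; the point is structural: a naive model fit only to historical outcomes simply echoes the base rate it saw for each group, so a disparity in who was prosecuted comes straight back out as a disparity in who is predicted to re-offend.

```python
# Sketch of the feedback loop: a model trained only on historical outcomes
# reproduces whatever disparity the historical data contains.
# Rates below are invented for illustration.

historical_prosecution_rate = {"group_a": 0.40, "group_b": 0.20}

def predicted_reoffense_rate(group):
    # A naive model echoes the per-group base rate it was trained on,
    # regardless of any difference (or lack of one) in underlying behavior.
    return historical_prosecution_rate[group]

print(predicted_reoffense_rate("group_a"))  # prints 0.4 -- the input disparity...
print(predicted_reoffense_rate("group_b"))  # prints 0.2 -- ...comes straight back out
```

Real risk tools are more elaborate than this, but unless a tool explicitly corrects for skew in its training data, the same circularity applies.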
“From my perspective, I understand that we have to have tools to do some kind of overview assessment,” Kudman says. “The problem is, the psychiatric studies looking at these tools show that we don’t have any good evidence substantiating that these are the right criteria. There’s more awareness that these algorithms are particularly stacked against minorities and poor people, and they’re trying to adjust, but no one’s come up with an answer yet.”
How a Lawyer Can Help
Kudman’s experience is that skilled legal representation can help overcome the challenges these automated risk metrics pose.
“The job of a really good attorney is to recognize the shortfalls of these algorithms and to look for the data showing how skewed a particular assessment is, so they can alert the judge to the weakness of that assessment,” she says. “What I do as a criminal defense lawyer is to distinguish my client and his or her situation from the general population they’re looking at when creating these risk assessments.”
If you’re facing a risk assessment, it’s imperative that you have an experienced criminal defense attorney representing you. For more information on this area of law, see our overview of criminal defense.