In the News

Law, Order & Algorithms | Sharad Goel

Sharad Goel, Assistant Professor of Management Science & Engineering

We need to be judicious in how we use algorithms in our legal system; they carry the power to reinforce disparities as well as reduce them.

“In the criminal justice system,” intones the introduction to “Law & Order,” “the people are represented by two separate yet equally important groups: the police, who investigate crime, and the district attorneys, who prosecute the offenders.” According to Dr. Sharad Goel, an assistant professor in Stanford’s Department of Management Science and Engineering, we need to add another party to this list: algorithms.

In a recent talk at the Research Institute of the Center for Comparative Studies in Race and Ethnicity (CCSRE), Goel described how big data and computational methods are upending business as usual in the courtroom, offering possibilities for detecting and addressing the biases of police officers and judges while raising new challenges for building a just legal system. As executive director of the Stanford Computational Policy Lab, Goel is at the forefront of these discussions about the use and misuse of algorithms in the criminal justice system.

What, exactly, is an algorithm? Simply put, an algorithm is a sequence of rules used to solve a problem. For example, to make a cup of coffee, one algorithm would be to grind the beans, put them in a filter, fill the chamber with water, then turn on the coffeemaker (or skip that algorithm altogether and locate the nearest coffee shop). Not all questions are so straightforward, however. How do police officers decide whom to stop, and are there racial biases in those decisions? What is the best way for a judge to decide whether a defendant is a flight risk or should be released on bail? Just as algorithms help us when we run a Google search, Goel’s research suggests they can inform these far weightier questions.
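
That coffee recipe really is an algorithm in the computational sense: a fixed sequence of rules carried out in order. As a toy illustration (not anything from Goel’s talk), here is how it might look written out in Python:

```python
def make_coffee() -> None:
    """A toy algorithm: a fixed sequence of rules for brewing one cup."""
    steps = [
        "grind the beans",
        "put the grounds in a filter",
        "fill the chamber with water",
        "turn on the coffeemaker",
    ]
    for step in steps:
        print(step)  # carry out each rule, in order

make_coffee()
```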

Take the example of stop-and-frisk. Under this controversial policy, New York City police officers detained and searched large numbers of pedestrians: almost 500,000 a year at the program’s height. Eighty percent of those stopped were Black or Latino, even though these groups made up only about half of New York City’s population. Attention-grabbing, to be sure, but is it bias? Until recently, our best test involved comparing the outcomes of searches: the rate of “hits” (searches that found drugs or weapons) relative to “false alarms” (searches that produced nothing but wasted time and community resentment). In some cases, however, this outcome test can create the appearance of bias where there is none.
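
To see how the outcome test can mislead, consider a hypothetical simulation (the numbers below are illustrative, not from Goel’s study). Officers apply exactly the same evidence threshold to two groups, yet because the underlying risk among the people they encounter is distributed differently, the hit rates still diverge:

```python
import numpy as np

# Illustrative simulation (hypothetical numbers, not Goel's data):
# two groups face the SAME search threshold, but the underlying risk
# of carrying contraband is distributed differently across encounters.
rng = np.random.default_rng(0)

# Officer's estimate of the probability a pedestrian carries contraband.
risk_group_a = rng.beta(2, 8, size=100_000)   # mostly low-risk encounters
risk_group_b = rng.beta(2, 4, size=100_000)   # more medium-risk encounters

threshold = 0.3  # identical standard of evidence applied to both groups

for name, risk in [("Group A", risk_group_a), ("Group B", risk_group_b)]:
    searched = risk >= threshold                         # who gets searched
    hits = rng.random(searched.sum()) < risk[searched]   # contraband actually found
    print(f"{name}: search rate {searched.mean():.0%}, hit rate {hits.mean():.0%}")

# Even with one shared threshold, hit rates differ because the pool of
# people above that threshold differs between the groups -- the
# shortcoming that makes the outcome test an unreliable measure of bias.
```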

Goel’s team developed an alternative that instead compares the thresholds officers apply when deciding whether to frisk minority and White pedestrians, overcoming the limitations of the outcome test. In an analysis of millions of stops, Goel and colleagues found evidence of racial bias: officers applied a lower threshold when searching Black and Latino pedestrians than they did for Whites. By taking a big data approach, the team was able to rule out alternative explanations for this racial disparity. The bias was not confined to “a few bad blocks,” for example; officers were more reluctant to search White pedestrians in almost every neighborhood in New York.

Big data can reveal big disparities in people’s decisions. But can it also be used as a tool to address these shortcomings? Goel and his team have put algorithms to work reducing the number of individuals held in jail while they await trial. In the jurisdiction Goel analyzed, judges are supposed to decide whether to detain a defendant or release them on bail based on their flight risk. Given the consequences of awaiting trial behind bars (missing work, being separated from family), one would hope that judges choose wisely. In reality, their decisions were all over the map: a judge was as likely to deny bail to a 50-year-old man who had missed a single court date as to an 18-year-old with four or more missed dates, a much greater flight risk.

In light of such decisions, algorithms may provide a fairer alternative. Specifically, Goel’s team identified two factors that predicted whether a defendant would actually skip trial: their age and the number of court dates they had missed in the past. Assigning points to these features gives judges a simple and transparent way to calculate risk. The result? Goel estimates that 35% fewer defendants could be detained with almost no change in the number who skip trial.
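
The talk did not spell out the exact point values, so the sketch below is a hypothetical illustration of how such a checklist-style score might work: a few points for age bands, a point per prior missed court date, and a release recommendation when the total falls below a cutoff.

```python
# Hypothetical illustration of a checklist-style risk score; the point
# values and cutoff are made up, not the ones Goel's team derived.

def flight_risk_score(age: int, missed_court_dates: int) -> int:
    """Score a defendant on the two features Goel's team identified."""
    score = 0
    # Younger defendants are a greater flight risk (illustrative points).
    if age < 22:
        score += 3
    elif age < 30:
        score += 2
    elif age < 40:
        score += 1
    # Each prior missed court date adds risk, capped at 4 points.
    score += min(missed_court_dates, 4)
    return score

def recommend_release(age: int, missed_court_dates: int, cutoff: int = 4) -> bool:
    """Recommend release on bail when the score falls below the cutoff."""
    return flight_risk_score(age, missed_court_dates) < cutoff

# Unlike the inconsistent judgments described in the talk, the score ranks
# the 50-year-old with one missed date well below the 18-year-old with four.
print(flight_risk_score(50, 1))   # 1
print(flight_risk_score(18, 4))   # 7
```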

This sounds like a compelling case for kicking back and letting the computer take over. But Goel cautions that we need to be judicious in how we use algorithms in our legal system; they carry the power to reinforce disparities as well as reduce them. For example, St. George’s Hospital in the UK developed an algorithm to screen medical school applicants based on the past decisions of its human admissions committee. All too human, it turned out: the model picked up the same biases as the committee members it learned from. Similarly, an algorithm that learns from police stops or past judicial decisions runs the risk of building racial bias into its predictions.

Because of limitations like these, Goel sees algorithms as supporting our decisions, not replacing them altogether. While these tools can help identify and overcome limitations in our decision making, an algorithm can’t set priorities for criminal justice policy. It is still up to us whether the person a model identifies as “at-risk” is sent to prison or connected to community resources. Given the promises, limitations, and challenges of these tools, Professor Goel’s research provides more than enough material for several seasons of “Law, Order, and Algorithms”.

Nicholas Camp is a PhD Candidate in Psychology and a CCSRE Graduate Dissertation Fellow