By: Brandon Larabee
Some of the ethical dilemmas of using artificial intelligence to address criminal justice issues are familiar to anyone who watched “Person of Interest.” The CBS science-fiction show revolved around the efforts of a team of human beings and “The Machine” — an artificial super-intelligence — to stop crimes before they could happen.
In the real world of criminal justice and the legal system, though, problems not anticipated by “Person of Interest” are cropping up as algorithms are used to predict criminal behavior. Where The Machine was relentlessly rational and unfailing (unless interfered with), real-world machines increasingly face questions about whether they produce outcomes just as biased as the humans who build them.
As with many controversies in the public sphere, a counter-backlash is brewing. Writing recently for Wired, Noam Cohen argued that algorithms (and the computers that crunch the numbers) could as easily be sources of justice as of injustice. Cohen highlighted reporting by The New York Times that eventually led some New York City district attorneys to be more lenient with low-level marijuana offenses.[1]
“But imagine if we turned that spigot of data and incisive algorithms toward those who presume to judge and control us: Algorithms should be another important check on the system, revealing patterns of unfairness with a clarity that can be obscured in day-to-day life,” Cohen writes.[2]
That argument, though, comes amid a sustained pushback against efforts to use algorithms and predictive technology to do everything from making bail decisions to assisting in sentencing to deciding where police should focus their enforcement efforts.
New York City, for example, established an Automated Decision Systems Task Force to start looking at how the city uses its powerful data tools.[3] Activists have criticized a Los Angeles Police Department program that uses computer programs to choose surveillance targets because the data input into the system creates a “racist feedback loop.”[4] The COMPAS algorithm, used to create recidivism scores for judges to consider during sentencing, has been accused of bias against people of color.[5]
There are defenders of algorithms beyond Cohen. Sharad Goel of Stanford University told Nature: International Journal of Science that, in the journal’s words, discrepancies between error rates for whites and people of color “instead reflect the fact that one group is more difficult to make predictions about than another.”[6]
“It turns out that that’s more or less a statistical artifact,” Goel said.[7]
That may be cold comfort to an offender sentenced under a flawed formula: the formula works against him or her because it struggles to predict what people of the offender’s race will do, not because it is biased per se.
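Goel’s observation is easier to see with a toy example. The short Python sketch below is a minimal illustration with made-up numbers — it is not COMPAS data and not Goel’s actual analysis. It simulates a risk score that is perfectly calibrated for two hypothetical groups whose underlying re-arrest rates differ, and shows that the groups’ false-positive and false-negative rates still come apart.

# A minimal sketch with synthetic numbers -- not COMPAS data and not Goel's analysis.
# It illustrates how a risk score that is equally well calibrated for two groups can
# still show different false-positive rates when the groups' underlying re-arrest
# rates (base rates) differ.
import random

def error_rates(base_rate, n=100_000, threshold=0.5):
    """Simulate a calibrated risk score for a group with the given base rate
    and return its (false-positive rate, false-negative rate)."""
    random.seed(0)
    false_pos = false_neg = negatives = positives = 0
    for _ in range(n):
        # Each person's "true risk" of re-arrest; a higher base rate shifts risks upward.
        risk = min(1.0, max(0.0, random.gauss(base_rate, 0.25)))
        reoffends = random.random() < risk   # actual outcome, drawn from the true risk
        flagged = risk >= threshold          # the score labels this person "high risk"
        if reoffends:
            positives += 1
            false_neg += not flagged
        else:
            negatives += 1
            false_pos += flagged
    return false_pos / negatives, false_neg / positives

for label, base in [("Group A, base rate 0.3", 0.3), ("Group B, base rate 0.5", 0.5)]:
    fpr, fnr = error_rates(base)
    print(f"{label}: false-positive rate {fpr:.2f}, false-negative rate {fnr:.2f}")

With these made-up numbers, the group with the higher base rate is wrongly flagged far more often, even though the score is equally “accurate” for both groups in the calibration sense — the kind of statistical artifact Goel describes.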
Those inclined to seek a compromise have started to float ideas meant to address concerns about bias while still using the data algorithms produce to (one hopes) improve society. One idea is simply to accept that, by their very nature, algorithms are “biased” — so the public should have more information about, and more input into, what goes into the formulas.[8]
At least one avenue for a possible resolution seems to be closed for now. The U.S. Supreme Court faced a decision last year about whether to take the case of Loomis v. Wisconsin, a frontal assault on the use of COMPAS in sentencing decisions.[9] But the court passed.[10]
[1] Noam Cohen, Algorithms can be a tool for justice — if used the right way, Wired (Oct. 25, 2018, 1:23 PM), https://www.wired.com/story/algorithms-netflix-tool-for-justice/.
[2] Id.
[3] Mayor de Blasio announces first-in-nation task force to examine automated decision systems used by the city, NYC.gov (May 16, 2018), https://www1.nyc.gov/office-of-the-mayor/news/251-18/mayor-de-blasio-first-in-nation-task-force-examine-automated-decision-systems-used-by.
[4] George Joseph, The LAPD has a new surveillance formula, powered by Palantir, The Appeal (May 8, 2018), https://theappeal.org/how-walmart-is-helping-prosecutors-get-10-year-sentences-for-shoplifting-7d868e8b38b8/.
[5] See Sara Chodosh, Courts use algorithms to help determine sentencing, but random people get the same results, Popular Science (Jan. 18, 2018), https://www.popsci.com/recidivism-algorithm-random-bias#page-2.
[6] Rachel Courtland, Bias detectives: The researchers striving to make algorithms fair, Nature: International Journal of Science (June 20, 2018), https://www.nature.com/articles/d41586-018-05469-3.
[7] Id.
[8] Matthias Spielkamp, Inspecting algorithms for bias, MIT Technology Review (June 12, 2017), https://www.technologyreview.com/s/607955/inspecting-algorithms-for-bias/amp/.
[9] Adam Liptak, Sent to prison by a software program’s secret algorithms, N.Y. Times (May 1, 2017), https://www.nytimes.com/2017/05/01/us/politics/sent-to-prison-by-a-software-programs-secret-algorithms.html.
[10] Loomis v. Wisconsin, SCOTUSBlog, http://www.scotusblog.com/case-files/cases/loomis-v-wisconsin/.
Image Source: https://deadline.com/2016/06/person-of-interest-finale-jonah-nolan-interview-x-files-batman-taraji-p-henson-greg-plageman-1201775530/