By Jeffrey A. Phaup


In September 2020, a Tampa Bay Times investigative report brought to light a policing operation in Pasco County, Florida.[1] For years, the county’s Sheriff’s Office had deployed mass monitoring, targeted intimidation, and harassment against selected families and individuals, guided by a questionably designed algorithm that relied on dubious data and arbitrary decisions.[2]


There are two broad types of predictive policing tools.[3] The first are location-based algorithms that draw on links between places, events, and historical crime rates to predict where and when crimes are more likely to happen.[4] The second are person-based tools that draw on data about individuals, such as age, gender, marital status, substance abuse history, and criminal records.[5] This second type is used both by police attempting to intervene before crimes take place and by courts making determinations about pretrial release and sentencing, where a person’s score quantifies how likely they are to be rearrested if released.[6]
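
To make the second category concrete, here is a minimal sketch of what a person-based risk score can look like. Every feature name, weight, and value below is an invented assumption for illustration; it is not the model used by Pasco County or any real tool.

# A hypothetical person-based risk score. All feature names and
# weights here are illustrative assumptions, not any real tool's model.
def risk_score(person: dict) -> float:
    score = 0.0
    score += 2.0 * person.get("prior_arrests", 0)    # criminal record
    score += 1.5 * person.get("dropped_charges", 0)  # dropped charges still count
    score += 1.0 * (person.get("age", 99) < 25)      # age as a crude proxy
    score += 1.0 * bool(person.get("substance_history", False))
    return score

print(risk_score({"prior_arrests": 3, "dropped_charges": 2,
                  "age": 22, "substance_history": True}))  # 11.0

A score like this is only as reliable as the records behind it: if dropped charges count, a person can be penalized indefinitely for accusations that were never proven.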


In Pasco County, the algorithm assigned targets using scores based on individuals’ criminal records, including charges that were later dropped.[7] Once the algorithm identified individuals it considered at risk of committing more crimes, deputies were instructed to visit those individuals’ homes and make arrests for any reason they could.[8] Those arrests were then fed back into the algorithm, creating a feedback loop that led it to target the same individuals ever more intensively.[9]
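
The feedback loop can be made explicit with a toy simulation. The population size, number of targets, and arrest probability below are all assumptions chosen for illustration; this sketches the dynamic the Times describes, not the actual Pasco County system.

import random

random.seed(0)
# 100 hypothetical residents with small random initial "risk" scores.
scores = {i: random.randint(0, 5) for i in range(100)}

for year in range(5):
    # The algorithm flags the ten highest scorers for home visits.
    targets = sorted(scores, key=scores.get, reverse=True)[:10]
    for person in targets:
        # Visited residents face a coin-flip chance of a pretextual arrest;
        # residents who are never visited are never arrested here at all.
        if random.random() < 0.5:
            scores[person] += 1  # the new arrest feeds back into the score
    print(f"year {year}: targets {sorted(targets)}")

After a few rounds the target list barely changes: each arrest raises a target’s score, the higher score keeps them on the list, and staying on the list exposes them to more arrests.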


The Tampa Bay Times report highlights the dangers of algorithm-based policing, also known as predictive policing.[10] While proponents argue that algorithm-based policing can predict crimes more accurately and effectively than traditional police methods, critics have raised concerns about transparency and accountability.[11] An underlying danger also lies in the data the algorithms feed on.[12] Bias may be baked into the algorithms themselves, because they rely on historical data produced by biased decisions and their consequences.[13]


According to the U.S. Department of Justice, a person is more than twice as likely to be arrested if they are Black than if they are White.[14] A Black person is also five times more likely to be stopped without just cause than a White person.[15] When such arrests are fed into a predictive algorithm, the algorithm absorbs the disparity and amplifies it, directing still more police attention at the same individuals.[16]


The Constitution protects the right of individuals to be “secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”[17] Under the Fourth Amendment, a person who exhibits a subjective expectation of privacy is protected when that expectation is one society is prepared to recognize as objectively reasonable.[18] Thus, in the United States, the standard generally requires probable cause or a warrant for any search or seizure.[19] However, none of the determinations produced by predictive policing programs rise to the legal standard of a Fourth Amendment search or seizure, so their use by police does not require probable cause or a warrant.[20]


Data analytics is increasingly a part of the way society operates. However, predictive policing algorithms carry considerable risks to individuals’ privacy and rights. Substituting inherently biased or otherwise flawed algorithmic predictions for established investigative techniques risks skewing the judgment of law enforcement officials, resulting in arbitrary and discriminatory stops, searches, and arrests. While police departments may view algorithms as a way to replace potentially prejudiced human judgment, an algorithm that conceals and embodies those same prejudices is not a viable solution to the problem at hand.


[1] Kathleen McGrory, Neil Bedi & Douglas R. Clifford, Pasco’s sheriff created a futuristic program to stop crime before it happens. It monitors and harasses families., Tampa Bay Times (2020), https://projects.tampabay.com/projects/2020/investigations/police-pasco-sheriff-targeted/intelligence-led-policing/ (last visited Nov 11, 2020).

[2] Id.

[3] Will Douglas Heaven, Predictive policing algorithms are racist. They need to be dismantled., MIT Technology Review (July 17, 2020), https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/ (last visited Nov 15, 2020).

[4] Id.

[5] Id.

[6] Id.

[7] McGrory, Bedi & Clifford, supra note 1.

[8] Id.

[9] Id.

[10] Tim Lau, Predictive Policing Explained, Brennan Center for Justice (2020), https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained (last visited Nov 15, 2020).

[11] Id.

[12] McGrory, Bedi & Clifford, supra note 1.

[13] Lau, supra note 10.

[14] Heaven, supra note 3.

[15] Id.

[16] McGrory, Bedi & Clifford, supra note 1.

[17] U.S. CONST. amend. IV.

[18] See Katz v. United States, 389 U.S. 347, 361 (1967) (Harlan, J., concurring) (establishing the reasonable expectation of privacy test).

[19] See, e.g., United States v. Ross, 456 U.S. 798, 824–25 (1982) (quoting Mincey v. Arizona, 437 U.S. 385, 390 (1978)).

[20] Lindsey Barrett, Reasonably Suspicious Algorithms: Predictive Policing at the United States Border, 41 N.Y.U. Review of Law & Social Change 327, 329 (2018).

Image Source: https://croga.org/how-to-reduce-crime-rates-without-gun-control/