By: Niesha Gibbs,

Should jurors be completely devoid of emotion? When posed with this question, I impulsively answered no. Prompting my answer are cases like that of Cyntoia Brown. Brown’s story has garnered the attention of thousands after a recent post about her went viral. In 2006, the then 16-year-old Brown was sentenced to life imprisonment for the murder of 43-year-old Jonathan Allen.[1] Allen attempted to rape Cyntoia, and in self-defense she shot and killed him.[2] Brown was tried as an adult and ultimately convicted of first-degree felony murder and aggravated robbery.[3] Cyntoia’s case, like many others, is a prime example of how just a bit more understanding from a jury could have produced an entirely different outcome. Nonetheless, there are instances where too much emotion certainly clouds judgment. So, where does this leave jurors? With the unspoken plight of striking a delicate balance between acknowledging sympathy and applying the law impartially. One solution being offered is artificial intelligence.[4]

With technology present in nearly every facet of the legal profession, from e-discovery to electronic filing, it should come as no surprise that technology has now pervaded the courtroom itself. Currently, technical advances are being used to “transfer” jurors to the actual crime scene of the trial they are sitting on.[5] The strides toward modernizing courtrooms haven’t stopped there, however: algorithms are now being used to help determine the guilt or innocence of a defendant.[6] The company Northpointe, Inc. has created software, COMPAS, that is designed to assist courts and judges in making “better” – or at least more data-centric – decisions in court.[7]

While this hasn’t become a widespread practice, it has attracted the attention of prominent legal figures, namely the Honorable Chief Justice John G. Roberts Jr.[8] After being asked about the effect of artificial intelligence in the courtroom, Chief Justice Roberts described it as “putting a significant strain on how the judiciary goes about doing things.”[9] Roberts’s comments came over two months after the Supreme Court declined to review the case of Eric Loomis.[10]

In early 2013, Loomis was “sentenced to six years in prison at least in part by the recommendation of a private company’s secret proprietary software.”[11] One can’t help but believe that the Court’s decision not to review the case was partially, or even wholly, motivated by the implications of such a ruling. As with any new technology, endorsement by the Court would raise potential problems. For example, hacking is regularly associated with some of the most sensationalized scandals; one can only imagine the issues that could arise during a highly contentious case, with someone’s life in the balance. Regardless of which side of this issue you are on, one point both sides will concede is that each juror brings his or her own worldview. This worldview is a lens on which legal advocates often rely during a trial. Further, as humans, we have the unique ability to empathize with one another – an ability I think should never be undervalued or overlooked.


[1] See AJ Willingham, Why Cyntoia Brown, Who Is Spending Life in Prison for Murder, Is All Over Social Media, CNN (2017),

[2] See id.

[3] See id.

[4] See Kayla Mathews, Is AI Getting Closer to Replacing Jurors?, ProductivityBytes (2017),

[5] See Nick Caloway, Investigators Use 3D Technology to Solve Crimes, Bring Scenes to Juries, (2016),

[6] See Mathews, supra note 4.

[7] Christopher Markou, Why Using AI to Sentence Criminals is a Dangerous Idea, The Conversation (2017),

[8] See Adam Liptak, Sent to Prison by a Software Program’s Secret Algorithms, Sidebar, New York Times (2017),

[9] See id.

[10] Amy Howe, Federal Government Files Invitation Briefs, SCOTUSblog (2017),

[11] See Liptak, supra note 8.

Image Source: