By Kyle Durch
If a picture is worth a thousand words, then a video is priceless. In court, the fallibility of witness testimony favors the introduction of photographic or video evidence, when available, rather than a description of its content by a witness.[1] Video evidence is seen as so obviously reliable that it provides the foil against which witness testimony is compared.[2]
But just as Adobe Photoshop forever changed digital photography, “deep fake” processing threatens the reliability of video and audio recordings. Rather than simply pasting a photo on top of an underlying video,[3] a “deep fake” video is produced by training a machine learning algorithm on example images of a subject’s face, then digitally stitching the resulting model onto a target host.[4] The same technique applies to audio to produce convincing, though fake, recordings.[5] Complicating the issue further is the source of training data, which may be voluntarily uploaded images scraped (extracted through automated means) from Facebook, LinkedIn, or YouTube, or actively collected from across the entire internet by companies such as Clearview AI.[6] Because of their prohibitive cost, traditional Hollywood movie techniques have largely been ignored as a threat to “everyday” recordings.[7] “Deep fakes,” on the other hand, may be produced using free applications running on relatively low-cost, retail graphics processors installed in home computers.[8] Results vary depending on the available computing power, extent of training data, and time to process.[9]
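To make the mechanics concrete, the sketch below illustrates the shared-encoder, two-decoder autoencoder design popularized by open-source face-swap tools. It is a minimal conceptual sketch only: the layer sizes, training loop, and random stand-in “faces” are hypothetical placeholders, and real tools add face detection, alignment, adversarial losses, and blending.

```python
# Minimal conceptual sketch of a face-swap autoencoder (not production code).
# All shapes, sizes, and data below are hypothetical placeholders.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent vector; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()  # trained on images of person A (the "subject")
decoder_b = Decoder()  # trained on frames of person B (the "host" video)

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()

faces_a = torch.rand(8, 3, 64, 64)  # stand-in for scraped images of person A
faces_b = torch.rand(8, 3, 64, 64)  # stand-in for frames of the host video

for step in range(100):  # real training runs for many thousands of steps
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The swap: encode a host frame, but decode with person A's decoder,
# yielding person A's face rendered in the host's pose and lighting.
fake_frame = decoder_a(encoder(faces_b))
```

Because the encoder learns pose and expression features common to both identities while each decoder learns one identity’s appearance, swapping decoders at inference time is what “stitches” the subject’s face onto the host, which is why the quality of results scales with training data and computing power.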
“Deep fakes” present several issues, and although some uses may be benevolent, the potential threats are substantial. For instance, one artist used this medium to produce synthetic videos of prominent people doing things that they have never done publicly, such as responding to sexual assault allegations or discussing the spread of conspiracy theories.[10] Similarly, others leveraged the technology to develop a satire news report placing various public figures in ironic situations.[11] The producers of both sets of examples acknowledge that their videos are fake. But in 2019, videos went viral on Facebook depicting House Speaker Nancy Pelosi “drunkenly slurring her words.”[12] Though the editing technique was fairly rudimentary, the Speaker Pelosi videos illustrated how easily an anonymous producer can make a reasonably believable video. People shared the videos without much thought, spreading doubt and sowing discord through the click of a mouse.[13] Ideologically opposed communities may be entrenched further through tactical use of skewed information.[14] Many public figures and celebrities have fallen victim to similar tactics; as the process proliferates, those most impacted will likely be women and marginalized communities.[15] As “deep fake” production software continues to metastasize and its users gain access to more and more social media profiles, the risk that ordinary people will be victimized grows more acute.[16]
While these developments present obvious concerns about public discourse, safety, and the understanding of truth, what implications do they have for evidence presented in court? Assuming that an initial recording is itself considered reliable, any break in the chain of custody of video evidence could cast doubt on the video’s authenticity, especially as “deep fake” integration continues to improve.[17] Juries may “be swayed by arguments to not take certain evidence into their consideration” in the absence of authentication by qualified experts.[18] Indeed, this development is related to a phenomenon termed “reality apathy,” in which “constant contact with misinformation compels people to stop trusting what they see and hear.”[19] This apathy spread quickly as Donald Trump claimed to have coined the term “fake news” following the 2016 election and continued to repeat it over the ensuing years.[20] For these reasons, and to combat a rising presumption against accuracy, video editing experts will likely become commonplace in courtroom protocol in the near future.[21]
However, it will not be long until “deep fake” production transcends the so-called “uncanny valley,” the point at which fakes become nearly indistinguishable from reality.[22] Detection will become more important than ever, but technology to conclusively detect and block “deep faked” media may be decades away.[23] Use of “defensive artificial intelligence” may eventually become commonplace to provide the requisite expertise on questions of authenticity.[24] Of course, such a protocol raises critical access-to-justice questions, especially when parties are unable to bear the cost of expert testimony, let alone the additional use of complex computer analysis.[25] And despite the assurances of experts, reality apathy could erode trust and negatively affect the basic assumptions jurors make even before entering the courtroom.[26]
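For illustration only, the sketch below shows the simplest form such “defensive artificial intelligence” might take: a binary classifier that scores video frames as real or synthetic. Everything here is hypothetical; the network is untrained, the training step on labeled real/fake examples is omitted, and actual forensic detectors rely on far richer signals (temporal artifacts, physiological cues, compression traces) while still, as noted above, falling short of conclusive proof.

```python
# Hypothetical sketch of a frame-level deepfake detector (untrained; the
# supervised training step on labeled real/fake frames is omitted).
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),  # logit: higher means "more likely synthetic"
)

frames = torch.rand(16, 3, 64, 64)        # stand-in for frames from an exhibit
scores = torch.sigmoid(detector(frames))  # per-frame probability of being fake
verdict = scores.mean().item()            # naive aggregation across frames
print(f"Estimated probability the clip is synthetic: {verdict:.2f}")
```

Even in this toy form, the design choice is visible: the output is a probability, not a yes-or-no answer, which is precisely why expert interpretation (and its cost) would remain part of any courtroom protocol built on such tools.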
Federal and state efforts have attempted to address cybercrimes, but these efforts have failed to directly target the spread of “deep fakes” and have done nothing to address evidentiary concerns. California, Texas, Florida, and New York have each attempted to address “revenge porn” and cyberstalking.[27] Existing federal law chews around the edges, while efforts to pass reform have all died in committee.[28] A cohesive federal strategy is likely required to address the threat of “deep fakes,” whose effects respect no border, whose instigators act from behind layers of anonymity and encryption, and whose redress requires expensive and complex litigation.[29]
As for evidence at trial, the federal rules are written to afford judges discretion.[30] Similar to the way courts, in many cases, defer to agency expertise to resolve statutory ambiguities rather than substitute their own interpretation,[31] the flexible nature of the evidence rules may leave too much ambiguity for judges to rule effectively when faced with a possible “deep fake.” Because the absence of qualified expert testimony could become commonplace, it may be necessary to build presumptions into the rules to guide judges in their rulings. Regardless, the rapid advance of “deep fake” technology will continue to challenge courts and society for years to come.
[1] Fed. R. Evid. 1002.
[2] See generally Stephen L. Chew, Myth: Eyewitness Testimony is the Best Kind of Evidence, Psych. Sci. (Aug. 20, 2018), https://www.psychologicalscience.org/teaching/myth-eyewitness-testimony-is-the-best-kind-of-evidence.html (comparing the unreliability of witness testimony with the completeness of a video recording).
[3] See, e.g., JibJab (2020), https://www.jibjab.com/ (providing a service with which consumers may send greeting cards featuring friends and family in humorous videos).
[4] See Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security, 107 Calif. L. Rev. 1753, 1759 (2019).
[5] 3 Computer Contracts § 15.02(2)(p) (2020).
[6] See Angela Morris, Don’t Roll That Tape: Deepfakes Creating Litigation Nightmares, Tex. Law. (Feb. 10, 2020, 11:40 AM), https://www.law.com/texaslawyer/2020/02/10/dont-roll-that-tape-deepfakes-creating-litigation-nightmares/.
[7] See 6 Best Deepfake Apps and Tools in 2020, RankRed (Mar. 1, 2020), https://www.rankred.com/6-best-deepfake-apps-and-tools-in-2020/#:~:text=DeepFaceLab.
[8] See id.
[9] See id.
[10] About, Deep Reckonings, http://www.deepreckonings.com/about.html (last visited Nov. 9, 2020) (depicting the likenesses of Mark Zuckerberg, Justice Brett Kavanaugh, and Alex Jones).
[11] Sassy Justice, YouTube (Oct. 26, 2020), https://youtu.be/9WfZuNceFDM; Dave Itzkoff, The ‘South Park’ Guys Break Down Their Viral Deepfake Video, N.Y. Times (Oct. 29, 2020), https://www.nytimes.com/2020/10/29/arts/television/sassy-justice-south-park-deepfake.html.
[12] Corinne Reichert, Congress Investigating Deepfakes After Doctored Pelosi Video, Report Says, CNet (June 4, 2019, 3:35 PM), https://www.cnet.com/news/congress-investigating-deepfakes-after-doctored-pelosi-video-report-says/.
[13] See Wallace Baine, Welcome to Deepfake Hell, Good Times (Nov. 26, 2019), https://goodtimes.sc/cover-stories/deepfake/.
[14] See Oscar Schwartz, You Thought Fake News Was Bad? Deep Fakes Are Where Truth Goes to Die, Guardian (Nov. 12, 2018, 5:00 AM), https://www.theguardian.com/technology/2018/nov/12/deep-fakes-fake-news-truth.
[15] Symposium, 21st Century-Style Truth Decay: Deep Fakes and the Challenge for Privacy, Free Expression, and National Security, 78 Md. L. Rev. 882, 885–86 (2019).
[16] See Baine, supra note 13.
[17] See David Notowitz, Deepfakes and the Growing Trend of Fabricated Video Evidence, Recorder (May 2, 2019, 11:50 AM), https://www.law.com/therecorder/2019/05/02/deepfakes-and-the-growing-trend-of-fabricated-video-evidence/.
[18] Id.
[19] Schwartz, supra note 14.
[20] See Andrew Beaujon, Trump Claims He Invented the Term “Fake News”—Here’s an Interview with the Guy Who Actually Helped Popularize It, Washingtonian (Oct. 2, 2019), https://www.washingtonian.com/2019/10/02/trump-claims-he-invented-the-term-fake-news-an-interview-with-the-guy-who-actually-helped-popularize-it/.
[21] Notowitz, supra note 17.
[22] Baine, supra note 13 (explaining that “the ‘uncanny valley effect’ . . . [defines that] the closer technology got to reality, the more dissonant small differences would appear to a sophisticated viewer”).
[23] Chesney & Citron, supra 4, at 1788.
[24] 3 Computer Contracts § 15.02(2)(p) (2020).
[25] See generally David Medine, The Constitutional Right to Expert Assistance for Indigents in Civil Cases, 41 Hastings L.J. 281, 303 (1990) (arguing that the Supreme Court should develop a “right to expert assistance based upon equal protection principles”).
[26] Chesney & Citron, supra note 4, at 1779.
[27] Rebecca Delfino, Pornographic Deepfakes: The Case for Federal Criminalization of Revenge Porn’s Next Tragic Act, 88 Fordham L. Rev. 887, 909–17 (2019).
[28] See id. at 904–10.
[29] See Chesney & Citron, supra note 4, at 1792–93.
[30] See, e.g., Fed. R. Evid. 1008 (instructing that “the court determines whether the proponent has fulfilled the factual conditions for admitting” evidence, rather than specifying very particular lists of requirements for each type of evidence).
[31] See Chevron U.S.A., Inc. v. NRDC, Inc., 467 U.S. 837, 844 (1984).
Image Source: “Security Cameras” by JeepersMedia is licensed under CC BY 2.0