Deepfakes: Navigating Legal Challenges

By Moses Hutchinson

Imagine a courtroom where the authenticity of every audio and video recording is suspect. Deepfakes force the legal field to grapple with a fundamental question: can our courts adapt to this new reality?

What are Deepfakes?

The term “deepfake” combines “deep learning” and “fake.” [1] It involves manipulating or fabricating audio, video, or other digital content to make it appear as if someone said or did things they never did. [2] Essentially, a deepfake is “synthesized when [a] video creator takes an image or likeness of one person’s face and . . . replaces it with another face or body using an artificially intelligent facial recognition algorithm.” [3] The most widely used approach, the Generative Adversarial Network (GAN), is a machine learning technique consisting of two neural networks: a generator and a discriminator. Simply put, the generator produces fake content while the discriminator assesses its quality, and after many such cycles the contest yields hyper-realistic outputs. [4]
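To make the adversarial loop concrete, here is a deliberately tiny, illustrative sketch in Python. It is not code from any actual deepfake tool, and every name and hyperparameter below is invented for illustration: the “real content” is just numbers drawn from a Gaussian, the generator is a single learned offset applied to noise, and the discriminator is a one-variable logistic classifier. The point is only to show the back-and-forth: the discriminator learns to separate real from fake, and the generator learns to fool it.

```python
import numpy as np

# Toy 1-D "GAN" sketch (illustrative only). Real data: samples from N(4, 0.5).
# Generator: G(z) = z + theta, a learned offset added to Gaussian noise.
# Discriminator: D(x) = sigmoid(w*x + b), a logistic classifier.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w, b = 0.1, 0.0          # discriminator parameters
theta = 0.0              # generator parameter (offset)
lr_d, lr_g = 0.05, 0.05  # learning rates
batch = 64

for step in range(2000):
    real = rng.normal(4.0, 0.5, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = z + theta

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0
    # by gradient descent on -log D(real) - log(1 - D(fake)).
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_b = np.mean(-(1 - d_real) + d_fake)
    w -= lr_d * grad_w
    b -= lr_d * grad_b

    # Generator update: push D(fake) toward 1 by gradient descent
    # on the non-saturating loss -log D(G(z)).
    d_fake = sigmoid(w * fake + b)
    grad_theta = np.mean(-(1 - d_fake) * w)
    theta -= lr_g * grad_theta

# After training, theta has drifted toward 4, so fake samples
# resemble the real data the discriminator was trained on.
```

Real deepfake generators operate on images with deep convolutional networks rather than a single offset, but the training dynamic, two models improving by competing, is the same one this sketch runs.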

 

Deepfakes: Test Your Detection Abilities

To illustrate the alarming capabilities of deepfakes, test your detection abilities with the quiz below.

Quiz: https://youtu.be/qj5LqiSabP4?si=ZXrUatQaf5NhfMRw&t=316

Answers: https://youtu.be/VtW9la3tKBo?si=64V0l4BUKnV-kkrn&t=56

 

How did you do? After that same test was published in a newspaper, only 54.6% of participants accurately detected the deepfakes, odds only slightly better than a coin flip. [5] If you were successful, don’t get overconfident quite yet. Another study showed that “people overestimate their own deepfake detection abilities.” [6]

 

Current and Potential Legal Challenges

With deepfakes already proving difficult to detect, experts predict that completely indistinguishable deepfakes are not far from reality. [7] William & Mary Law School Chancellor Professor Fred Lederer noted, “[a]s technologies advance, deepfakes will become more convincing.” [8] In fact, they are already so convincing that some courts have heard deepfake arguments challenging admitted evidence. [9]

For example, following a fatal car accident involving a self-driving Tesla, the plaintiff offered video evidence of Elon Musk touting the capabilities of Tesla’s self-driving cars. In response, Musk’s attorneys argued the video could be a deepfake. To resolve the dispute, the court ordered Musk to address the video’s validity in a deposition. [10] As deepfakes permeate the legal field, judges must decide “[a]m I going to admit this evidence or not?” Further, this dilemma will put “pressure on the judges on how to authenticate the evidence.” [11]

Yet the admissibility of evidence is just one facet of the complex challenges deepfakes pose. For instance, criminals have already used deepfakes to disrupt court security, mimicking IT staff or court administrators with synthesized voices in attempts to obtain court employees’ passwords. [12] Moreover, parties may soon require costly expert analysis to determine whether evidence has been manipulated. Even judicial efficiency could suffer if deepfake arguments become a routine objection to any digital evidence.

As the Tesla case involving Elon Musk shows, the “deepfake defense” immediately delays the outcome of a case. Judge Evette Pennypacker called Tesla’s arguments “deeply troubling.” [13] Without comprehensive legislation providing how to combat deepfakes [14], legal battles may hinder everything from movie productions to elections. [15] Finally, deepfakes may expose attorneys who unknowingly present synthesized evidence in court to potential ethical violations, specifically under Model Rules 1.1 (Competence) and 8.4 (Misconduct). [16]

 

Conclusion

The rise of deepfakes presents a profound challenge to the legal community. As our ability to detect these manipulated audio and video recordings remains far from perfect, the authenticity of such recordings becomes increasingly suspect. The likelihood of indistinguishable deepfakes on the horizon raises concerns about evidence admissibility, judicial efficiency, ethical implications, and more. With technology continually advancing, the legal field must grapple with whether our courts can adapt to this new reality.


[1] What the Heck is a Deepfake?, Information Security at UVA, https://security.virginia.edu/deepfakes (last visited Oct. 23, 2023).

[2] Shannon Bond, People are Trying to Claim Real Videos are Deepfakes. The Courts are Not Amused, NPR (May 8, 2023, 5:01 AM), https://www.npr.org/2023/05/08/1174132413/people-are-trying-to-claim-real-videos-are-deepfakes-the-courts-are-not-amused.

[3] Molly Mullen, Article, A New Reality: Deepfake Technology and the World Around Us, 48 Mitchell Hamline L. Rev. 210, 212 (2022).

[4] Eric Kocsis, Comment, Deepfakes, Shallowfakes, and the Need for a Private Right of Action, 126 Dick. L. Rev. 621, 626 (2022) (citing Bobby Chesney and Danielle Citron, Article, Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security, 107 Calif. L. Rev. 1753, 1760 (2019)).

[5] Light Catching Toby, Can You Detect Deep Fakes? Quiz Results (Part 2), YouTube (March 25, 2021), https://www.youtube.com/watch?v=VtW9la3tKBo.

[6] Nils C. Köbis, Barbora Doležalová, and Ivan Soraperra, Fooled Twice: People Cannot Detect Deepfakes But Think They Can, Nov. 19, 2021, PMCID: PMC8602050, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8602050/#bib60.

[7] Rob Toews, Deepfakes are Going to Wreak Havoc on Society. We are Not Prepared, Forbes (May 25, 2020, 11:54 PM), https://www.forbes.com/sites/robtoews/2020/05/25/deepfakes-are-going-to-wreak-havoc-on-society-we-are-not-prepared.

[8] Jule Pattison-Gordon, Courts Consider the Coming Risk of Deepfake Evidence, Gov. Tech. (Sep. 14, 2023), https://www.govtech.com/products/courts-consider-the-coming-risks-of-deepfake-evidence#:~:text=Deepfakes%20aren%27t%20only%20a,the%20voices%20of%20specific%20individuals.

[9] Pittman v. Commw., 2023 Va. App. LEXIS 255 (2023); Valenti v. Dfinity USA Rsch. LLC, 2023 U.S. Dist. LEXIS 80222 (2023).

[10] Malathi Nayak and Sean O’Kane, Elon Musk’s Lawyers Argue Recordings of Him Touting Tesla Autopilot Safety Could be Deepfakes, So a Judge is Bringing Him in to Clarify in Testimony, Fortune (April 27, 2023, 11:17 AM), https://fortune.com/2023/04/27/elon-musk-lawyers-argue-recordings-of-him-touting-tesla-autopilot-safety-could-be-deepfakes/#; Peter Blumberg, Tesla Judge Slams ‘Deep Fake’ Detour in Autopilot Case (1), Bloomberg Law (April 27, 2023, 3:27 PM), https://news.bloomberglaw.com/esg/musk-likely-must-give-deposition-in-fatal-autopilot-crash-suit.

[11] Isha Marathe, Deepfakes Are Coming to Courts. Are Judges, Juries and Lawyers Ready?, Legaltech News (May 26, 2023, 1:53 PM), https://www.law.com/legaltechnews/2023/05/26/deepfakes-are-coming-to-courts-are-judges-juries-and-lawyers-ready/#:~:text=Tiedrich%20pointed%20out%20that%20deepfake,any%20civil%20or%20criminal%20proceeding.

[12] Jule Pattison-Gordon, supra note 8.

[13] Hyunjoo Jin and Dan Levine, Elon, or Deepfake? Musk Must Face Question on Autopilot Statements, Reuters (Apr. 26, 2023, 11:39 PM), https://www.reuters.com/legal/elon-or-deepfake-musk-must-face-questions-autopilot-statements-2023-04-26/.

[14] Caroline Quirk, The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology, Princeton Leg. J., June 19, 2023.

[15] Id.; Eriq Gardner, Deepfakes Pose Increasing Legal and Ethical Issues for Hollywood, The Hollywood Reporter (July 12, 2019, 6:00 AM), https://www.hollywoodreporter.com/business/business-news/deepfakes-pose-increasing-legal-ethical-issues-hollywood-1222978/.

[16] Model Rules of Prof’l Conduct R. 1.1, 8.4 (2023).


Image Source: https://images.pexels.com/photos/9685285/pexels-photo-9685285.jpeg?auto=compress&cs=tinysrgb&w=1260&h=750&dpr=1