Can I Be Protected Against Myself? Artificial Intelligence and Voice Replication

By Jarrid Outlaw

With recent advancements in artificial intelligence (“AI”), voice replication has become a simple process that anyone can access and use. Having one’s voice replicated to say anything an AI user wants is unsettling and can have sinister effects, and it is something citizens should be protected from. While everyday citizens are less likely to fall victim to such abuse, the mere existence of technology so fraught with the potential for violations means that legal disputes are bound to arise. Because celebrities and non-celebrities are treated differently under the law, this analysis focuses mainly on celebrities, the class of people with the potential to face the most abuse. These AI bots operate by mimicking an individual’s speech pattern and cadence after exposure to recordings of the person’s speech.[1] Though proponents of “voice cloning” AI bots say the bots can be used to propel marketing and advertising and to prolong partnerships, the ease with which these programs can emulate voices is concerning.[2] Anyone with computer access and the wrong intentions can pair a voice-cloning bot with a “deepfake” and create a full video of a speaker saying anything the user wants. Where can the line be drawn between a user exercising First Amendment rights to parody and a user subjecting themselves to a defamation cause of action?
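
To make the point about accessibility concrete, the sketch below shows roughly how few lines such a clone takes with one openly available toolkit. It assumes the open-source Coqui TTS library and its XTTS v2 model; the file names and sample text are hypothetical, and the sketch illustrates the general technique rather than any particular product discussed above.

    # Minimal voice-cloning sketch using the open-source Coqui TTS
    # library; model name, file paths, and text are illustrative
    # assumptions, not a reference to any specific service.
    from TTS.api import TTS

    # Load a pretrained multilingual voice-cloning model (XTTS v2).
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # A short recording of the target speaker is enough to condition
    # the model on that person's timbre and cadence.
    tts.tts_to_file(
        text="I never actually said this sentence.",
        speaker_wav="target_speaker_sample.wav",  # hypothetical recording
        language="en",
        file_path="cloned_output.wav",
    )

The salient design point is that the speaker is supplied at inference time as a few seconds of reference audio, with no retraining required, which is precisely what makes the technology so easy to misuse.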

The First Amendment protects the right to comment on or parody a person’s image.[3] Because of this right, celebrities cannot fully censor imitations of themselves. Under U.S. law, a voice is not “fixed,” and copyright protection extends only to “original works of authorship fixed in any tangible medium of expression.”[4] Under that standard, a voice-cloned recording, as an original work of authorship fixed in a tangible medium, may itself pass muster, while the underlying voice being imitated does not. Because copyright is therefore a poor vehicle for resolving abuse, defamation may supply the proper elements for a claim based on voice-cloning violations.

Defamation law guards against “acts of communication that tend to damage another’s reputation to the extent of lowering their regard in the community or deterring others from associating with them.”[5] A claim can take the form of libel, a written false statement, or slander, a spoken false statement.[6] Though it may not seem so at first, both libel and slander may apply to cases of voice-cloning abuse. Libel will generally cover video or audio published on the internet.[7] Slander, on the other hand, may also apply, since once created, the replicated voice speaks the false statements aloud. While case law on voice-cloning violations is practically nonexistent, the rise of AI means the question of whether slander applies should be answered soon enough.

With the rise of AI, the legal field will undoubtedly face a wave of new legislation to fight the abuses AI can create. Given the capabilities these systems already possess and the continued expansion of their knowledge, the idea that AI can be left to run untethered is wrong. Voice cloning represents yet another area of AI where abuse can run rampant. While existing law provides some refuge for victims of voice-cloning violations, new legislation or case law is needed to clarify the cloudier areas of AI.

Image Source: AI Voice Cloning – Customized AI Text-to-Speech – VEED.IO

[1] Bryn Wells-Edwards, What’s In a Voice: The Legal Implications of Voice Cloning, 64 Ariz. L. Rev. 1213, 1224 (2022).

[2] Id.

[3] Teresa Segalman, Hey, That’s My Voice! Can I Sue Them?, Gottlieb, Rackman & Reisman, P.C., https://grr.com/publications/hey-thats-my-voice-can-i-sue-them/.

[4] 17 U.S.C. § 102(a).

[5] Wells-Edwards, supra note 1.

[6] Id.

[7] Id.