By Brianna Hughes




The internet and social media platforms have become a staple of many people’s everyday lives.[1] Most people with internet access can share information, ideas, and other content more easily than ever before and interact with those who are like-minded.[2] While this dissemination of information can be positive, social media platforms face the problem of users sharing content that is objectionable or illegal.[3] Platforms attempt to combat this phenomenon by imposing guidelines that prohibit sharing certain types of content.[4] These limitations can include bans on hate speech, harassment, and revenge pornography.[5] Congress highlighted the importance of shielding online providers from liability for what their users post by enacting the Communications Decency Act (“CDA”).[6] The CDA, 47 U.S.C. § 230(c)(1), states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[7] While this provides platforms with significant immunity from liability for most of their users’ posts, a platform that itself violates a criminal statute may still be subject to liability.[8]

The parents of victims of child pornography attempted to impose civil liability on Reddit, one of the most widely used online discussion platforms, but were unable to do so.[9] Reddit is a social media platform that allows its users to post content publicly and to participate in forums devoted to specific topics, called subreddits.[10] Reddit retains the power to remove moderators and content that do not conform to its policies.[11] The parents found sexually explicit photos of their children on the website. When the parents reported the images, Reddit took them down; however, the parents then found themselves in a cycle in which the images would be reposted.[12] The plaintiffs argued that Reddit earned substantial revenue from these explicit subreddits and did little to protect those being exploited.[13]

The parents sued pursuant to 18 U.S.C. § 1595, which states that “an individual who is a victim… may bring a civil action against the perpetrator (or whoever knowingly benefits, financially or by receiving anything of value from participation in a venture which that person knew or should have known has engaged in an act…) in an appropriate district court.”[14] Reddit claimed it was shielded from liability under section 230 of the CDA.[15] Section 230 has since been amended by the Fight Online Sex Trafficking Act to allow victims of trafficking to bring civil lawsuits against platforms that helped traffickers.[16] The district court dismissed the parents’ claim, and the Ninth Circuit affirmed.[17]

While the CDA shields Reddit from liability based on its users’ posts, it does not provide immunity if the platform’s conduct violates 18 U.S.C. § 1595.[18] However, the defendant must have actual knowledge of the trafficking and must “assist, support, or facilitate” the trafficking venture.[19] This knowledge standard creates a high bar for imposing liability on social media platforms.[20] In this “actual knowledge” analysis, the court looks to the defendant website’s “participation in the venture” and whether it knowingly benefited from participating in child sex trafficking.[21] “Mere association with sex traffickers is insufficient absent some knowing ‘participation’ in the form of assistance.”[22] The plaintiffs in this case failed to show that Reddit knowingly participated in or benefited from a sex trafficking venture; at most, Reddit “turned a blind eye,” which is not enough for the court to impose liability on the platform.[23] The court therefore held that Reddit had not knowingly benefited from knowingly facilitating sex trafficking.[24]

The high bar set by the “knowing” standard makes it difficult for victims of child pornography and revenge pornography to obtain the remedy they seek.[25] Future litigants in cases like these may have to frame their claims around elements common to civil torts, such as intentional infliction of emotional distress, defamation, or breach of privacy.[26] For claims that do not involve intentional torts to succeed, plaintiffs must demonstrate that the platform “materially contributed to the illicit nature of the content by showing that they did more than passively transmit information from third parties.”[27] The CDA presents obstacles for parties seeking redress, but those obstacles are not impossible to overcome.[28]




[1] Matthew P. Hooker, Censorship, Free Speech & Facebook: Applying the First Amendment to Social Media Platforms Via the Public Function Exception, 15 Wash. J.L. Tech. & Arts 36, 39 (2019).

[2] See id. at 39-40.

[3] Id. at 42.

[4] Id.

[5] Id.

[6] Id. at 55.

[7] 47 U.S.C § 230(c)(1).

[8] Reddit Child Porn Suit Escape Under Section 230 Affirmed (1), Bloomberg Law (Oct. 24, 2022, 3:35 PM),

[9] See Does v. Reddit, Inc., 2022 U.S. App. LEXIS 29510, at *3 (9th Cir. 2022).

[10] Id. at *4.

[11] Id.

[12] Id. at *4-5.

[13] Id. at *5-6.

[14] 18 U.S.C. § 1595.

[15] Reddit, Inc., 2022 U.S. App. LEXIS 29510, at *6.

[16] Bloomberg Law, supra note 8.

[17] Id.

[18] Reddit, Inc., 2022 U.S. App. LEXIS 29510, at *7.

[19] Id. at *9.

[20] Bloomberg Law, supra note 8.

[21] Reddit, Inc., 2022 U.S. App. LEXIS 29510, at *12.

[22] Id. at *20.

[23] Id.

[24] Id. at *21.

[25] Jessy R. Nations, Revenge Porn and Narrowing the CDA: Litigating a Web-Based Tort in Washington, 12 Wash. J.L. Tech. & Arts 189, 200 (2017).

[26] See id. at 192-95.

[27] Id. at 200.

[28] Id. at 209.
