Volume VI, Issue 3

Winter 1999-2000


You Can't Always Get What You Want:

Government's Good Intentions v. The First Amendment's Prescribed Freedoms in Protecting Children From Sexually-Explicit Material on the Internet


Abbigale E. Bricker[*]

Cite As: Abbigale E. Bricker, Note, You Can't Always Get What You Want: Government's Good Intentions v. The First Amendment's Prescribed Freedoms in Protecting Children From Sexually-Explicit Material on the Internet, 6 RICH. J.L. & TECH. 17 (Winter 1999-2000) <http://www.richmond.edu/jolt/v6i3/note5.html>. [**]

Table of Contents:

I. Introduction

II. Historical Background

A. Traditional Obscenity Law

1. Free Speech Develops Limits

2. "Obscenity" and "Indecency" Defined

B. Obscenity Law Endeavors to Adjust to Emerging Technological Innovations

C. Problems in Regulating the Internet

1. Communications Decency Act of 1996

2. ACLU v. Reno ["Reno I"]

3. The Aftermath of Reno I: The COPA

III. Statement of Case

A. Procedural History

B. Facts

C. Holding

D. Court's Rationale

IV. Analysis and Implications

A. Direction to the Government for Perfecting Legislation to Meet Its Objectives

B. Available Alternatives to the COPA

1. Will Any Internet Regulation Survive the First Amendment?

C. How the Decisions of Reno I and II Will Bear Upon Proposed Legislation of the 106th Congress

1. Congress Attempts to Legislate Since Reno II

2. Questionable Attempts to Regulate the Internet in the 106th Congress

3. Court Interpretations of Similar State Actions

D. Other Alternatives to Protect Children

V. Conclusion


I. Introduction

{1}Once a small and diverse community of a handful of government computers, the Internet has expanded to an estimated 157 million users worldwide.[1] According to current studies, the fastest growing user populations on the Internet are thirteen to eighteen year-olds and five to twelve year-olds.[2] In addition, the latest "research . . . predicts that the number of children online [will increase] by 155% between 1998 and 2002."[3]

{2}Because online pornography is the third largest earner of cyber sales, and an estimated thirty-nine new pornographic sites appear on the Internet daily, the potential for children to be exposed to objectionable material online is unquestionable.[4] That opportunity is made even easier when promoters of such material make it easy for anyone to view their sites;[5] some even purposefully use deceptive means to trick children into visiting their websites.[6]

{3}A number of parent groups and commercial organizations have become alarmed at the widespread availability and accessibility of sexually-explicit materials online, and have worked to create blocking and filtering software for households with children, in addition to providing parents with information on alternative methods of regulating the types of material that children view online.[7] This software has been created in hopes that parents will both observe and control what their children encounter on the Internet, in much the same ways that parents do with television.[8] The problem arises when the government involves itself in this sensitive area, in the interest of protecting the nation's children, and imposes restrictions on access to websites which communicate constitutionally-protected speech. The question is: who has the ultimate responsibility to regulate what material children can access online?

{4}This case note contends that Congress needs to consider our fundamental First Amendment rights more carefully when preparing legislation that ultimately burdens free speech. It argues that the problem of inappropriate content on the Internet is best addressed by parents, who are the better authority over what is acceptable for their children. The contrast between the ever-growing number of resources available to parents and the burdensome, cost-prohibitive requirements that Internet regulations impose on communications demonstrates that Internet supervision is best left in the hands of parents. Further, this case note proposes that Congress use its legislative authority to protect children in ways that do not obstruct free speech, by targeting those who prey on children or those who intend to harm children.

{5}Part II of this note examines the development of traditional obscenity law, tracing the initial emergence of limitations on free speech and the ways in which obscenity laws have adjusted to emerging technological innovations. Part II also discusses the problems that have materialized as a result of the newest medium of communication: the Internet. Part III introduces the facts, holding, and reasoning behind the court's decision in ACLU v. Reno.[9] Part IV discusses the results of the Supreme Court's decision in Reno II; namely, the restrictions that have been set forth for subsequent attempts at regulation, after the failure of the Child Online Protection Act ("COPA").[10] It also discusses available alternatives to the Child Online Protection Act, how the decisions of Reno I and Reno II will bear upon proposed legislation of the 106th Congress, and other alternatives to protect children. Finally, Part V provides a conclusion to this case note.


II. Historical Background

A. Traditional Obscenity Law

1. Free Speech Develops Limits

{6}The first legislative provision prohibiting obscene communications was written in the Communications Act of 1934; this same provision was subsequently incorporated into 18 U.S.C. § 1464, which, to this day, prohibits the "utterance of any 'obscene, indecent, or profane language'" over radio communications.[11] Naturally, First Amendment concerns are raised when particular types of speech are either regulated or altogether prohibited. Through numerous challenges to such legislation, courts have drawn the fine line between protected and unprotected speech.

{7}The First Amendment does not protect all speech.[12] The court has demarcated generally-offensive speech into distinct categories, deciding what will and will not be protected based on the content of that communication.[13] Generally, when the government has a compelling interest in regulating a particular type of speech, and that particular speech is considered to have low social value, the First Amendment will not protect it.[14] Specifically, the Supreme Court has held that the First Amendment will not protect obscene speech.[15]

2. "Obscenity" and "Indecency" Defined

{8}The current definition of "obscene" was established by the Supreme Court in Miller v. California[16] with a three-part test. The Court decided that a work would be considered obscene if the average person, applying contemporary community standards, would find the work (a) to appeal to a prurient interest in sex; (b) to depict or describe, in a patently offensive manner, sexual conduct specifically defined by the applicable state law; and (c) when taken as a whole, to lack serious literary, artistic, political, or scientific value.[17]

{9}Obscenity can be differentiated from indecency when a communication fails to conform with accepted morality, but lacks prurient appeal.[18] The fact that certain objectionable communications do not meet the definition of "obscene," however, does not guarantee that such speech can exist free of governmental interference.[19] Communications which are "indecent," but not "obscene," may be regulated by the government, but only in a limited manner. Despite First Amendment protections, 18 U.S.C. § 1464 entitles the Federal Communications Commission to regulate "indecent" radio communications after the broadcast has been made, in connection with renewal of a broadcaster's license.[20] Generally, indecent communications will be regulated only when the context in which such communications were made indicates a need for a stricter standard of regulation.[21]

{10}However, the courts have held that a total prohibition of indecent interstate commercial telephone communications violates the First Amendment.[22] While some "indecent" communications would be subject to scrutiny, the determinative factor is, again, the context in which the communications were made. In allowing this medium of communication much greater free-speech protections, the Supreme Court stated that the prohibition on indecent commercial telephone conversations was different for a number of reasons.[23] Primarily, those who receive these commercial telephone messages must take affirmative steps in order to obtain the communication, whereas turning on a radio and hearing a broadcast are far less affirmative acts.[24]

{11}Indeed, First Amendment rights to access otherwise protected expression are trumped by the government's interest in the "well-being of its youth."[25] This interest is so great that the State does have the authority to regulate material that is harmful to minors, even when it would not have the authority to regulate this material as to adults.[26]

B. Obscenity Law Endeavors to Adjust to Emerging Technological Innovations

{12}Despite case law clarification regarding the transmission of obscenity and indecency as it adapts to emerging technology, the Internet as a communications medium poses unique problems which are not explicitly covered by traditional obscenity law. Like the telephone, the most-protected medium of communication, the Internet requires users to take several intentional actions in order to connect online.[27] However, like television or radio broadcasts, two of the least-protected media of speech, the Internet allows users to come inadvertently across objectionable websites while searching for unrelated information.[28] Taking on characteristics of both, the Internet escapes traditional determinations of scrutiny.

{13}The need for specific content regulation of the Internet was made even more clear in United States v. Thomas,[29] one of the first Internet cases involving an obscenity conviction.[30] Under a federal statute prohibiting the transportation of obscenity in interstate commerce, the government charged that the Thomases, through the operation of their computer bulletin board, violated federal obscenity laws.[31] In order to convict the Thomases, the court had to "stretch" the meaning of the statute, which prohibited the "transportation" in interstate commerce of obscene books, pictures, films, writings, images, or any "other matter."[32] However, as the Thomases argued, the nature of Internet commerce is not easily adapted to traditional laws, since what is transported within Internet communications is a series of codes and numbers, none of which could itself be considered "obscene."[33] The result of Thomas was that a federal legislative provision, originally written in 1934, was used to accommodate the prohibition of obscenity on the Internet.[34] Furthermore, while this decision arguably regulated "obscene" materials transmitted through interstate commerce, it said nothing about Internet communications which might fall below the "obscenity" standard, but would still be considered "indecent" or "harmful to minors." The government's interest in protecting children, in conjunction with the unanswered questions raised by Thomas, compelled Congress to address the issue specifically through legislation.[35]

C. Problems in Regulating the Internet

1. Communications Decency Act of 1996[36]

{14}Passed in February 1996, the Communications Decency Act of 1996 ("CDA") amended Chapter 5 of Title 47, which regulates common carriers of communications through wire or radio. To the list of prohibited interstate communications, section 223(a) added any "indecent" Internet communications knowingly made accessible to minors.[37] Section 223(d) of the Act, however, subjected to punishment any person who knowingly used interactive computer services to publish "patently offensive" materials, as determined by contemporary community standards, irrespective of who initiated the communication.[38] Further, section 223(e) of the CDA provided affirmative defenses, which limited the liability of a number of related parties.[39] Primarily, those lacking knowledge of the publication were provided with a defense. In addition, publishers of sexually-explicit materials on the Internet also had a defense to CDA prosecution if the publisher took good faith "reasonable, effective, and appropriate actions under the circumstances to restrict or prevent access by minors to [the prohibited communications]," which amounted to implementing restrictive features, such as access codes, credit card numbers, or other age verification measures.[40]

2. ACLU v. Reno [41]

{15}On the day the CDA was signed into law, various businesses, libraries, non-commercial and not-for-profit organizations, and educational societies and consortia, all in some way associated with groups that publish materials on the Internet, challenged the constitutionality of portions of the CDA.[42] The plaintiffs argued that two sections of the CDA, sections 223(a) and 223(d), were entirely too vague and violated the First Amendment freedom of speech and the Fifth Amendment due process rights of Internet publishers.[43]

{16}In declaring the CDA unconstitutional, the court held that the terms "patently offensive" and "indecent" were unconstitutionally vague, and that the CDA was too broad in its regulatory scope.[44] The CDA did not adequately define "indecent" or "patently offensive" in a way that would limit its applicability to material that is of a "prurient" nature, and instead, imposed restrictions on the publication of materials which might be of serious value to individuals.[45] In addition, the Court was concerned that the CDA had enormous potential to regulate and impose criminal sanctions upon publishers of material that was "sexually-explicit," but not patently obscene, thereby banning constitutionally-protected speech.[46]

{17}The current technological infeasibility of screening for age would result in a chilling of free speech because publishers would be limited to publishing information that was appropriate for all ages out of fear of criminal prosecution.[47] In addition, the Court decided that the CDA's affirmative defenses provided under the statute were unable to save the Act from constitutional challenge.[48] In defense of its decision, the Court gave weight to the availability and successful results of at-home software packages which parents could purchase in order to censor the websites children would be able to access from home.[49]

{18}After the United States Supreme Court held that the CDA was an unconstitutional abridgement of free speech, Congress made another attempt at regulating Internet content with the Child Online Protection Act.[50] With the COPA, Congress intended to address the problem of children's inadvertent access to sexually-explicit material on the Internet, rather than to material which was deliberately accessed.[51]

3. The Aftermath of Reno I: The COPA

{19}In light of the decision in Reno I, Congress attempted to cure the defects of the CDA by limiting the scope of materials that would be prohibited by the COPA. Instead of prohibiting "patently offensive" material or "indecent" material over the Internet, the COPA prohibits the knowing communication for commercial purposes of "any material that is harmful to minors."[52] The COPA, unlike the CDA, provides only one affirmative defense: an Internet publisher of materials "harmful to minors" will escape prosecution if he "in good faith, has restricted access by minors to material that is harmful to minors."[53] The statute outlines ways in which an Internet publisher may restrict access to his website: first, by requiring use of a credit card or access code; second, by accepting a "digital certificate" to determine age; or third, by taking any other reasonably feasible measures.[54]

{20}In addition to providing affirmative defenses within the COPA, subsection (b) makes the COPA inapplicable to Internet access providers, publishers of Internet search engines, or persons who are "similarly engaged in the transmission, storage, retrieval, hosting, formatting, or translation (or any combination thereof) of a communication made by another person, without selection or alteration of the content of the communication."[55]


III. Statement of Case

A. Procedural History

{21}After the CDA was determined to be violative of the First Amendment in Reno I, Congress attempted to remedy its constitutional defects with the Child Online Protection Act.[56] Ten days before the COPA was to go into effect, plaintiffs similar to those litigating in Reno I challenged the constitutionality of the statute in the United States District Court for the Eastern District of Pennsylvania, seeking injunctive relief.[57] Although the court and the parties considered consolidating the hearing on the motion with a trial on the merits, the court ultimately decided to proceed only on the motion for a preliminary injunction.[58] On November 20, 1998, the district court granted a temporary restraining order, enjoining enforcement of the COPA until December 4, 1998. The defendant later agreed to extend the temporary restraining order until February 1, 1999. In addition, the defendant filed a motion to dismiss for lack of standing under Federal Rule of Civil Procedure 12(b)(1).[59] On February 1, 1999, the court considered this motion, in addition to the question of whether to issue a preliminary injunction against enforcement of the COPA.

B. Facts

{22}The plaintiffs attacked the COPA for placing an unconstitutional burden on adults' access to protected speech, for violating the First Amendment rights of minors, and for being unconstitutionally vague, in violation of the First and Fifth Amendments.[60] Plaintiffs argued that the direct ban of speech, the statutory availability of only affirmative defenses, and the overbroad wording of the statute all enable the COPA to unconstitutionally burden free speech.[61] Specifically, plaintiffs argued that the affirmative defenses within the statute required implementation of complicated and invasive security measures on websites, which in turn required users to provide personal information, resulting in loss of users to such websites. Furthermore, plaintiffs argued that the governmental ends are not sufficiently compelling to justify the means by which the COPA attempts to accomplish them. Finally, the plaintiffs contended that the statute, even if applied to a more narrow class of speakers, regulates more speech than can be constitutionally justified, due to its vague language.[62]

{23}The defendant argued that the requirements of the statute did not burden adults' ability to access constitutionally-protected speech, and that the affirmative defenses were technologically and economically feasible methods to restrict minors' access to the targeted websites.[63] Furthermore, the defendant argued that the statute was not overbroad, and that it did not restrict access by minors to any of the plaintiffs' websites.[64] The defendant's motion to dismiss for lack of standing was based on the fact that plaintiffs' webpages were not affected by the statute. Rather, defendant argued that the COPA targeted those who "distribute harmful material to minors 'as a regular course' of their business."[65] Finally, the defendant contended that plaintiffs were unable to demonstrate a likelihood of success on the merits of their claims and that the plaintiffs were unable to substantiate the claim of irreparable harm.[66]

C. Holding

{24}The district court granted the preliminary injunction against the government, to remain in effect until completion of a final trial on the merits.[67] The court made this determination based on findings that the plaintiffs' likelihood of success on the merits, the potential for irreparable harm, and the balance of interests all weighed in favor of granting the preliminary injunction.[68]

{25}The court applied strict scrutiny to determine the constitutional challenge's likelihood of success on the merits based on the COPA's potential to drive particular types of speech from the marketplace entirely.[69] As in Reno I, the court concluded that limiting online content to what is appropriate for children would burden free speech and infringe upon the constitutional rights of adults.[70]

D. Court's Rationale

{26}The plaintiffs presented substantial evidence regarding the burdens of becoming a registered user of a particular website, the chilling effect that pass codes and other verification systems might impose upon adult users of the restricted websites, and the burden of additional costs and lost revenues.[71] The court agreed that the age verification technology available to Internet publishers was technologically and economically burdensome on the publishers.[72] Moreover, in many cases such measures were no more certain to protect children than widely-available commercial filtering software.[73]

{27}Although defendants argued that the COPA targeted commercial pornographers by limiting its scope to "commercial" transactions, the court found that the language of the COPA did not in fact apply only to commercial pornographers.[74] Instead, the court found that the COPA's definition of "commercial purposes" was over-inclusive; first, because it did not apply only to those who make a profit from prohibited materials; and second, because it applied to "commercial" websites which included any "harmful to minors" material, regardless of the ratio of harmful to non-harmful materials on that site.[75]

{28}Furthermore, the court found that the government had not used the least restrictive means to further its interest in protecting children from inappropriate content.[76] The court found that the COPA was entirely too broad to effectively protect children from harmful websites, while affording adults their constitutional rights to both publish and view the material.[77]

{29}The court made a preliminary finding that, even in light of the compelling governmental interest to protect children from viewing sexually-explicit materials, the language in the COPA was not sufficiently narrowly tailored.[78] Specifically, the "sweeping category" of forms of content prohibited by the statute easily included material not typically used by children to achieve inadvertent access to sexually-explicit websites.[79] The court noted that the forms of content prohibited were over-inclusive, even in light of the government's stated objective, and that these objectives could be better served if the statute limited the scope of prohibited content to graphics and pictures, which are the common format of the pornographic websites the statute intended to prohibit minors from accessing.[80] Finally, the court stated that the penalties imposed by the COPA were "excessive" and suggested that the government's objectives might be better served by incorporating the affirmative defenses provided as elements of the crime.[81]


IV. Analysis and Implications

A. Direction to the Government for Perfecting Legislation to Meet Its Objectives

{30}In Reno III, the court granted a preliminary injunction against the enforcement of the COPA, finding that the plaintiffs had established a substantial likelihood of success on the merits of their claims.[82] The court made an introductory finding that the COPA, like the CDA, was still not sufficiently narrow to achieve the stated governmental interest in protecting children from harmful materials on the Internet, when considered in opposition to the constitutional right to free speech.[83] According to this court, the COPA's over-inclusive list of prohibited content was too easily construed to require Internet publishers to incorporate age-restrictive mechanisms into a great number of sites that the government never stated an interest in regulating.[84]

{31}Although Thomas demonstrated to Congress that legislation specifically geared toward regulating the Internet was necessary, both legislative attempts to do so have thus far failed. The courts indicated serious defects in both the CDA and the COPA, many of which overlapped.

{32}With the COPA, Congress specifically changed the description of prohibited material from "indecent" and "patently offensive,"[85] as in the CDA, to one uniform term: "harmful to minors."[86] But where the CDA's description of prohibited communications was vague, the COPA merely traded one defect for another by incorporating overbroad language. First, the category of prohibited materials was far more inclusive than the government's stated objective required. Second, the COPA failed to limit its applicability so as to impose liability on commercial pornographers only.[87] The result was an unconstitutional infringement of First Amendment rights; as such, the COPA ultimately cannot survive the constitutional challenge.

B. Available Alternatives to the COPA

1. Will Any Internet Regulation Survive the First Amendment?

{33}The message sent back to Congress by Reno I and Reno II was that First Amendment freedoms are fundamental, no matter how worthy the cause underlying such legislation.[88] The district court's discussion of the capabilities and usefulness of "blocking or filtering software" indicated the level of First Amendment intrusion the court would be willing to accept in order to protect children: virtually none.[89] The court correctly asserted that parents have the ultimate responsibility to monitor and regulate children's Internet use, and that blocking and filtering software is the least restrictive means of shielding children from inappropriate material.[90]

{34}Since enacting the COPA, Congress has demonstrated that there are appropriate, albeit limited, ways in which the Internet can be governmentally regulated to protect children. Rather than unduly burdening certain constitutionally-protected communications and publications, the newest government legislation focuses on preventing the most serious threats to children on the Internet: predators of children and those who intend to harm children.[91]

{35}While both the CDA and the COPA failed for encroaching upon constitutionally-guaranteed freedoms, another piece of legislation which endeavors to protect children using the Internet has thus far escaped challenge. The Children's Online Privacy Protection Act of 1998 ("COPPA"), signed into law on October 21, 1998, two days after the COPA, essentially directs the Federal Trade Commission ("FTC") to promulgate rules concerning the collection and use of personal information about children on the Internet.[92] COPPA itself authorizes the FTC to regulate much more than it chooses to with the present proposed rules.[93]

{36}Although the FTC's regulations under COPPA do not address children's access to sexually-explicit websites, they serve an important purpose. The safety of children is ultimately protected by such regulations, as the Internet is "a medium in which children can be placed at risk" when those who collect personal data about children disseminate it to third parties, some of whom would harm children, unbeknownst to the children or their parents.[94]

{37}The FTC's proposed rules essentially regulate the activities of operators whose websites are directed to children.[95] If the operator of a website directed towards children collects personal information from a child under thirteen years old, he must comply with a number of requirements.[96] First, he must have verifiable parental consent to the collection of data; second, he must provide on that website information about what types of data are collected from children, how this information is used, and to whom this information is disseminated; third, he must provide a means for parents to review the information collected from the child; fourth, he may not require more disclosure than is "reasonably necessary" for a child to participate in a game or other activity; and, finally, he must undertake "reasonable procedures" to protect the information obtained from children.[97] The FTC's regulations do not appear to have the same constitutional defects that both the CDA and the COPA suffered. These rules do not burden the diffusion of speech or the ability to access such speech,[98] but instead burden the collection of data.[99]

{38}If enacted, the FTC's proposed regulations would affect only a small fraction of the websites that collect information from children because the regulations are limited to "online services directed toward children."[100] According to the proposed FTC regulations, an "online service[] directed to children" is defined as "a commercial or online service, or portion thereof, that is targeted to children."[101] Further, the FTC may determine whether a site is "directed towards children" based on its subject matter, audio and visual content, advertising, language used, and other characteristics of the website that would lead the Commission to conclude that the site is targeted to children.[102]

{39}But what of those websites not specifically targeted to children? Websites that are frequently, or almost exclusively, visited by children for research or recreational purposes, such as webpages about dinosaurs, games, or Beanie Babies, for example, might not be specifically "targeted towards children," but do, in fact, receive a great many underage visitors. Operators of such sites could easily collect information on children without violating the proposed FTC rules, even if all of the personal data they collect is from and about children. Concerns of this sort have prompted members of the 106th Congress to introduce a number of bills that endeavor to help children by regulating broader categories of activities, such as the sale of personal data collected from children. Unfortunately, the government's good intentions must withstand the guarantees of the First Amendment as espoused in Reno I and II.

C. How the Decisions of Reno I and II Will Bear Upon Proposed Legislation of the 106th Congress

1. Congress Attempts to Legislate Since Reno II

{40}Legislation introduced in the 106th Congress expands upon this method of protecting the privacy of children on the Internet and goes further to limit the ways in which data can be collected and used. In January 1999, the House of Representatives introduced H.R. 369, the Children's Privacy Protection and Parental Empowerment Act of 1999 [103] ("Children's Privacy Protection Act"). H.R. 369 imposes criminal sanctions for receiving or disseminating personal information about children in a manner inconsistent with the Act.[104] The Act levies fines or imposes jail sentences upon whoever knowingly "sells, purchases, or receives remuneration for providing personal information about a child" without the written consent of that child's parents.[105] Further, H.R. 369 requires that those who obtain and use information collected about children to solicit commercial products and services must, upon request, provide the parent with the following information: (1) the information obtained; (2) the source of that information; and (3) the persons to whom the data has been disclosed.[106]

{41}Although the purposes behind the proposed FTC regulations under the COPPA and H.R. 369 seem similar, the scope of H.R. 369 is not so limited as the proposed FTC regulations. First, H.R. 369 does not limit its application to websites "targeted to children."[107] Furthermore, the class of protected children is larger. "Child" as defined in H.R. 369 describes any person under age sixteen;[108] whereas, in the FTC regulations, a "child" is any person under the age of thirteen.[109]

{42}In addition, H.R. 369 does not stand on firm ground constitutionally. Unlike the FTC regulations, which merely forbid the collection of data on children without providing parents with the proper notice, H.R. 369 actually proscribes certain communications. Like both the COPA and the CDA before it, H.R. 369, if enacted, would likely be unable to withstand constitutional challenge. Although H.R. 369 does not impose the burdensome compliance mechanisms of the COPA and the CDA, which were ultimately found to chill free speech, it creates a similarly over-inclusive class of covered parties and entirely prohibits the dissemination of some classes of protected speech.

{43}While First Amendment protection of the dissemination and sale of children's personal information might be unpalatable, one must remember that much of the information collected and disseminated about children is not intended to harm them. For example, a marketing company might benefit by selling personal information about members of the Pokemon Online Fan Club to a toy company intending to solicit commercial products to the child and her family. Under H.R. 369, this activity would be prohibited without the written consent of the parents of each child about whom information is collected. Arguably, a court might find that allowing the dissemination of data only with the written consent of every parent, like the requirements of the COPA and CDA,[110] is not a narrowly-tailored restriction on free speech.[111]

{44}A better-reasoned approach to the problems found in the collection and dissemination of children's data appears in the FTC regulations. Essentially, these regulations require more extensive parental consent for the collection of a child's personal information; the dissemination of that information, however, as speech, remains unfettered by the restrictions and categorical prohibitions that may violate the First Amendment.

2. Questionable Attempts to Regulate the Internet in the 106th Congress

{45}Provisions similar to the Children's Internet Protection Act are contained in S. 97, H.R. 543, H.R. 2560, S. 1545, and H.R. 896, each of which would, with some variation, require schools and libraries to use Internet filtering software on their computers in order to continue receiving universal service assistance.[112] The filtering mechanisms must block material "deemed to be harmful to minors" on any computer used by the public, whether by minors or adults.[113] The rule purports to be flexible: in a library with more than one Internet computer, only one computer need employ the filtering software, while a library with only one Internet computer satisfies the rule by certifying that it employs a reasonable alternative to the filtering mechanism.

{46}Naturally, S. 97 and its counterparts have some serious opponents. Many civil liberties groups have decried government intervention into an area of regulation they believe is properly left to parents and their children.[114] Groups such as the Center for Democracy and Technology argue that the federal government should not impose such an inflexible solution upon communities.[115] These communities often set their own standards for Internet use in their libraries and may wish to test various approaches to the goal of protecting children online.[116]

{47}Furthermore, requiring libraries to use a particular technology to protect children on the Internet allows Congress to influence the market.[117] Currently, the market has "[over] 125 technological hardware and software tools available that are designed to promote child safety and to empower families to choose what material is appropriate for their children."[118] Rather than allowing the market to determine which method of filtering and blocking will best meet the needs of families, Congress singles out one particular method and makes it mandatory for all libraries wishing to receive universal service assistance.[119] For similar reasons, a group of Loudoun County, Virginia citizen plaintiffs and Washington lawyer Robert Corn-Revere fought Loudoun County's decision to require filtering software in its libraries.[120]

3. Court Interpretations of Similar State Actions

{48}In Mainstream Loudoun v. Board of Trustees,[121] the United States District Court for the Eastern District of Virginia, sitting in Alexandria, considered whether a public library could enact a policy prohibiting library patrons' access to certain content-based categories of Internet publications through the use of filtering software.[122] On October 20, 1997, the Board of Trustees of the Loudoun County Library passed the "Policy on Internet Sexual Harassment," which provided, in part, that all library computers would be outfitted with software to block sites displaying child pornography, obscene material, and material viewed as "harmful to juveniles."[123] Plaintiffs included community members who felt that their free speech rights had been infringed, either because their access to many websites had been blocked by the Internet policy, or because websites they created or maintained were blocked from library computers.[124]

{49}The court found that the library policy violated the guarantees of the First Amendment in that it "asserted a broad right to censor the expressive activity of the receipt and communication of information through the Internet."[125] The court found that the policy did not further any compelling government interest, was not narrowly tailored, and restricted the access of adults and children to the same degree - even though the restricted material is protected speech as to adults.[126] This case clearly demonstrates the constitutional infirmity of legislative attempts to require such policies at libraries and schools in exchange for the receipt of universal service assistance. The Board of Trustees of the Loudoun County Library instituted a policy that did not meet the rigorous requirements of the First Amendment; federal legislative attempts to compel a similar policy will likely also fail a challenge on First Amendment grounds.

{50}Ultimately, S. 97 and its state and federal counterparts face a bleak outlook. Aside from the obvious interference with parental control over what children are able to access, the knowledge that filtering software is used on these public computers might instill a false sense of security among parents.[127] Parents who are unfamiliar with the Internet or computers may believe that their children are unable to view objectionable materials when using school and library computers.[128] This is simply not the case. While blocking and filtering software can prevent younger, less sophisticated users from accessing sexually-explicit material, an older child, or one with more advanced computer skills, will probably be able to disable the software.[129] Furthermore, filtering software is not "fool-proof" - it may not block every website that parents would wish it to block.[130]

{51}Of the legislation introduced in the 106th Congress so far, none stands a very good chance of surviving a constitutional challenge if enacted. The proposed FTC regulations under the COPPA, however, appear sufficiently narrowly tailored to meet the strict First Amendment standards applied by the courts. These proposed rules are much narrower in scope than either the COPA or the CDA, and will likely survive any challenge for that very reason. At the same time, their overall effectiveness in protecting children online is questionable. In light of the judicial decisions in both Reno I and Reno II, the courts have suggested that what children are able to view online can be controlled most effectively by parents, using existing and developing technologies on home computers.[131] Meanwhile, in an area where legislation is more appropriate -- protecting the privacy of children by regulating the collection of data from them -- the FTC has proposed rules that will be effective without overstepping constitutional bounds.

D. Other Alternatives to Protect Children

{52}Clearly, something must be done about the rampant availability of material inappropriate for children on the Internet. The FTC's proposed regulations, while potentially effective, are not all-inclusive. Sexually-explicit materials are widely available on the Internet, whether accessed inadvertently or purposefully, and are easily accessible to any person with rudimentary computer skills, regardless of age.[132] Making it more difficult for those seeking to harm children to obtain information about them is only a small step toward affording ample protection to this vulnerable population. At the same time, the fact that adults have a constitutional right to view these same materials poses a problem. The government can go only so far in creating legislation to protect children using the Internet without overstepping constitutional bounds; the remaining guidance and supervision must come from parents. Parents cannot rely upon the government to determine what material is acceptable or unacceptable for their children to access over the Internet. Instead, parents need to take the time to become involved in their children's daily activities, and if those activities include spending time on the Internet, parents need to supervise that time and set guidelines on what the household considers appropriate use of the service.


V. Conclusion

{53}First Amendment rights to free speech are continually adapting to our ever-changing world. From print media to radio and television broadcasts, obscenity law has adjusted to new forms of communication. Its adaptation to the Internet, however, has not been an easy one. Early case law demonstrated that traditional obscenity principles apply, at least in limited form, to the Internet. Questions remained, though, as to just how much could be regulated, and to what extent traditional obscenity law would be able to control all aspects of Internet communications. The legislature's first attempt to regulate "patently offensive" and "indecent" communications online was the Communications Decency Act of 1996.[133] This Act was immediately challenged for its vague language and its unconstitutional burdening of protected forms of speech.

{54}After the government was enjoined from enforcing the Act, Congress made another attempt at regulation purportedly under the guidance of the Reno I court, and introduced the Child Online Protection Act. Again, the statute was challenged and met the same criticisms as the CDA. The court found that the statute posed a great potential for irreparable harm upon those whose rights to free speech it would abridge, and as such, granted a preliminary injunction against its enforcement.

{55}At present, the Children's Online Privacy Protection Act serves to impose some government regulation upon the Internet, but the true force behind this statute lies in the FTC's proposed rules, which have not yet been adopted. The 106th Congress has introduced a number of bills pertaining to the Internet, none of which, it appears, will pass constitutional muster in light of Reno I and II's prohibitions on overbroad, vague language limiting protected speech. In each case, the courts recognized that regulating what children are and are not able to access is best served by non-legislative means, namely, parental guidance, supervision, and other technological methods that can be instituted at home.

{56}While the court's "hands-off" approach to the regulation of the Internet seems to leave children vulnerable to the deceptive practices of commercial online pornographers, sites featuring deviant ideas and information, and graphic depictions of all variety of grotesque material, the court has concluded that the Internet is a means of communication not amenable to totalitarian regulation. In order to preserve the rights of all, government regulation imposing impermissible burdens on constitutionally-protected speech cannot be maintained. Instead, parents must take an active role in their children's Internet activity, and establish for their children the types of materials which are acceptable and unacceptable in their own homes.

Abbigale E. Bricker

[*] Abbigale E. Bricker received a B.A. in English from James Madison University in 1998. She will be receiving a J.D. from the University of Richmond School of Law in December, 2000. Ms. Bricker is a Senior Staff member of the Richmond Journal of Law & Technology. She would like to thank Professor Rodney Smolla for his critical review of her work.


[**]. NOTE: All endnote citations in this article follow the conventions appropriate to the edition of THE BLUEBOOK: A UNIFORM SYSTEM OF CITATION that was in effect at the time of publication. When citing to this article, please use the format required by the Seventeenth Edition of THE BLUEBOOK, provided below for your convenience.

Abbigale E. Bricker, Note, You Can't Always Get What You Want: Government's Good Intentions v. The First Amendment's Prescribed Freedoms in Protecting Children From Sexually-Explicit Material on the Internet, 6 RICH. J.L. & TECH. 17 (Winter 1999-2000), at http://www.richmond.edu/jolt/v6i3/note5.html.


[1]. See Appellant's Brief at 28, ACLU v. Reno, 521 U.S. 844 (1997) (No. 96-511), available in 1997 WL 32931; see also Kelly M. Doherty, www.obscenity.com: An Analysis of Obscenity and Indecency Regulation on the Internet, 32 Akron L. Rev. 259, 262 (1999); Brian M. Werst, A Survey of First Amendment "Indecency" Legal Doctrine and its Inapplicability to Internet Regulation: A Guide for Protecting Children from Internet Indecency after Reno v. ACLU, 33 Gonz. L. Rev. 207, 208 (1998) (citing reference statistics documenting the phenomenal growth of the Internet).

[2]. See Jeff Chatfield, The Children's Technology Group [L]icenses the Crayon Crawler [a]nd Other Proprietary Web Modules From 1st Net Technologies, Bus.Wire, June 9, 1999, at 2 <http://www.businesswire.com/>.

[3]. See id. (citing a recent study by Jupiter Communications, a new media research company).

[4]. See Werst, supra note 1, at 208.

[5]. See Doherty, supra note 1, at 264.

[6]. See Rik Espinosa, The [G]ood, [B]ad and [U]gly [O]nline, Tulsa World, June 6, 1999, at 15, available in 1999 WL 5403047 (warning readers that you have to take precautions in chat rooms because it is likely that the identities presented are not real and that the online service providers cannot possibly assume responsibility for filtering the voluminous activity in the chat rooms); see also Safety Channel: Parent's Resources, The Parent's Den: Parent's Guidelines, (visited Oct. 19, 1999) <http://www.legalpadjr.com/partner/parents/parguide.htm> (describing in the Parents Guidelines section the precaution concerning online safety, and alerting parents to the possibility that children might be exposed to objectionable materials accidentally, or by intentional actions encouraged by inappropriate direct communication from strangers).

[7]. See Picus Software, EagleEyes 99, (last modified Mar. 10, 1999) <http://www.picus.net/PROD01.HTM> (containing an advertisement for a software system that parents can install on home computers called "EagleEyes," thereby allowing parents the flexibility to choose which types of adult-only websites will be blocked); see also Safe Kids Home Page (1), (visited Oct. 21, 1999) <http://www.safekids.com/> (including guidelines for parents, kids' rules for online safety page, and a family contract for online safety); Welcome to GoodParents.com!, (visited Oct. 21, 1999) <http://www.goodparents.com/> (featuring selections from a book entirely devoted to what parents can do to keep their children safe online).

[8]. See generally John Schwartz, It's a Dirty Job; Web Childproofers Keep Surfing Through Muck; Researchers Catalogue Internet's Dark Side, Wash. Post, June 23, 1999, at A1, available in 1999 WL 17010301 (depicting the increasing popularity of online screening programs among parents of today's young Internet browsers).

[9]. 31 F. Supp. 2d 473 (E.D. Pa. 1999) [hereinafter Reno III].

[10]. 47 U.S.C. § 231 (Supp. 1999).

[11]. 18 U.S.C. § 1464 (1994); see generally Laura K. McKay, Note, The Communications Decency Act: Protecting Children from On-Line Indecency, 20 Seton Hall Legis. J. 463, 475-76 (1996) (detailing the concern of exposing children to pornography over the Internet).

[12]. See generally Rodney A. Smolla, Free Speech in an Open Society 324 passim (1992) (discussing the limits of the First Amendment's protective reach).

[13]. See id.

[14]. See Chaplinsky v. New Hampshire, 315 U.S. 568, 571 (1942) (describing that by allowing the broadest scope to the text of the First Amendment, it becomes apparent that the rights associated with free speech are not absolute in every circumstance); see also Branzburg v. Hayes, 408 U.S. 665 (1972); Feiner v. New York, 340 U.S. 315 (1951).

[15]. See Roth v. United States, 354 U.S. 476 (1957).

[16]. 413 U.S. 15 (1973).

[17]. See id. at 21, citing Memoirs v. Massachusetts, 383 U.S. 413, 418 (1966).

[18]. See FCC v. Pacifica Found., 438 U.S. 726, 740 (1978).

[19]. See Smolla, supra note 12, at 326.

[20]. See Pacifica, 438 U.S. at 735.

[21]. See id. at 749-50.

[22]. See Sable Comm. v. FCC, 492 U.S. 115, 131 (1989); see also Fabulous Ass'n v. Pa. Public Utility Comm'n, 896 F.2d 780, 788-89 (3d Cir. 1990) (deeming unconstitutional statutory requirements for telephone access codes to hear sexually suggestive telephone messages).

[23]. See Sable, 492 U.S. at 127.

[24]. See id. at 127-28.

[25]. Ginsberg v. New York, 390 U.S. 629, 640 (1968).

[26]. See id. at 638, citing Prince v. Massachusetts, 321 U.S. 158, 170 (1944).

[27]. Doherty, supra note 1, at 291.

[28]. See id.

[29]. 74 F.3d 701 (6th Cir. 1996).

[30]. See id. at 705. The Thomases were operating an online business from their home, which was based on a computer bulletin board system. The bulletin board featured chat lines, messages, e-mail, descriptions of pictures available for download, and advertisements for sexually-explicit videotapes.

[31]. Prosecutors in this case utilized 18 U.S.C. § 1465 (1994).

[32]. 74 F.3d at 708.

[33]. Id. at 706.

[34]. See generally McKay, supra note 11, at 477 (describing congressional intent to fill in the holes left open by the decision).

[35]. See Petrie at 650-51.

[36]. Pub. L. No. 104-104 (110 Stat.133) (1996) (amending scattered sections of 18 U.S.C. & 47 U.S.C.).

[37]. 47 U.S.C. § 223(a)(1)(B)(ii) (Supp. 1997).

[38]. Id. § 223(d)(1)(B).

[39]. Id. § 223(e)(1)-(6).

[40]. Id. § 223(e)(5)(A).

[41]. 929 F. Supp. 824 (E.D. Pa. 1996) [hereinafter Reno I].

[42]. See ACLU v. Reno, 929 F.Supp. 824, 824-27 (E.D.Pa. 1996), aff'd, 521 U.S. 844 (1997).

[43]. See id. at 827.

[44]. Reno v. ACLU, 521 U.S. 844, 871-72 (1997) [hereinafter Reno II].

[45]. Reno II, 521 U.S. at 855.

[46]. Id. at 851.

[47]. See id. at 855-56.

[48]. See Reno II, 521 U.S. at 880.

[49]. See Reno II, 521 U.S. at 855.

[50]. 47 U.S.C. § 231 (Supp. 1999).

[51]. See ACLU v. Reno, 31 F. Supp. 2d 473, 476 (E.D. Pa. 1999).

[52]. 47 U.S.C. § 231(a)(1) (Supp. 1999).

[53]. Id. § 231(c)(1).

[54]. Id.

[55]. Id. § 231(b)(4).

[56]. See Reno III, 31 F. Supp. 2d at 476-77.

[57]. See id. at 476.

[58]. See id. at 477.

[59]. See id.

[60]. See id. at 478-79.

[61]. See id. at 479.

[62]. See id.

[63]. See id.

[64]. See id.

[65]. See id.

[66]. See id.

[67]. See id. at 493.

[68]. See id. at 498.

[69]. See id. at 495.

[70]. See id. at 495.

[71]. See id. at 485-86.

[72]. See id. at 495.

[73]. See id. at 487.

[74]. See id. at 480.

[75]. Id.

[76]. See id. at 497.

[77]. See id.

[78]. See id. at 498.

[79]. Id. at 497.

[80]. See id. The prohibited forms of material that could be "harmful to minors" included "any communication, picture, image, graphic image file, article, recording, writing, or other matter of any kind." 47 U.S.C. § 231(e) (Supp. 1999).

[81]. See Reno III, 31 F. Supp. 2d at 497.

[82]. See id. at 498.

[83]. See id. at 497.

[84]. See id.

[85]. 47 U.S.C. § 223(d)(1)(B) (Supp. 1997).

[86]. 47 U.S.C. § 231(a)(1) (Supp. 1999).

[87]. See id. at 480.

[88]. See id. at 498; see also ACLU v. Reno, Round II: Rejecting Cyber-Censorship Court Defends Online "Marketplace" of Ideas.

[89]. See Reno III, 31 F. Supp. 2d at 492 (discussing blocking and filtering software); id. at 497 (concluding that the record "reveals that blocking or filtering technology may be at least as successful as the COPA . . . without imposing the burden on constitutionally protected speech that the COPA imposes").

[90]. See Doherty, supra note 1, at 295, 298.

[91]. See id. at 285, infra Part IV.C.

[92]. Children's Online Privacy Protection Act, Pub. L. No. 105-277 §1303(b), 112 Stat. 2681 (1998).

[93]. Id.; see also Children's Online Privacy Protection Rule, 64 Fed. Reg. 22,750, 22,753 (1999) (to be codified at 16 C.F.R. pt. 312).

[94]. Children's Online Privacy Protection Rule, 64 Fed. Reg. 22,750, 22,753 (1999) (to be codified at 16 C.F.R. pt. 312).

[95]. See id.

[96]. See id.

[97]. Id.

[98]. See generally Virginia State Bd. of Pharm. v. Virginia Citizens' Consumer Counsel, Inc., 425 U.S. 748, 756-57 (1976) (holding that a restriction on either the speaker or the audience is a restriction on speech).

[99]. See Children's Online Privacy Protection Rule, 64 Fed. Reg. 22,750, 22,753 (1999) (to be codified at 16 C.F.R. pt. 312). The FTC's proposed regulations only govern websites "directed to children," and impose no prohibited classes of communications, unlike both the COPA and CDA. Id.

[100]. Id.

[101]. Id. at 22,752.

[102]. Id.

[103]. H.R. 369, 106th Cong. (1999).

[104]. See id.

[105]. Id. § 2(a).

[106]. See id. § 2(b).

[107]. See Children's Online Privacy Protection Rule, 64 Fed. Reg. 22,750, 22,752 (1999) (to be codified at 16 C.F.R. pt. 312).

[108]. H.R. 369, 106th Cong. § 2(b) (1999).

[109]. See 64 Fed. Reg. 22,750, 22,750 (1999) (to be codified at 16 C.F.R. pt. 312.3).

[110]. See 47 U.S.C. §223(e)(5) (repealed 1997); see also 47 U.S.C. §231 (c)(1) (Supp. 1999).

[111]. See U.S. West, Inc. v. FCC, 182 F.3d 1224, 1238 (10th Cir. 1999) (detailing that when the government regulates commercial speech, the First Amendment does not require it to employ the least restrictive means to accomplish its purpose, but does require a means "narrowly tailored" to meet its objective). In the example provided above (use of data for marketing purposes), the type of speech being restricted is commercial speech; see also Pittsburgh Press Co. v. Human Resources Comm'n, 413 U.S. 376, 385 (1973).

[112]. S. 97, 106th Cong. (1999); H.R. 543, 106th Cong. (1999); H.R. 2560, 106th Cong. (1999); S. 1545, 106th Cong. (1999); H.R. 896, 106th Cong. (1999).

[113]. See S. 97, 106th Cong. § 2(A)(2)(A).

[114]. Schwartz, supra note 8, at A1.

[115]. See Jerry Berman, Summary of Statement for the Record, Center for Democracy and Technology, United States Senate Committee on Commerce, Science & Transportation: Hearing on S. 97, Mandatory Filtering of Internet Access in Schools and Libraries, (last visited Nov. 18, 1999) <http://www.cdt.org/speech/statement/summary.shtml>.

[116]. See id.

[117]. Center for Democracy and Technology, CDT's Letter to the Honorable John McCain, (last visited Sept. 21, 1999) <http://www.cdt.org/speech/letter/cdtletter2McCain.shtml>.

[118]. See id.

[119]. See id.

[120]. Schwartz, supra note 8, at A1.

[121]. 24 F. Supp. 2d 552 (E.D. Va. 1998).

[122]. See id.

[123]. See id. at 556.

[124]. See id.

[125]. See id. at 570.

[126]. See id.

[127]. See Schwartz, supra note 8, at A1.

[128]. See id.

[129]. See id.

[130]. See id.

[131]. See Reno III, 31 F. Supp. 2d at 491-92; Reno II, 521 U.S. at 855.

[132]. See Espinosa, supra note 6, at 15.

[133]. 47 U.S.C. §223(d)(1)(B) (Supp. 1997).

Copyright 1999 Richmond Journal of Law & Technology