By Chris Jones*
I. Introduction
As the world moves increasingly online, consumers are forced to enter personal information into websites to apply for jobs, attend school, or purchase tickets to an event. Consumers’ personal information is often sold and shared as a commodity among tech businesses, advertising agencies, and data brokers. According to the Federal Trade Commission (“FTC”), “most consumers . . . know little about the data brokers who collect and trade consumer data or build consumer profiles that can expose intimate details about their lives and . . . expose unsuspecting people to future harm.”[1] As a result, the risk of individual privacy harms continues to increase. Privacy injuries may include reputational, discriminatory, physical, emotional, economic, and relationship harms.[2]
Absent a comprehensive federal privacy law, the majority of U.S. businesses operate under the assumption that the fine print of a legally complex privacy policy is enough to show they are acting in good faith. Unfortunately, standard privacy policies do nothing to advise consumers of the harms they may experience when utilizing a website, application, or device. Without a basic understanding of why they should care about their personal information being sold and shared, consumers lack the knowledge necessary to make an informed decision.
This article argues that the potential consumer harms resulting from the use of a product or service should be clearly spelled out and disclosed. The FTC should promulgate a rule requiring the disclosure of those harms in a standardized, easy-to-understand privacy policy that is consistent throughout the industry. Educating the general public up front allows informed consumers to determine the true level of risk they are willing to take, instead of blindly following flashy advertising and exciting trends.
II. Background
A. Overview of Privacy Policies
Privacy policies are typically lengthy notices filled with technical terms and legal language[3] that explain what an entity does with a consumer’s personal information, how the information is shared with third parties, and whether the consumer has options regarding this sharing.[4] Many privacy policies are difficult to understand and contain language designed to mislead consumers into believing the business protects their information.[5] Moreover, privacy policies typically provide no warning to consumers of potential harms they may encounter when utilizing the product or service.[6]
Many businesses rely on the fine print in a legally complex privacy policy to address consumer privacy issues.[7] According to Jen King, the director of consumer privacy at the Center for Internet and Society, privacy policies are “documents created by lawyers, for lawyers. They were never created as a consumer tool.”[8]
In 2012, the average length of an online privacy policy was 2,415 words.[9] It would take an average internet user seventy-six working days—consisting of eight hours per day—to read the privacy policies of every website they encountered within a year.[10] Over the past decade, Americans’ use of the internet has exploded; as a result, businesses have greatly expanded their privacy policies.[11] For example, Facebook’s privacy policy takes a reported eighteen minutes to read.[12] Thus, it is not reasonable to expect that the average consumer has either the time or the sophistication to read and understand every lengthy and substantially different privacy policy they may encounter.
B. Legal Foundation
The FTC is charged with preventing unfair or deceptive acts or practices that affect commerce in the privacy arena.[13] Pursuant to Section 18 of the FTC Act and the Commission’s rules of practice, the FTC has the authority to “promulgate, modify, and repeal trade regulation rules that define with specificity acts or practices that are unfair or deceptive in or affecting commerce within the meaning of Section 5(a)(1).”[14] The FTC Act identifies unfair or deceptive acts as those that cause or are likely to cause substantial injury to consumers.[15]
The FTC currently sanctions businesses for unfair or deceptive practices, including the failure to adhere to their own privacy policies.[16] But because the FTC does not provide a standardized template for privacy policies, businesses are left to draft their own documents without clear guidelines.
III. Potential Harms from Data Sharing
As technology has revolutionized American lives, individuals’ personal information is entered into online platforms on a daily basis to schedule medical appointments, apply for college, or communicate with most businesses. Moreover, the average smartphone user in the U.S. utilizes approximately forty-six apps per month.[17] As a result, the risk of individual privacy harms continues to increase.[18] Privacy harms encompass a wide range of scenarios, from discrimination to emotional impairment to economic loss.[19]
Privacy injuries associated with the unauthorized use of an individual’s data may include reputational,[20] discriminatory,[21] physical,[22] autonomy,[23] economic,[24] emotional,[25] and relationship harms.[26] For example, the disclosure of personal health data may affect a consumer’s ability to obtain employment, financial products, insurance, housing, or admission to a nursing home; it may cause social stigmatization based on race, sexual preferences, disease, addictions, mental health conditions, religion, or political positions; and it may subject the consumer to potentially dangerous situations due to blackmail, bullying, stalking, ransomware, or the revelation of secret locations for domestic abuse victims.[27] Disclosures of mental health conditions, along with certain diagnoses such as sexually transmitted diseases or alcohol or drug use, carry additional social stigmatization.[28]
Courts have moved beyond rigid injury requirements to include more intimate personal autonomy harms, such as “(1) coercion – the impairment on people’s freedom to act or choose; (2) manipulation – the undue influence over people’s behavior or decision-making; (3) failure to inform – the failure to provide people with sufficient information to make decisions; (4) thwarted expectations – doing activities that undermine people’s choices; (5) lack of control – the inability to make meaningful choices about one’s data or prevent the potential future misuse of it; [and] (6) chilling effects – inhibiting people from engaging in lawful activities.”[29]
Further stigmatization can occur when online platforms make their way into the real lives of consumers. For example, Facebook has developed a relationship with law enforcement, searching for individuals whose online activities may suggest suicidal tendencies.[30] Facebook scans users’ input—including private messages—for content that may relate to “safety and health.”[31] Facebook then reports to law enforcement the individuals it considers potentially suicidal.[32] Thus, by sending allegedly private messages on Facebook, a user runs the risk of the police showing up at their door in real life.[33]
This can be particularly troubling for users, as police documentation of a visit to a potentially suicidal person—including officers’ body cam footage of people, cars, and homes—becomes public record.[34] The records may be shared with any interested party—including data brokers.[35] This public documentation of a consumer’s perceived mental instability can have devastating consequences that affect the rest of their life.[36] Potential harms may include discrimination in employment or housing, public doxing, reputational damage, relationship issues, or mental health stigma. Imagine having to disclose to potential employers that you were deemed a suicide risk by local law enforcement.
Still, the majority of consumers are not adequately informed of potential harms and have little to no knowledge of the lifelong consequences that may result from utilizing these products and services.[37]
IV. Privacy Policies Should Disclose Potential Harms
At the time of publication, there is still no comprehensive U.S. privacy law at the federal level, let alone a statute requiring privacy policies to disclose potential consumer harms. While it is standard practice in the U.S. for some industries to warn consumers of potential harms, the technology sector lags far behind. For example, in California, amusement parks, automobile manufacturers, and even holiday-light manufacturers are required to disclose “significant exposures to chemicals that cause cancer, birth defects or other reproductive harm.”[38]
Countries worldwide and several U.S. states have begun to pass privacy laws to minimize commercial surveillance and promote data security.[39] “Persistent and targeted surveillance collapses individual moments of interaction, spread out over time and mitigated through human forgetfulness, into one long story of an individual’s life.”[40] This type of surveillance can lead to inferences about highly sensitive areas of a person’s life, such as religion, sexual activities, and health.[41] Therefore, U.S. consumers need to be warned of surveillance profiling and the harms that can follow before they agree to utilize a product or service.
“Studies have shown that most people do not generally understand the market for consumer data that operates beyond their monitors and displays.”[42] A Pew Research Center study found that “78% of US adults say they understand very little or nothing about what the government does with the data it collects, and 59% say the same about the data companies collect.”[43] Thus, if the majority of consumers admit they do not understand what is done with their personal data, a privacy policy filled with legal terms and jargon does nothing to serve as a warning.
Critics may argue that privacy harms often do not occur until some future time, if at all, and that it is therefore unnecessary to warn consumers about the potential risk of future harms. But the current technological ecosystem is so complex, and involves so many entities, that it is difficult—if not impossible—for the average consumer to pinpoint where their personal information was disclosed when they experience higher insurance rates, employment discrimination, or targeted advertising based on their most intimate secrets. Thus, it is critical to notify consumers of potential harms at the initial point of data collection.
V. Solution
Absent a comprehensive federal privacy law, this article proposes that the FTC promulgate a rule requiring the disclosure of potential consumer harms in a standardized, easy-to-understand privacy policy that is consistent throughout the industry.
The Gramm-Leach-Bliley Act (“GLBA”) provides financial institutions with a standardized template listing specific categories of information that must be disclosed.[44] This template is similar to the nutrition-label approach to privacy instituted by Apple and Google in their app stores.[45] For example, Apple created the labels “to help users learn at a glance what data will be collected by an app, whether that data is linked to them or used to track them, and the purposes for which that data may be used.”[46] The nutrition labels are a great tool to identify the information being collected; however, the labels fail to warn consumers of potential harms resulting from the collection.
With a template similar to the GLBA model, consumers could easily determine whether or not the business sells or shares their personal information with third parties, who those third parties are, what potential harms can result from that sharing, and what choices the consumer has—if any—to opt out. Thus, consumers can quickly identify the key components needed to decide whether they want to do business with the entity. While the U.S. has a long way to go in protecting users’ privacy, this disclosure of harms is one step that can educate and empower consumers to make informed decisions.
VI. Conclusion
Action should be taken at the federal level to clearly notify consumers of potential privacy harms resulting from the sharing of their personal information. By providing sweeping protection in privacy policies for all U.S. residents, the FTC can help to balance the benefits of technology with the education necessary for consumers to take back control of their own private lives.
*J.D., Gonzaga University School of Law. Acknowledgments and gratitude to Professor Drew Simshaw for his invaluable insights and continuing support.
[1] Trade Regulation Rule on Commercial Surveillance and Data Security, Fed. Trade Comm’n, Dec. 8, 2021, at 1, 5, https://www.ftc.gov/system/files/ftc_gov/pdf/commercial_surveillance_and_data_security_anpr.pdf [hereinafter FTC Trade Regulation Rule].
[2] See Danielle Keats Citron & Daniel J. Solove, Privacy Harms, Geo. Wash. L. Fac. Publications & Other Works 1, 19, 21-23, 25, 28, https://scholarship.law.gwu.edu/faculty_publications/1534 (last visited Feb. 7, 2023).
[3] See Lori Andrews, A New Privacy Paradigm in the Age of Apps, 53 Wake Forest L. Rev. 421, 435 (2018).
[4] See Andrews, supra note 3, at 434-36.
[5] See Deceived by Design, Forbrukerradet 1, 22 (June 27, 2018), https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf (discussing how Big Tech utilizes positive and negative wording to “nudge users toward making certain choices”); See also Kevin Litman-Navarro, We Read 150 Privacy Policies. They Were an Incomprehensible Disaster, N.Y. Times, June 12, 2019, https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html.
[6] See Forbrukerradet, supra note 5.
[7] See Andrews, supra note 3, at 435.
[8] See Litman-Navarro, supra note 5.
[9] Alexis C. Madrigal, Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days, Atl. Monthly Grp., LLC (Mar. 1, 2012), https://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/.
[10] Id.
[11] See Litman-Navarro, supra note 5.
[12] See Litman-Navarro, supra note 5.
[13] See 15 U.S.C. §§ 41-58, as amended [hereinafter 15 U.S.C.]; See also Michael Goodyear, The Dark Side of Videoconferencing: The Privacy Tribulations of Zoom and the Fragmented State of U.S. Data Privacy Law, 10 Hous. L. Rev. 76, 79 (2020).
[14] See FTC Trade Regulation Rule, supra note 1, at 12.
[15] See 15 U.S.C., supra note 13; See also Scott Stiefel, The Chatbot Will See You Now: Protecting Mental Health Confidentiality in Software Applications, 20 Colum. Sci. & Tech. L. Rev. 333, 386 (2019).
[16] See 15 U.S.C., supra note 13; See also Nicole Angelica, Alexa’s Artificial Intelligence Paves the Way for Big Tech’s Entrance into the Health Care Industry – The Benefits to Efficiency and Support of the Patient-Centric System Outweigh the Impact on Privacy, 21 N.C. J. L. & Tech. 59, 77-78 (2020).
[17] See Stephanie Chan, U.S. Consumers Used an Average of 46 Apps Each Month in the First Half of 2021, Sensor Tower, Inc., Aug. 2021, https://sensortower.com/blog/apps-used-per-us-smartphone.
[18] See FTC Trade Regulation Rule, supra note 1, at 7 (stating how the FTC noted that “companies’ collection and use of data have significant consequences for consumers’ wallets, safety, and mental health”).
[19] See Lothar Determann, Healthy Data Protection, 26 Mich. Tech. L. Rev. 229, 256 (2020).
[20] See Citron & Solove, supra note 2, at 22 (describing how “reputational harms impair a person’s ability to maintain ‘personal esteem in the eyes of others and can taint a person’s image.’” Reputational harms can result in social rejection, lost employment or business).
[21] See Citron & Solove, supra note 2, at 28 (Discrimination harms particularly highlight the inequality and disadvantages for people from marginalized communities. Potential discrimination may occur in the form of employment, housing, insurance ratings, or online harassment).
[22] See Citron & Solove, supra note 2, at 19 (describing physical harms as setbacks to physical health or physical violence when personal data is improperly shared).
[23] See Citron & Solove, supra note 2, at 40 (describing autonomy harms as the “restriction, coercion, or manipulation of people’s choices”).
[24] See Citron & Solove, supra note 2, at 21 (Economic harms include financial loss or identity theft).
[25] See Citron & Solove, supra note 2, at 23 (Emotional harms include emotional distress, categorized by anger, frustration, various degrees of anxiety, and annoyance).
[26] See Citron & Solove, supra note 2, at 25 (describing how relationship harms can encompass personal, professional, and organizational relations).
[27] See Determann, supra note 19, at 256; See Andrews, supra note 3, at 465–66.
[28] See Determann, supra note 19, at 256-57 (When faced with a mental health diagnosis, patients may experience “embarrassment, shame, and even social exclusion should information of this nature become public.” This stigmatization often affects an individual’s quality of life and can cause additional health conditions or a variety of psychosomatic symptoms).
[29] See Citron & Solove, supra note 2, at 47.
[30] See Benjamin Goggin, Inside Facebook’s Suicide Algorithm, Insider, Inc., Jan. 6, 2019, https://www.businessinsider.com/facebook-is-using-ai-to-try-to-predict-if-youre-suicidal-2018-12.
[31] See id.
[32] See id.
[33] See id.
[34] See Jacqueline White, ISP body-cam footage shows Idaho suspect pulled over in Indiana, minutes after being stopped by Deputy, Scripps Media, Inc., https://www.wrtv.com/news/working-for-you/isp-body-cam-footage-shows-idaho-suspect-pulled-over-in-indiana-minutes-after-being-stopped-by-deputy (last updated Jan. 3, 2023, 4:18 PM).
[35] See generally Electronic Frontier Foundation, FOIA How To, https://www.eff.org/issues/transparency/foia-how-to (last visited Feb. 6, 2023).
[36] See Goggin, supra note 30.
[37] See Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar & Erica Turner, Americans and Privacy: Concerned, Confused, and Feeling Lack of Control Over Their Personal Information, Pew Res. Ctr. 1, 10 (Nov. 15, 2019), https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/ [hereinafter Pew Research Center] (describing how the majority of Americans do not understand what the government or private businesses do with their personal information); See also Citron & Solove, supra note 2, at 18-40 (describing privacy harms a consumer may experience due to unauthorized use of their personal information).
[38] See Cal. Off. of Envtl. Health Hazard Assessment (OEHHA), About Proposition 65, https://oehha.ca.gov/proposition-65/about-proposition-65 (last visited Feb. 2, 2023) (“Proposition 65 requires businesses to provide warnings to Californians about significant exposures to chemicals that cause cancer, birth defects or other reproductive harm. These chemicals can be in the products that Californians purchase, in their homes or workplaces, or that are released into the environment. By requiring that this information be provided, Proposition 65 enables Californians to make informed decisions about their exposures to these chemicals”).
[39] See Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, Official J. Eur. Union L. 119, at 1 (2016) (The GDPR is a comprehensive privacy law designed to prohibit businesses from tracking and selling the personal information of consumers located in the EU, absent consent); See also Californians for Consumer Privacy, CPRA Executive Summary, https://www.caprivacy.org/cpra-exec-summary/ (last visited June 21, 2021) (describing California’s privacy law); See also Va. Code §§ 59.1-571 to -581 (2021) (describing Virginia’s privacy law); See also Colo. Gen. Assemb., Senate Bill 21-190, https://leg.colorado.gov/sites/default/files/documents/2021A/bills/2021a_190_enr.pdf (last updated June 23, 2021) (describing Colorado’s privacy law).
[40] Margot E. Kaminski, Privacy and the Right to Record, 97 B.U. L. Rev. 167, 215 (2017).
[41] Id.
[42] See FTC Trade Regulation Rule, supra note 1, at 5.
[43] See Pew Research Center, supra note 37, at 10.
[44] See FTC Trade Regulation Rule, supra note 1, at 15 (describing how the GLBA regulates the privacy of consumer information collected by financial institutions).
[45] See Cookie Pro, Google Play Data Safety vs. Apple Nutrition Label, https://www.cookiepro.com/knowledge/data-safety-nutrition-label/ (last updated July 6, 2022).
[46] Id.