By: Tabetha Soberdash

Image Source: http://www.m2sys.com/blog/biometric-technology/10-massive-biometric-technology-examples-that-revamped-the-world/
What once appeared primarily in science fiction movies to portray a far-off, technologically advanced world is now something many people use every day, as more and more companies begin to utilize biometric security technology.[1] Biometric security technology uses the individualizing characteristics of a person’s biometrics to identify or to authenticate that person.[2] Put simply, biometrics are an individual’s unique physical characteristics and can include things like fingerprints, irises, retinas, and facial features.[3]
Over time, companies have started to utilize biometric technology for a variety of tasks. For example, it is used to clear security at some airports, to enter places like Disney parks, and even to unlock apps on one’s cellphone.[4] In fact, corporate use of biometric technology has become so widespread that a study by Spiceworks found that sixty-two percent of companies use biometric security technology, with an additional twenty-four percent planning to adopt it within the next two years.[5]
While this type of technology comes with some major benefits, such as narrowing who has access to a system’s login capabilities or eliminating the possibility of forgetting one’s password, there is still a risk that this sensitive data could be compromised or breached by third parties.[6] If this happens, one cannot simply change one’s biometrics the way one can change a password.[7] Because biometric data remains consistent throughout an individual’s lifespan, this risk can have substantial effects that follow the individual.[8]
With such risks possible, it becomes crucial to look at what laws regulate biometric use and sharing and provide protection for one’s privacy. Currently, only a few states have comprehensive biometric privacy laws in place, and no such federal law exists.[9] However, 2019 has shown a major movement towards defining the laws that do exist and clarifying the standing required to litigate under them.
For example, Illinois is a state whose biometric privacy law has been litigated significantly this year. Because Illinois was the first state to comprehensively address biometric privacy when it enacted the Biometric Information Privacy Act (BIPA) in 2008, it has been very influential in laying the foundation for comprehensive laws regulating biometric collection.[10] Under BIPA, three main things must occur before a private entity can collect or store biometrics.[11] First, the private entity must inform individuals that their biometrics will be collected.[12] Second, the private entity must inform individuals of the purpose and length of the collection.[13] Third, the private entity must receive informed written consent from the individual to proceed with the collection.[14] Additionally, BIPA requires that a private entity obtain further consent, beyond that initial required consent, before sharing biometric data with third parties.[15]
This year, two major cases have affected the way BIPA is able to protect the biometric privacy of Illinois citizens. In the first case, Rosenbach v. Six Flags Entertainment Corp., the Illinois Supreme Court concluded that individuals will not need to “plead and prove that they sustained some actual injury or damage beyond infringement of the rights afforded them under [BIPA]” to have standing to sue.[16] This ruling makes it easier for plaintiffs to establish standing and will likely allow many more suits to arise in the near future.[17] Additionally, this conclusion was soon used by the Ninth Circuit in Patel v. Facebook, Inc. in a manner that could further expand the potential for future suits.[18]
In Patel v. Facebook, Inc., the Ninth Circuit faced the issue of whether the plaintiffs had sufficiently shown that Facebook’s biometric surveillance caused them a concrete injury, which would satisfy Article III standing and allow their case to be heard in federal court.[19] The plaintiffs alleged that Facebook’s “Tag Suggestions” feature violated BIPA because it collected and used their biometric information without their informed opt-in consent.[20] Relying on the interpretation of BIPA described in Rosenbach v. Six Flags Entertainment Corp., the Ninth Circuit determined that a mere violation of the BIPA provisions created an actual harm to the privacy interests that BIPA was created to protect.[21] As such, the plaintiffs alleged a concrete and particularized harm sufficient to meet Article III standing.[22] Furthermore, the Ninth Circuit determined that it was “reasonable to infer that the [Illinois] General Assembly contemplated BIPA’s application to individuals who are located in Illinois, even if some relevant activities occur outside the state.”[23]
It would appear that Patel v. Facebook, Inc. has the potential to open the door to many more class-action suits in the foreseeable future.[24] However, what actually occurs will depend on whether Facebook chooses to appeal the decision and whether that appeal is heard by the Supreme Court.[25] If Facebook does not follow through with an appeal, or its appeal is not heard by the Supreme Court, class-action suits will likely increase as the decision is used to find standing in future cases.[26] However, if Facebook does proceed with an appeal that is then heard, the decision in Patel v. Facebook, Inc. will have to be reconciled with an earlier decision by the Second Circuit that reached the opposite conclusion.[27] In Santana v. Take-Two Interactive Software, Inc., the Second Circuit rejected BIPA claims brought by players of NBA 2K video games, concluding that the players were not sufficiently injured by the video game’s scans of their faces to meet Article III standing.[28] How the Supreme Court chooses to address this circuit split will greatly impact the number of cases that can establish standing to litigate BIPA violation claims.[29]
Another state that has taken a particularly large step towards defining a comprehensive biometric privacy law is California. At the end of the year, California will have a biometric privacy law similar to the European Union’s General Data Protection Regulation (GDPR) when the California Consumer Privacy Act (CCPA) goes into effect on January 1, 2020.[30] The CCPA could potentially provide a broader scope of protection than BIPA,[31] as it gives consumers more control not only over their biometric data but also over many other types of personal information.[32] Specifically, the CCPA will provide California residents with the right to know what personal information large corporations are collecting about them, the ability to tell businesses not to share or sell their personal information, and protection against businesses that compromise their personal information.[33]
Although the CCPA’s scope of protection may be broader than BIPA’s, the CCPA will limit the private right of action to instances in which one’s personal information “is subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices.”[34] This will likely require a greater showing of harm than BIPA demands before a private suit can proceed.[35] Even so, the creation of the law itself will likely provide an avenue for increased litigation.
This year has also been impactful for the development of federal biometric privacy law. As mentioned earlier, there is currently no comprehensive federal biometric privacy law, but earlier this year a federal bill was introduced to regulate commercial applications of facial recognition technology.[36] The bill, titled the Commercial Facial Recognition Privacy Act of 2019, would prohibit certain entities from using facial recognition technology and data without first obtaining user consent.[37] However, the act is limited in that it expressly states it “shall not be construed as superseding, altering, or affecting any statute, regulation, order, or interpretation in effect in any State, except to the extent that . . . is inconsistent with the provisions of this Act, and then only to the extent of the inconsistency.”[38] As such, biometric protection will likely remain broader under state laws and enforcement.
In conclusion, 2019 has shown major movement towards defining biometric privacy laws and expanding the protection of one’s privacy. However, as the world continues to increase its use of biometric technology, litigation over the issue is likely to continue. Because jurisdictions use different definitions and laws to regulate biometric use and collection, the upcoming years will likely see an increase in litigation of biometric privacy issues as companies navigate the differing rules.[39] Further, as circuits split over what harm is required to have standing to sue, even determining how to comply with a given jurisdiction’s laws will likely result in more litigation and a need for policy formation in the upcoming years. As such, companies will need to continuously watch how jurisdictions decide to protect an individual’s privacy, and individuals will need to watch what policies companies have in place to protect their biometric information and what their state’s laws require.
[1] See SHRM, More Employers Are Using Biometric Authentication (2018), https://www.shrm.org/resourcesandtools/hr-topics/technology/pages/employers-using-biometric-authentication.aspx.
[2] See Chiara Braghin, Biometric Authentication 1–2 (2000).
[3] U.S. Dep’t of Homeland Sec., Biometrics (2019), https://www.dhs.gov/biometrics.
[4] See Christina Ianzito, Airlines Using Facial Recognition to Speed Airport Check-In (2018); Adam Vrankulj, Walt Disney World introduces new RFID gate system (2013), https://www.biometricupdate.com/201303/walt-disney-world-introduces-biometric-verification-for-passholders; Michelle Wheeler, The future of biometric technology (2014), https://phys.org/news/2014-03-future-biometric-technology.html.
[5] See SHRM, supra note 1.
[6] See Nat’l Acads. of Scis., Eng’g, & Med., Biometric Recognition: Challenges and Opportunities 110 (Joseph N. Pato & Lynette I. Millett eds., 2010).
[7] See id. at 114–15.
[8] See id.
[9] See SHRM, How to Stay Within the Law When Using Biometric Information (2018), https://www.shrm.org/resourcesandtools/legal-and-compliance/employment-law/pages/stay-within-the-law-biometric-information.aspx.
[10] See id.
[11] See Biometric Information Privacy Act, 740 Ill. Comp. Stat. Ann. 14/15(b) (LexisNexis through P.A. 101-309, except for portions of P.A. 101-48, 101-221, 101-238, and 101-275 of the 2019 Regular Session of the 101st General Assembly).
[12] See Biometric Information Privacy Act, 740 Ill. Comp. Stat. Ann. 14/15(b)(1) (LexisNexis through P.A. 101-309, except for portions of P.A. 101-48, 101-221, 101-238, and 101-275 of the 2019 Regular Session of the 101st General Assembly).
[13] See Biometric Information Privacy Act, 740 Ill. Comp. Stat. Ann. 14/15(b)(2) (LexisNexis through P.A. 101-309, except for portions of P.A. 101-48, 101-221, 101-238, and 101-275 of the 2019 Regular Session of the 101st General Assembly).
[14] See Biometric Information Privacy Act, 740 Ill. Comp. Stat. Ann. 14/15(b)(3) (LexisNexis through P.A. 101-309, except for portions of P.A. 101-48, 101-221, 101-238, and 101-275 of the 2019 Regular Session of the 101st General Assembly).
[15] See Biometric Information Privacy Act, 740 Ill. Comp. Stat. Ann. 14/15(d)(1) (LexisNexis through P.A. 101-309, except for portions of P.A. 101-48, 101-221, 101-238, and 101-275 of the 2019 Regular Session of the 101st General Assembly).
[16] Rosenbach v. Six Flags Entertainment Corp., 129 N.E.3d 1197, 1207 (Ill. 2019).
[17] See SHRM, Illinois Biometric Class Actions Are on the Rise (2018), https://www.shrm.org/resourcesandtools/legal-and-compliance/state-and-local-updates/pages/biometric-class-actions.aspx.
[18] See Patel v. Facebook, Inc., 932 F.3d 1264, 1273–1274, 1276–1277 (9th Cir. 2019); Rosenbach v. Six Flags Entertainment Corp., 129 N.E.3d 1197, 1207 (Ill. 2019).
[19] See Patel v. Facebook, Inc., 932 F.3d 1264, 1268–1270 (9th Cir. 2019).
[20] See id.
[21] See Patel v. Facebook, Inc., 932 F.3d 1264, 1273–1274, 1276–1277 (9th Cir. 2019); Rosenbach v. Six Flags Entertainment Corp., 129 N.E.3d 1197, 1207 (Ill. 2019).
[22] See id.
[23] Patel v. Facebook, Inc., 932 F.3d 1264, 1276 (9th Cir. 2019).
[24] See Crowell & Moring, Ninth Circuit Rejects Facebook’s Article III Argument; Biometric Lawsuit Will Proceed 1–2 (2019).
[25] See id.
[26] See id.
[27] See id.
[28] See id.
[29] See id.
[30] See Int’l Ass’n of Privacy Prof’ls, GDPR Matchup: The California Consumer Privacy Act 2018 (2018), https://iapp.org/news/a/gdpr-matchup-california-consumer-privacy-act; California Consumer Privacy Act of 2018, Cal. Civ. Code § 1798.100 (Deering through Chapters 1-70, 72-136, 138-173, 175-185, 187-193, 195, 196, 198-200, 202-213, 215, 217-223, 225-243, 245-254, 257-260, and 264 of the 2019 Regular Session, including all legislation effective September 11, 2019 or earlier).
[31] Compare Biometric Information Privacy Act, 740 Ill. Comp. Stat. Ann. 14/15 (LexisNexis through P.A. 101-309, except for portions of P.A. 101-48, 101-221, 101-238, and 101-275 of the 2019 Regular Session of the 101st General Assembly) with Cal. Civ. Code § 1798.140 (Deering through Chapters 1-70, 72-136, 138-173, 175-185, 187-193, 195, 196, 198-200, 202-213, 215, 217-223, 225-243, 245-254, 257-260, and 264 of the 2019 Regular Session, including all legislation effective September 11, 2019 or earlier).
[32] See Cal. Civ. Code § 1798.140(o) (Deering through Chapters 1-70, 72-136, 138-173, 175-185, 187-193, 195, 196, 198-200, 202-213, 215, 217-223, 225-243, 245-254, 257-260, and 264 of the 2019 Regular Session, including all legislation effective September 11, 2019 or earlier) (defining personal information as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household”).
[33] See California Consumer Privacy Act of 2018, Cal. Civ. Code § 1798.100 (Deering through Chapters 1-70, 72-136, 138-173, 175-185, 187-193, 195, 196, 198-200, 202-213, 215, 217-223, 225-243, 245-254, 257-260, and 264 of the 2019 Regular Session, including all legislation effective September 11, 2019 or earlier).
[34] Cal. Civ. Code § 1798.150(a)(1) (Deering through Chapters 1-70, 72-136, 138-173, 175-185, 187-193, 195, 196, 198-200, 202-213, 215, 217-223, 225-243, 245-254, 257-260, and 264 of the 2019 Regular Session, including all legislation effective September 11, 2019 or earlier).
[35] See id.
[36] See Commercial Facial Recognition Privacy Act of 2019, S. 847, 116th Cong. (2019).
[37] See id.
[38] See Commercial Facial Recognition Privacy Act of 2019, S. 847, 116th Cong. § 6(a) (2019).
[39] See generally SHRM, Use of Biometric Data Grows, Though Not Without Legal Risks (2018), https://www.shrm.org/resourcesandtools/hr-topics/technology/pages/biometric-technologies-grow-.aspx (discussing how “a rise in class-action lawsuits against companies in some states suggests organizations need written policies and procedures regarding how they use, store and secure biometric data”).