Privacy World 2022 Year in Review: Biometrics and AI
Thursday, January 26, 2023

2022 was another year of high activity and significant developments in artificial intelligence (“AI”) and biometric privacy matters, including issues arising under the Illinois Biometric Information Privacy Act (“BIPA”) and other statutes.  This continues to be one of the most frequently litigated areas of privacy law, with several notable rulings and emerging patterns of new activity by the plaintiffs’ bar.  Following up on Privacy World’s Q2 and Q3 2022 Artificial Intelligence & Biometric Privacy Quarterly Newsletters, read on for a recap of key developments and insight into where 2023 may be headed.

I.     LITIGATION TRENDS AND KEY DECISIONS

A.     First BIPA Class Action Jury Trial Results in Win for Plaintiff

The world of biometric privacy litigation experienced a development noteworthy enough to put it on equal footing with Rosenbach v. Six Flags Ent. Corp., 2019 IL 123186, 129 N.E.3d 1197 (Ill. 2019)—which held that actual injury is not required to pursue BIPA claims—when a jury found in favor of a class of Illinois truck drivers in the first BIPA class action tried to verdict. In that case, Rogers v. BNSF Ry. Co., Richard Rogers alleged that his former employer, BNSF Railway Co., violated BIPA when it collected and stored his and other truck drivers’ biometric data without obtaining their consent or informing them of the company’s data retention policies. BNSF itself, however, was not involved in any activities associated with the collection or use of biometric data. Instead, the company contracted with a third-party vendor, Remprex, to operate the equipment that collected Rogers’ fingerprints, and it was Remprex that purportedly failed to follow the requirements of Illinois’s biometric privacy statute.

After closing arguments, the jury needed less than an hour to return its verdict in favor of the class of truck drivers. The jury decided only the issue of liability and was not tasked with calculating damages, which will be assessed by the court at a later date. With that said, even assuming that the court applies BIPA’s lower $1,000 statutory damages amount for negligent violations and awards damages only for the initial finger scan of each class member that ran afoul of the law (as opposed to every scan under a continuing-violation theory), a back-of-the-envelope calculation still puts damages at a staggering $44 million.
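
For readers who want to see the arithmetic, below is a minimal sketch in Python. The class size of roughly 44,000 is an assumption implied by the $44 million figure above, and the statutory amounts come from BIPA’s damages provision (740 ILCS 14/20): $1,000 per negligent violation and $5,000 per intentional or reckless violation.

```python
# Back-of-the-envelope BIPA damages estimate for a Rogers-sized class.
# The class size below is an assumption implied by the $44M figure in the
# text; actual exposure depends on the court's eventual damages ruling.
NEGLIGENT_DAMAGES = 1_000   # 740 ILCS 14/20(1): per negligent violation
RECKLESS_DAMAGES = 5_000    # 740 ILCS 14/20(2): per intentional/reckless violation

class_size = 44_000         # hypothetical: one initial non-compliant scan per member

low_end = class_size * NEGLIGENT_DAMAGES    # one negligent violation each
high_end = class_size * RECKLESS_DAMAGES    # one reckless violation each

print(f"Negligent theory: ${low_end:,}")    # Negligent theory: $44,000,000
print(f"Reckless theory:  ${high_end:,}")   # Reckless theory:  $220,000,000
```

As the higher figure shows, a continuing-violation theory or a finding of recklessness would multiply the exposure well beyond the $44 million floor.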

There are several major takeaways from the Rogers verdict, including the following:

  • The fact that the jury needed under an hour to reach its verdict indicates that, in the jurors’ eyes, whether the conduct at issue violated BIPA was not even a close call;

  • The jury’s verdict rendered against the defendant—despite the fact that the railroad did not itself actively collect, use, or possess any biometric data—provides additional context regarding the unsettled issue of vicarious liability in BIPA class action disputes; and

  • The anticipated impact of the Rogers verdict will be a continued increase in the volume of BIPA class action filings moving forward.

B.     Retailers Maintain Status as Primary Target for BIPA Class Action Suits

One of the most significant trends in the BIPA class action litigation space over the course of 2022 was the continued targeting of online retailers in class action lawsuits alleging violations of Illinois’s biometric privacy statute. Generally speaking, this can be attributed to (among other factors) retailers’ extensive use of technology that allegedly implicates facial recognition (at least according to plaintiffs’ counsel), as well as the availability of liquidated damages on a per-violation basis under BIPA.

Of note, retailers faced a high volume of BIPA lawsuits in connection with their use of virtual try-on (“VTO”) tools, which use facial feature detection to place a product, such as eyewear or cosmetics, on an image of the user’s face so the shopper can see how it might look before making a purchase. Importantly, despite the questionable merits of the claims underlying these lawsuits, i.e., whether the VTO tools in question actually engage in scans of face geometry, the majority of defendants in these class actions have been unable to obtain dismissals at the motion to dismiss stage. Retailers are also being targeted in BIPA class actions in a broad range of other contexts, such as the use of AI voice assistants that facilitate customers’ drive-thru orders, as well as restaurants’ use of automated voice order (“AVO”) systems that enable customers to place orders over the phone.
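
To make concrete what “facial feature detection” involves, the following is a minimal, illustrative sketch using Google’s open-source MediaPipe library; the library choice, file name, and function are assumptions for illustration only, as the article does not identify the software any particular retailer uses. The landmark mesh such tools return is the kind of facial geometry data that plaintiffs characterize as a “scan of face geometry” under BIPA, even when no name or identity is attached.

```python
# Illustrative only: extract the facial landmarks a VTO-style tool might use
# to anchor a virtual product (e.g., glasses) on a user's face. MediaPipe is
# an arbitrary open-source choice; retailers' actual implementations vary.
import cv2
import mediapipe as mp

def extract_face_landmarks(image_path: str):
    """Return normalized (x, y, z) coordinates for each detected face landmark."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True,
                                         max_num_faces=1) as face_mesh:
        results = face_mesh.process(rgb)
    if not results.multi_face_landmarks:
        return None  # no face detected
    # 468 3-D points describing the contours of the face -- the data that
    # plaintiffs allege constitutes a "scan of face geometry" under BIPA.
    return [(lm.x, lm.y, lm.z)
            for lm in results.multi_face_landmarks[0].landmark]

landmarks = extract_face_landmarks("shopper.jpg")  # hypothetical input image
if landmarks:
    print(f"Detected {len(landmarks)} facial landmarks")
```

Whether such a landmark mesh, standing alone and never linked to an identity, triggers BIPA is precisely the question courts have so far been unwilling to resolve in defendants’ favor at the pleadings stage, as the Wise v. Ring decision discussed below illustrates.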

The trend of BIPA suits targeting retailers is likely to extend into 2023. As such, all retailers (if they have not already done so) should consult with experienced biometric privacy counsel to review their current practices relating to the collection and use of biometric data and remediate any compliance gaps immediately.

C.     Courts Continue to Favor Expansive Interpretation of Key Aspects of BIPA’s Statutory Text When Confronted with 12(b)(6) Motions

Another major trend in BIPA litigation in 2022 was state and federal courts’ expansive interpretation of key aspects of Illinois’s biometric privacy statute at the pleadings stage, allowing claims to proceed into discovery (which can be extremely costly and time-consuming for defendants). In particular, courts shifted from their original, relatively narrow reading of BIPA Section 15(c) profiting claims to a much more liberal interpretation of that compliance requirement, with several courts finding that even the most conclusory allegations were sufficient to withstand motions to dismiss for failure to state a claim.

For example, in Karling v. Samsara, Inc., No. 22 CV 295, 2022 U.S. Dist. LEXIS 121318, at *18-19 (N.D. Ill. July 11, 2022), the court held that allegations of non-compliance with Section 15(c)—specifically, that the defendant was “profit[ing] from contracts to capture [biometric] data and provide services [utilizing that data] to employers”—were sufficient to avoid dismissal under Federal Rule of Civil Procedure 12(b)(6).

Similarly, in Mahmood v. Berbix, Inc., No. 22 CV 2456, 2022 U.S. Dist. LEXIS 153010, at *6-7 (N.D. Ill. Aug. 25, 2022), the court held that a plaintiff plausibly alleged that a defendant violated Section 15(c) merely by setting forth allegations that the defendant’s customer paid for access to its facial recognition platform to verify the plaintiff’s age and identity before she rented a car. Notably, the Berbix court reasoned that “[i]n short, [the defendant’s] collection and use of biometrics is a necessary component to its business model,” which the court found satisfied the standard for plausibly alleging an unlawful sales or profiting claim under Illinois’s biometric privacy statute—a looser standard for Section 15(c) claims as compared to earlier BIPA opinions.

In addition, courts also continued to interpret the key term “scans of face geometry,” as used in BIPA, in an extremely broad manner. For example, in Wise v. Ring LLC, No. 20 CV 1298, 2022 U.S. Dist. LEXIS 13899, at *4 (W.D. Wash. Aug. 8, 2022), a Washington federal court rejected the argument that video data of individual bystanders with no contractual relationship to the defendant, collected from doorbell cameras and purportedly used to create face templates, did not constitute biometric identifiers or biometric information. The defendant had argued that a mere scan of face geometry—absent identifying information, such as a name, tying that geometry to a person—did not implicate the risks the Illinois legislature sought to mitigate in enacting BIPA; the court disagreed.

With courts favoring more expansive interpretations of BIPA’s provisions, strict compliance with Illinois’s biometric privacy statute will remain important in 2023 for mitigating the already significant liability exposure that non-compliance creates.

D.     Discrimination & Bias Issues Relating to AI Tools Garner Attention From Federal Regulators

Today, AI continues to offer companies a myriad of benefits when used in commercial operations, including increased efficiency, reduced costs, enhanced customer experiences, and smarter decision-making. At the same time, however, growing reliance on these tools has garnered increased interest from lawmakers and regulators concerned about potential fairness and bias issues associated with the use of this technology.

In 2022, the U.S. Equal Employment Opportunity Commission (“EEOC”) signaled its intent to closely scrutinize the use of AI tools in hiring and employment decisions to ensure that employers and vendors use these technologies fairly and consistently with federal equal employment opportunity laws. In May, the EEOC issued The Americans With Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees, extensive guidance designed to assist employers in avoiding violations of the Americans With Disabilities Act (“ADA”) when using AI to assess job candidates and employees. The guidance provides a detailed discussion of the primary ways in which the use of AI tools can result in disability discrimination, while also offering several “promising practices” that employers can implement to comply with the ADA when leveraging the benefits of AI technologies. Of note, within just a few days of issuing its guidance, the EEOC filed a federal age discrimination suit against a software developer, alleging that the developer’s application software engaged in intentional discrimination in violation of the Age Discrimination in Employment Act (“ADEA”) through programming that solicited applicants’ birthdates and automatically rejected applicants based on their age.
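
For illustration only, the snippet below sketches the kind of automated screening rule described in the EEOC’s complaint: intake programming that solicits a birthdate and automatically rejects applicants above a cutoff age. The cutoff value and function names are hypothetical, not drawn from the actual complaint; the point is that even a few lines of intake logic can amount to intentional age discrimination under the ADEA.

```python
# Hypothetical sketch of the screening logic the EEOC challenged under the
# ADEA. The cutoff age here is illustrative, not taken from the complaint.
# Rules like this are exactly what the EEOC alleges is unlawful.
from datetime import date

AGE_CUTOFF = 55  # hypothetical threshold; any age-based cutoff raises ADEA concerns

def auto_screen_applicant(birthdate: date, today: date | None = None) -> bool:
    """Return True if the application proceeds; False if auto-rejected.

    An intake rule that rejects applicants solely because of their age is
    the kind of automated decision the EEOC alleges violates the ADEA.
    """
    today = today or date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < AGE_CUTOFF  # the age-based auto-rejection step

# A 57-year-old applicant would be silently screened out:
print(auto_screen_applicant(date(1965, 3, 1), today=date(2022, 5, 15)))  # False
```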

In addition, last year also saw the Consumer Financial Protection Bureau (“CFPB”) release Circular 2022-03: Adverse Action Notification Requirements in Connection With Credit Decisions Based on Complex Algorithms, which cautions creditors about the need to comply with the Equal Credit Opportunity Act (“ECOA”) when making credit decisions with the aid of complex algorithms.

Finally, as discussed in more detail below, the Federal Trade Commission (“FTC”) also reemphasized the priority it has placed on policing AI, with several notable developments involving the country’s de facto federal privacy regulator taking place in 2022.

Taken together, companies should take note of this heightened federal regulatory focus on scrutinizing the use of AI tools, especially as it relates to potential discriminatory impacts on protected classes, and should ensure that their AI practices comply with applicable law to manage the associated legal risks.

II.     ARTICLE III STANDING

BIPA litigation often involves the issue of whether a plaintiff has Article III standing to maintain a lawsuit in federal court. Defendants have tended to remove BIPA cases to federal court, while plaintiffs often prefer to keep suits in state court (where Article III standing is not required). As Privacy World readers will likely be aware, to establish standing in federal court, a plaintiff must allege an injury in fact that is traceable to the defendant’s conduct and that can be redressed by a favorable ruling.

One key ruling illustrating the standing dispute was Zellmer v. Facebook, No. 18 CV 1880, 2022 U.S. Dist. LEXIS 206475 (N.D. Cal. Nov. 14, 2022), in which the court dismissed plaintiff’s Section 15(a) claim after numerous disputes and prolonged briefing over plaintiff’s standing. Plaintiff had argued that defendant violated Section 15(a) of BIPA by failing to make publicly available a retention and destruction schedule for biometric identifiers, and violated Section 15(b) by collecting biometric information without the required consent from users. After defendant obtained summary judgment as to the Section 15(b) claim, the court found that plaintiff had alleged only a violation of a public obligation and had not articulated how it harmed or impacted him in any way. A general allegation that plaintiff’s rights were harmed was found to be insufficient. Zellmer emphasizes that Article III standing is a highly fact-specific determination in each BIPA case, based on the unique set of allegations the case presents.

While removal is generally a favorable option for lawsuits originally filed in state court, an action does not have to originate in state court for a defendant to procure dismissal based on lack of standing. For example, in Theriot v. Louis Vuitton N. Am., Inc., No. 22 CV 2944, 2022 U.S. Dist. LEXIS 218972 (S.D.N.Y. Dec. 5, 2022), defendant operated a virtual try-on feature that plaintiffs alleged impermissibly scanned and stored users’ facial geometry, in violation of Sections 15(a) and 15(b) of BIPA. In addition to moving to dismiss plaintiffs’ Section 15(b) claim on separate grounds (a motion that was denied), defendant argued that plaintiffs lacked Article III standing for their Section 15(a) claim. The court agreed, looking to the Seventh Circuit’s differing rulings in Bryant v. Compass Group USA, Inc. and Fox v. Dakkota Integrated Systems, LLC in determining that plaintiffs had failed to allege a particularized harm. The court found that the allegations in the complaint—that defendant had failed to develop and make public a retention and destruction schedule for biometric identifiers, without more—were more similar to the allegations in Bryant, which were found insufficient to support a Section 15(a) claim.

Even where neither party raises standing as an issue, a court may sua sponte examine a plaintiff’s standing.  For instance, in Harvey v. Resurrection Univ., No. 21 CV 3203, 2022 U.S. Dist. LEXIS 154550 (N.D. Ill. Aug. 29, 2022), plaintiffs’ Section 15(a) and 15(c) claims were remanded upon a sua sponte examination of standing, even though neither party had raised standing as an issue. Harvey is a valuable reminder that Article III standing is an intrinsic component of any BIPA litigation in federal court.

III.     FTC DEVELOPMENTS

In 2022, the FTC engaged more fully with issues bearing upon AI and biometrics. First, Alvaro Bedoya was sworn in as the newest FTC commissioner, solidifying a Democratic majority. Bedoya’s swearing-in paved the way for the FTC to focus on privacy concerns, including increased scrutiny of AI and biometrics.

2022 also saw the FTC investigating and bringing enforcement actions against companies for misusing AI. In March 2022, the FTC reached a settlement with WW International, Inc. for collecting personal information from minor users of a fitness app without parental permission. A complaint brought by the Department of Justice alleged that WW’s collection of minors’ data violated the Children’s Online Privacy Protection Act of 1998 (“COPPA”). As part of the settlement, WW agreed to pay a $1.5 million penalty, delete information collected from users under age 13, and erase the algorithms derived from that data.

In 2022, the FTC also opened an investigation into Match Group, which owns the dating site OkCupid. The investigation followed the 2021 dismissal of a BIPA lawsuit against Clarifai, Inc., a technology company specializing in AI. The underlying suit alleged that Clarifai violated BIPA by harvesting facial data from OkCupid without obtaining consent from users or making necessary disclosures. At present, the FTC is investigating whether any entities engaged in unfair or deceptive trade practices in mining data from OkCupid and in using that data in Clarifai’s facial recognition technology.

On June 16, 2022, the FTC issued a congressional report concerning the use of AI to combat various online harms in response to the 2021 Appropriations Act. The report acknowledged that while AI helps stop the spread of harmful online content, it also poses problems regarding inaccurate algorithms, discrimination, and invasive surveillance. The report offered several recommendations, including a legal framework to prevent further harms, human intervention and monitoring, and accountability for entities using AI.

Meanwhile, on August 11, 2022, the FTC issued an Advance Notice of Proposed Rulemaking on commercial surveillance and lax data security practices (the “Commercial Surveillance ANPR”). Twenty-one of the ANPR’s 95 questions concerned AI and whether the FTC should take steps to regulate or limit these technologies. The Commercial Surveillance ANPR provides detailed insight into the current FTC’s concerns about artificial intelligence, particularly its risks of discrimination. A bipartisan group of state attorneys general joined the discussion, penning a November 17 letter expressing concern over commercial surveillance and data privacy, in particular biometrics and medical data.

IV.     LOOKING AHEAD BY LOOKING BACK: 2022 BIOMETRIC LEGISLATIVE PROPOSALS 

Review of 2022 Biometric Privacy Legislative Developments

Lawmakers in a number of states attempted (albeit unsuccessfully) to enact new biometric privacy laws across the country during the 2022 legislative cycle. In so doing, lawmakers took several different approaches to regulating the collection and use of biometric data.

A.     Broad Biometric Privacy Bills

In 2022, the most straightforward method lawmakers used in their attempts to enact greater controls over the commercial use of biometrics was broad biometric privacy bills targeting the use of all forms of biometric data, similar to BIPA, Texas’s Capture or Use of Biometric Identifier Act (“CUBI”), and Washington’s HB 1493. Six states—California, Kentucky, Maryland, Maine, Missouri, and West Virginia—introduced such bills seeking to regulate all types of biometric technologies.

Several of the bills introduced in 2022—such as California’s Senate Bill 1189 and Kentucky’s House Bill 32—were carbon copies of BIPA. While these bills would have created broad liability exposure on a scale similar to that of BIPA, they would not have substantially increased companies’ compliance burdens due to their similarities with Illinois’s biometric privacy statute.

Other states, however, attempted to enact legislation that departed significantly from the BIPA blueprint. Unlike the BIPA copycat bills discussed above, these bills not only would have created significant liability exposure, but would have also required wholesale modifications to companies’ existing biometric privacy compliance programs due to the range of unique provisions in these pieces of legislation.

For example, Maryland’s Biometric Identifiers Privacy Act incorporated not only some of the common elements seen across current biometric privacy laws, such as data destruction and informed consent, but also many other provisions traditionally confined to consumer privacy laws like the CCPA and CPRA. Among other things, Maryland’s legislation:

  • Provided consumers with the “right to know,” which would have required the disclosure of a range of information regarding companies’ collection and use of biometric data upon a consumer’s request;

  • Afforded consumers non-discrimination rights and protections, including a ban on requiring consumers to submit their biometric data in order to obtain a product or a service from a company; and

  • Imposed requirements and limitations on processors of biometric data, including restrictions on the use of biometric data for any purposes other than providing services to the company.

South Carolina also introduced an interesting piece of hybrid legislation, known as the Biometric Data Privacy Act (“BDPA”). Like the Maryland bill, the BDPA set forth a range of consumer rights regarding the collection and use of biometric data. Importantly, however, the BDPA went far beyond its Maryland counterpart in several key respects. For example, the BDPA would have required training for all employees on how to direct consumers in exercising the rights afforded to them under the law. Most notably, the BDPA would have imposed breach notice obligations far surpassing those imposed under traditional data breach notification statutes by requiring businesses to notify all consumers—even those who never submitted their biometric data to the company—within a mere 72 hours of discovering a breach. To further complicate matters, companies would have been subject to fines of $5,000 per consumer not notified in a timely manner.

B.     Targeted Facial Recognition and Voice Biometrics Bills

Other states took a more focused approach to their legislation. Instead of seeking to regulate all types of biometric data, these bills singled out specific types of biometric technologies—and facial recognition in particular.

The targeted facial biometrics bills introduced in 2022 were a continuation of a trend that began in late 2020, when Portland, Oregon, became the first jurisdiction in the nation to enact a blanket ban on private-sector use of facial recognition. Not long after, Baltimore enacted its own private-sector facial recognition ban that closely mirrored the Portland ordinance. With that said, Baltimore’s ban departs from its Portland counterpart in one significant respect: it imposes criminal penalties for violations of the law.

In 2022, there were several attempts to put in place narrow, targeted bills focused on facial biometrics. For example, Massachusetts introduced House Bill 117, which targeted facial recognition exclusively. While the Massachusetts bill did not go so far as to impose a complete ban on facial recognition software and systems, it would have imposed stringent limitations on their use, including a de facto ban on utilizing facial biometrics for surveillance purposes.

Beyond facial recognition, Oklahoma introduced the Voice Recognition Privacy Act of 2022, a bill that singled out voice biometrics. The bill’s name is somewhat misleading, however, as the legislation would have regulated not just voiceprints—true biometric data used to identify or verify the identity of individual speakers—but any type of recorded voice data.

Finally, some states introduced bills that targeted both facial and voice biometrics, such as Vermont’s House Bill 75. Of note, the Vermont legislation would have required opt-in consent, as well as the posting of clear and conspicuous signage notifying individuals that biometric technology is in use on the premises.

C.     American Data Privacy and Protection Act

Lastly, federal lawmakers also introduced a bill of their own—the American Data Privacy and Protection Act (“ADPPA”), which would have regulated biometric data in a uniform fashion across all 50 states. Of note, the ADPPA would have narrowly limited the collection and use of biometric data to only those instances where such activities were strictly necessary to provide or maintain a specific product or service requested by the subject of the biometric data, or under one of ten narrowly-tailored “permitted purposes” set forth in the statute, such as complying with a legal obligation. The federal privacy bill would have also restricted companies from disclosing, releasing, sharing, disseminating, or otherwise making biometric data available to third parties unless the transfer was necessary to facilitate data security or for the authentication of individuals’ identities.

Importantly, while the bill would have generally preempted any state laws that are “covered by the provisions” of the ADPPA or its regulations, the federal legislative proposal did not preempt all state privacy laws, providing carve-outs for both Illinois’s BIPA and laws that solely address facial recognition or related technologies. In sum, the ADPPA would have added significant complexity to the legal landscape had it made its way into law, regulating biometric data in jurisdictions where no such regulation currently exists while keeping in place current biometrics-related laws, each with its own unique nuances.

Takeaways and Analysis

So what does this all mean? There are several key themes and takeaways from the flurry of activity that took place in the biometric privacy legislative space in 2022.

A.     Increased Activity on the Legislative Front

First, the likelihood of greater regulation of the collection and use of biometric data will continue to trend upward. In 2022, there was a marked increase in the number of biometric privacy bills introduced as compared to 2021, which itself saw a sizeable increase over 2020 figures. And while none of the bills introduced in 2022 made their way into law, these legislative proposals signal lawmakers’ intent to continue their efforts to bring such bills to fruition for the foreseeable future. Further, as federal lawmakers continue to drag their feet on enacting a nationwide, uniform biometric privacy statute, lawmakers at the state and municipal levels will face added pressure to put in place regulation of their own, further increasing the likelihood that new biometrics laws will be enacted sooner rather than later.

Ultimately, as states and cities add new laws governing biometric technology in more parts of the country, companies will see a corresponding increase in exposure and related risks faced in connection with the collection and use of biometric data.

B.     Continued Focus on Targeting Facial Recognition Technology

Second, companies should expect a continued focus on facial recognition for greater regulation in 2023 and beyond.

In 2022, lawmakers again showed their desire to single out facial recognition with bills that solely sought to regulate this particular form of biometrics. This focus parallels the sentiment of a number of states and cities that have voiced concerns over facial biometrics and its potential for misuse. Further, 2022’s bills—albeit unsuccessful—came on the heels of several recently enacted ordinances that outright prohibit the use of facial recognition by private entities. It should also be noted that in addition to the Portland and Baltimore bans, New York City’s “commercial establishments” ordinance—while addressing all types of biometric data—was clearly designed by municipal lawmakers to target facial recognition; they even singled out the technology by name in the text of the ordinance.

Taken together, companies should expect to see more of the same throughout 2023 and beyond, with lawmakers continuing to take aim at facial recognition to place greater controls and restrictions over the use of this specific form of biometrics.

C.     More Complex & Costly Compliance Obligations Stemming From “Hybrid” Biometric Privacy Laws

Third, companies would face substantially enhanced compliance burdens under the new “hybrid” biometric privacy legislation introduced in 2022, such as the Maryland Biometric Identifiers Privacy Act, which blends traditional biometric privacy legal principles with those ordinarily encompassed in consumer privacy statutes.

With legislators’ increasing interest in enacting new, far-reaching biometric privacy frameworks, it is only a matter of time before these hybrid laws are added to the already-complex patchwork of biometrics-focused laws and regulations. Importantly, these hybrid statutes—each with its own unique requirements—will significantly increase compliance burdens for all companies that collect and use biometric data, while also ushering in a correspondingly high increase in liability exposure. Thus, while biometric privacy compliance may seem like a fairly straightforward task today, that may change in the immediate future.

D.     Increased Reliance on Private Rights of Action as Laws’ Primary Enforcement Mechanism

Lastly, companies should expect an increasing share of new legislation to feature private rights of action as its main enforcement mechanism.

The majority of bills introduced in 2022 contained a private right of action, rather than regulatory enforcement by state attorneys general, as their sole enforcement mechanism. This represents a continuation of a trend that has emerged in recent years, with lawmakers clearly favoring class litigation as the main method for penalizing non-compliance with biometric privacy requirements.

Moreover, all four biometric privacy laws enacted in 2021—Portland’s and Baltimore’s facial recognition bans, New York City’s “commercial establishments” biometric identifier information ordinance, and New York City’s Tenant Data Privacy Act—feature this more robust form of enforcement.

Ultimately, with legislators having seemingly scrapped the idea of administrative enforcement in favor of private rights of action, companies now face the risk that the same avalanche of class litigation that has been generated by Illinois’s biometric privacy statute may soon spread to other parts of the country.

V.     CONCLUSION

Businesses across all industries are rapidly increasing their reliance on biometric data to improve the efficiency and effectiveness of their operations, and also to satisfy consumers’ growing interest in this next-generation technology. At the same time, lawmakers are also greatly increasing their efforts to put in place tighter regulations over the collection and use of biometric data. The authors predict it is only a matter of time before biometric privacy laws in the U.S. are the norm and not the exception.

While the particulars of these anticipated new laws are anyone’s guess at this juncture, companies can be reasonably certain that the majority of these new statutes will incorporate many of the same overarching privacy principles that serve as the foundation for the current body of law governing biometric data.  As such, private entities across industries can get ahead of the compliance curve by taking a proactive approach and embedding these common core privacy principles into their biometric privacy compliance programs. Developing tailored, comprehensive compliance programs that integrate these core principles can help ensure ongoing compliance not just with current biometrics regulation, but with projected legislative trends as well—allowing companies to stay a step ahead of constantly evolving biometric privacy regulation.

 
