Madison Square Garden’s Use of Facial Recognition Software to Create “Enemy Ban” For Adverse Attorneys Draws Scrutiny, Reflects Changing Uses of Biometric Software
While Madison Square Garden might normally make headlines for musical artists or sporting events, the venue’s parent company, MSG Entertainment, has been in the spotlight following media and regulator attention regarding its use of facial recognition technology to ban certain individuals from its venues. Read on to learn more about the ban and its implications for other uses of facial recognition technology.
First, some background. MSG Entertainment’s use of biometric facial recognition came under scrutiny last December, when an attorney employed by a law firm engaged in litigation against MSG Entertainment was denied entry to the Radio City Christmas Spectacular, which she was attending with her child. She was stopped by the venue’s security staff, who knew her name and the firm she was associated with, and who purportedly informed her that she had been identified by the venue’s facial recognition system as part of an “attorney exclusion list.”
This was not the only instance in which an attorney was seemingly denied entry based solely on the fact that the attorney, or the attorney’s firm, was engaged in litigation against MSG. Based on several news reports, the company has a policy of excluding from its venues not only attorneys representing parties engaged in litigation against MSG Entertainment, but also all attorneys employed by the firms handling those litigations, and uses software to identify those attorneys from their photos on the firms’ websites. For example, a Long Island attorney was banned from MSG before a Knicks-Celtics game after her law firm filed a suit on behalf of a fan who fell from a skybox at MSG during a Billy Joel concert, and another attorney was stopped from entering an MSG venue for a Rangers game because the attorney was employed by a firm suing MSG.
At least two law firms filed suit against MSG Entertainment in December 2022 over the ban. Although these suits did not raise biometric or AI-based claims, they alleged violations of New York state civil rights laws and prima facie tort claims, and requested a declaratory judgment in addition to a temporary restraining order, preliminary injunction, and permanent injunction. The ban has been met with criticism from the judges presiding over these actions, including Chancellor Kathaleen McCormick of the Delaware Court of Chancery, who remarked that MSG Entertainment’s letter reinforcing the ban was “the stupidest thing [she’d] ever read.”
The debate over MSG Entertainment’s facial recognition software illustrates the divide between consumer perception of using facial recognition for authentication or verification purposes, which has generally become more accepted, versus using such technology for real-time surveillance or identification outside of the context of express consumer consent.
This shifting public perception of the various purposes for which facial recognition may be utilized is also consistent with recent legislative activity. For example, in response to the recent events at MSG Entertainment’s venues, a bill was introduced in the New York state legislature to add “sporting events” to the list of public places of entertainment that are barred from refusing entry to individuals with a valid ticket. New York State Senator Brad Hoylman-Sigal condemned MSG Entertainment’s policy, stating, “MSG claims they deploy biometric technology for the benefit of public safety when they remove sports fans from the Garden. This is absurd given that in at least four reported cases, the patrons who were booted from their venues posed no security threat and instead were lawyers at firms representing clients in litigation with MSG.”
Although the bill does not specifically address the use of facial recognition technology, it would nonetheless work to limit the ways in which such technology is used. Similarly, New York Attorney General Letitia James penned a letter to MSG Entertainment warning that the ban could violate anti-discrimination laws and could chill attorneys from taking on certain types of litigation against the company.
Biometric technology has been a focus of state regulation for some time, most significantly with Illinois’ Biometric Information Privacy Act (“BIPA”); Texas’ Capture or Use of Biometric Identifier Act (“CUBI”); and Washington’s HB 1493. While BIPA is considered the most stringent of the three state statutes, each imposes certain requirements relating to notice, consent, and data security measures for biometric information or identifiers. BIPA also contains a private right of action permitting the recovery of statutory damages, which has made it a frequent target for class action litigation. New biometric privacy bills have also recently been introduced in New York, Hawaii, Mississippi, and Maryland, which would similarly regulate the collection and use of all forms of biometric data.
Lawmakers have also enacted legislation at a local level to govern the use of facial recognition technology and, more specifically, to thwart potential improper uses of the technology. In late 2020, Portland, Oregon became the first U.S. jurisdiction to ban the use of facial recognition by the private sector, clarifying in the prefatory materials for the ordinance that lawmakers were primarily concerned with the use of facial recognition for surveillance purposes within physical spaces and its corresponding potential risks for misidentification and misuse. New York City has already enacted a municipal-level ordinance regulating the use of biometrics-powered technologies by “commercial establishments.”
As a result of certain high-profile incidents, including those discussed above related to MSG Entertainment, more states may be inclined to enact biometric privacy bills modeled after BIPA (or to take a more tailored approach that still provides certain protections addressing biometric privacy concerns). Simultaneously, these developments may also encourage lawmakers contemplating regulating the use of this technology in other jurisdictions—but who have not yet introduced legislation and who lack an appetite for passing an outright ban—to push forward with additional biometric regulations.
MSG Entertainment is due to respond to Attorney General Letitia James’s Letter by February 13, 2023 “to state the justifications for the Company’s Policy and identify all efforts you are undertaking to ensure compliance with all applicable laws and that the Company’s use of facial recognition technology will not lead to discrimination.” For updates on MSG Entertainment’s response and other developments relating to facial recognition software in New York, Privacy World will be there to keep you in the loop.