February 2, 2023

Volume XIII, Number 33


NYC Law Restricting Use of AI in Hiring Takes Effect in January: Are You Ready?

Last year, the New York City Council passed Local Law Int. No. 1894-A, which amended the City’s administrative code to afford new protections to employees during the hiring and promotion processes. The law protects those individuals from unlawful bias by the employer when automated employment decision tools are used. Employers must conduct audits of their AI tools to confirm that the tools are not biased, and the results of those audits must be published on publicly available websites. Furthermore, employers are required to disclose the data that an AI tool collects, either publicly or in response to an inquiry.

However, the City has yet to issue any guidance on the expectations or steps necessary to prepare for compliance. In particular, the law does not define what is meant by an “independent auditor.” Employers will likely rely on law firms and consulting firms to perform the audit, since the only requirement is that the party be “independent.” Compounding this lack of specificity, many employers use third-party vendors’ automated tools, so the audit must also cover outside vendors’ software and practices. The challenge will be determining what the vendors did during the construction of the AI tools. The audit will likely need to include discussions with technical experts who understand how the tools function, as well as lawyers or consultants who are well versed in potential discrimination claims.

Employers can also look to the U.S. Equal Employment Opportunity Commission’s technical assistance document, which includes guidance on AI hiring tools as well as questions to ask vendors about such software.

This law will be enforced by the City’s Office of the Corporation Counsel; non-compliant employers could face a $500 fine for a first violation and $1,500 for each subsequent violation. The fines are multiplied by the number of AI tools used and the number of days the employer fails to correct the non-compliance. There is no private right of action under the law, but if the City issues fines for the use of discriminatory tools, there is still potential for class actions in federal court alleging discrimination. If you have yet to determine how you will comply with this new law, now is the time.
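To illustrate how quickly exposure can compound, here is a minimal sketch of the penalty arithmetic described above. It assumes violations accrue per tool per day, with only the very first violation billed at the lower $500 rate; the law's actual counting rules may differ, so treat this as a rough illustration rather than legal guidance.

```python
def estimated_penalty(num_tools: int, days_noncompliant: int) -> int:
    """Rough estimate of fines: $500 for the first violation,
    $1,500 for each subsequent one, accruing per tool per day.
    (Illustrative assumption only; not an official calculation.)"""
    total = 0
    for tool in range(num_tools):
        for day in range(days_noncompliant):
            is_first = (tool == 0 and day == 0)
            total += 500 if is_first else 1500
    return total

# Example: two AI tools left uncorrected for five days
print(estimated_penalty(2, 5))  # 500 + 9 * 1500 = 14000
```

Even under this conservative reading, a handful of tools left unaudited for a few weeks produces six-figure exposure, which is why early compliance planning matters.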

Copyright © 2023 Robinson & Cole LLP. All rights reserved. National Law Review, Volume XII, Number 244

About this Author

Kathryn Rattigan, Partner

Kathryn Rattigan is a member of the firm's Business Litigation Group and Data Privacy + Cybersecurity Team. She advises clients on data privacy and security, cybersecurity, and compliance with related state and federal laws. Kathryn also provides legal advice regarding the use of unmanned aerial systems (UAS, or drones) and Federal Aviation Administration (FAA) regulations. She represents clients across all industries, including insurance, health care, education, energy, and construction.

Data Privacy and Cybersecurity Compliance

Kathryn helps clients comply...

401-709-3357