June 28, 2022

Volume XII, Number 179
New York City to Restrict Use of Automated Employment Decision Tools: What Employers Should Know

Employers and employment agencies in New York City that currently use, or expect to use, automated tools to make employment decisions may wish to begin planning now for restrictions that take effect on January 1, 2023. The new law limits the types of tools that may be used and requires disclosures about those tools to candidates for employment or promotion.

What Is an “Automated Employment Decision Tool”?

Under the New York City law, an “automated employment decision tool” is broadly defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence (AI), that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” The law defines an “employment decision” as the act of screening “candidates for employment or employees for promotion within [New York City].” The law does not apply to the use of a tool that “does not automate, support, substantially assist or replace discretionary decision-making processes and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set, or other compilation of data.”

What Limitations Will Exist Concerning the Use of Automated Employment Decision Tools?

The law specifies that in order for an employer to use an automated employment decision tool, the tool must have been “the subject of a bias audit conducted no more than one year prior to the use of such tool.” A “bias audit” is defined as “an impartial evaluation by an independent auditor” and includes, but is not limited to, testing the tool to assess its “disparate impact on persons of any component 1 category required to be reported by employers pursuant to subsection (c) of section 2000e-8 of title 42 of the United States code,” which encompasses sex, race, and ethnicity. Before using an automated employment decision tool, an employer or employment agency must post on its website a “summary of the results of the most recent bias audit of such tool as well as the distribution date of the tool to which such audit applies.”
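The law does not prescribe a particular metric for a bias audit, but a common way to assess disparate impact across demographic categories is to compare selection rates and compute an impact ratio (each category's selection rate divided by the highest category's rate). The function below is an illustrative sketch of that calculation only; the function name, inputs, and the choice of metric are our assumptions, not requirements of the law.

```python
from collections import Counter

def impact_ratios(outcomes):
    """Selection rate and impact ratio per demographic category.

    outcomes: iterable of (category, selected) pairs, where
    `selected` is True if the candidate advanced.
    Returns {category: (selection_rate, impact_ratio)}, where the
    impact ratio divides each category's rate by the highest rate.
    """
    totals = Counter(cat for cat, _ in outcomes)
    chosen = Counter(cat for cat, sel in outcomes if sel)
    rates = {cat: chosen[cat] / totals[cat] for cat in totals}
    top = max(rates.values())
    return {cat: (r, r / top) for cat, r in rates.items()}

# Example: 40 of 100 category-A candidates advance (rate 0.40) and
# 20 of 100 category-B candidates advance (rate 0.20), so the
# impact ratio for category B is 0.20 / 0.40 = 0.50.
data = [("A", i < 40) for i in range(100)] + [("B", i < 20) for i in range(100)]
print(impact_ratios(data))
```

An independent auditor would, of course, apply whatever methodology the final rules require; this sketch only shows the arithmetic behind one widely used disparate-impact measure.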

What Notifications Are Required by the Law?

The law requires employers to provide notifications and disclosures to candidates for whom employment decisions will be made utilizing automated decision tools. Under the law, at least 10 business days prior to using a tool, an employer or employment agency must notify each candidate or employee who resides in New York City that an automated employment decision tool will be used in connection with the assessment or evaluation of the candidate or employee. The advance notification shall “allow a candidate to request an alternative selection process or accommodation.” Employers and employment agencies must also disclose the “job qualifications and characteristics that such automated employment decision tool will use in the assessment of such candidate or employee.” This disclosure must be made “no less than 10 business days” before an employer or employment agency uses an automated tool.

Employers and employment agencies also may post on their websites “the type of data collected for the automated employment decision tool, the source of such data and the employer or employment agency’s data retention policy.” If an employer or employment agency elects not to publicly disclose these details online, the information must be provided to a candidate or employee within 30 days of a written request from such individual. Disclosures are not required if such a notification would violate local, state, or federal law, or interfere with a law enforcement investigation.

What Are the Penalties for Noncompliance?

Failure to comply with the law may result in the imposition of civil penalties of up to $500 for a first violation and each additional violation occurring on the same day as the initial violation, and between $500 and $1,500 for each subsequent violation. The law specifies that “[e]ach day on which an automated employment decision tool is used” in violation of the provision requiring a bias audit “shall give rise to a separate violation.” Additionally, the failure to provide any of the required notices constitutes a separate violation.
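Because each day of noncompliant use is a separate violation, exposure compounds quickly. The sketch below illustrates the worst-case arithmetic under the statute's stated ranges; the helper name and the assumption of a single bias-audit violation per day are ours, not the law's.

```python
def max_penalty_exposure(days_of_use):
    """Worst-case civil penalty arithmetic (illustrative only).

    Assumes one bias-audit violation per day of noncompliant use:
    up to $500 for the first violation, and up to $1,500 for each
    subsequent violation.
    """
    if days_of_use <= 0:
        return 0
    return 500 + (days_of_use - 1) * 1500

# Example: a tool used without a current bias audit for 10 days
# could, at the top of the ranges, expose an employer to
# $500 + 9 * $1,500 = $14,000 -- before counting any separate
# violations for missing notices.
print(max_penalty_exposure(10))  # 14000
```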

In addition to outlining the administrative avenues for recovering civil penalties, the law authorizes the city’s corporation counsel to “initiate in any court of competent jurisdiction any action or proceeding that may be appropriate or necessary for correction of any violation.” Furthermore, a candidate or employee may bring a civil action and the New York City Commission on Human Rights may enforce the law.

What Other Issues May Employers Wish to Consider?

Over the next year, employers might consider arranging for an appropriate audit of any automated decision tools currently in use, or confirming that any tools acquired in the future can be appropriately audited before being placed in use. Employers and employment agencies in New York City also may want to train the individuals responsible for using these tools on the key requirements of the law. Employers also may wish to monitor initiatives by lawmakers in other jurisdictions where they operate, such as Illinois and Maryland, as additional or different measures may be required to remain compliant with other applicable laws. Employers additionally may wish to monitor related initiatives by federal lawmakers. In October 2021, the U.S. Equal Employment Opportunity Commission (EEOC) announced an initiative to evaluate the use of artificial intelligence in hiring and other employment decisions. As part of that effort, the EEOC intends to:

  • “[g]ather information about the adoption, design, and impact of hiring and other employment-related technologies” through “a series of listening sessions with key stakeholders”; and

  • “[i]ssue technical assistance to provide guidance on algorithmic fairness and the use of AI in employment decisions.”

© 2022, Ogletree, Deakins, Nash, Smoak & Stewart, P.C., All Rights Reserved. National Law Review, Volume XII, Number 179

About this Author

Shareholder

Simone Francis concentrates her practice in the areas of employment litigation, environmental counseling and litigation, and general litigation. She has represented a range of large, mid-sized, and small employers in litigation before the federal and local courts in the U.S. Virgin Islands and elsewhere in the United States, and also has acted as an advocate before administrative tribunals, including the Equal Employment Opportunity Commission, the Virgin Islands Department of Labor, the Civil Rights Commission, and the Public Employees Relations Board. In addition, Ms....

340-714-5510