Deploying a Holistic Approach to Automated Employment Decision-Making in Light of NYC’s AEDT Law
Friday, February 3, 2023

The Equal Employment Opportunity Commission’s first commission meeting of 2023, entitled Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier, kicked off the new year by addressing artificial intelligence (AI) in the workplace from several angles.  A common sentiment among the event’s panelists was that AI and related technologies used in the workplace must be developed and used in a manner that accounts for and protects civil rights.  As emphasized throughout the EEOC’s meeting, the tenets of fairness, justice, and equality cannot be ignored as AI and related technologies become more prevalent.  Instead, it is imperative that these tools be designed and implemented with these considerations in mind, and with confidence that they comply with the growing list of applicable laws without generating or perpetuating discriminatory practices.  Indeed, these tools, if designed intelligently and carefully, have the potential both to revolutionize employment decision-making processes and to serve as a bulwark against discriminatory practices.

These considerations are of particular salience as New York City’s Local Law 144 (which regulates the use of these tools, as discussed below) continues to move toward enforcement starting in April 2023.  The proposed guidance and public commentary regarding the nascent law, which regulates automated employment decision tools (or “AEDTs,” as Local Law 144 calls them), continue to develop.  This ongoing discussion makes two things clear: (i) employers often experience difficulty developing and utilizing tools that do not run afoul of applicable employment discrimination law; and (ii) an increasing number of employers are exploring automated approaches to employment decision-making in an effort to eradicate bias and discrimination in the workplace.  Intentional and unconscious bias in hiring, advancement, and retention practices continues to exist, and these issues persist even with the creation and use of automated tools.  Indeed, even if unintentional, discrimination can occur when facially neutral employment practices (including the use of AI) have a disproportionate adverse impact on members of a protected class.  Employers that use AEDTs, or intend to start using them, should consider how to deploy these tools responsibly and in a compliant manner, especially when furthering workplace inclusivity objectives.

Complying with NYC Local Law 144

Before proceeding further, it is important to understand the basic contours of NYC Local Law 144, including which issues remain unsettled.  As we discussed in our previous post, NYC Local Law 144 represents a first-of-its-kind law that regulates the use of AEDTs in the workplace.  NYC’s broad law, unlike narrower laws passed in other states such as Maryland and Illinois that regulate facial recognition and video technology, requires employers to take several affirmative steps before using AEDTs in employment decisions.  Specifically, employers must (i) subject AEDTs to a “bias audit” conducted within one year of their use; (ii) ensure that the results of such audits are publicly available; (iii) provide particular notices to job candidates regarding the employer’s use of these tools; and (iv) allow candidates or employees to request alternative evaluation processes as an accommodation in certain circumstances.

As a threshold matter, employers must recognize the myriad ways they may already be using AEDTs that fall within the ambit of the law.  These could include resume scanners that use AI or machine learning to sort through and select candidates for further consideration based on keywords and experiences the employer believes may equate to success in a given position, software that measures an employee’s aptitude for a certain job, or computer-based monitoring software that evaluates employees’ performance.
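
To make the category concrete, below is a minimal, purely hypothetical sketch (in Python) of the kind of keyword-scoring resume screener described above.  The keywords, weights, and threshold are invented for illustration and are not drawn from any real tool or from the law itself.

```python
# Hypothetical illustration only: a minimal keyword-scoring resume screener of the
# kind that could qualify as an AEDT.  Keywords, weights, and the threshold are
# invented for this sketch.
KEYWORD_WEIGHTS = {"python": 2.0, "project management": 1.5, "sql": 1.0}
THRESHOLD = 2.5  # candidates scoring at or above this value are advanced

def score_resume(resume_text: str) -> float:
    """Sum the weights of the keywords found in the resume text."""
    text = resume_text.lower()
    return sum(weight for keyword, weight in KEYWORD_WEIGHTS.items() if keyword in text)

def would_advance(resume_text: str) -> bool:
    """Return True if the tool would advance the candidate for further review."""
    return score_resume(resume_text) >= THRESHOLD

print(would_advance("Experienced in Python and SQL reporting"))  # True (score 3.0)
```

Even a simple scorer like this can function as the sole or primary factor in screening decisions, and its keywords and weights can create a disparate impact if they correlate with protected characteristics, which is precisely the kind of question the law’s bias audit is designed to surface.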

Apart from the question of which tools actually constitute AEDTs subject to the law, many other questions emerged following the passage of Local Law 144, such as (i) what must a “bias audit” include; (ii) who can qualify as an “independent auditor” capable of conducting the bias audit; (iii) what must the bias audit ascertain; and (iv) how much information must employers publish for employees and candidates?  Notably, these questions persisted despite the New York City Department of Consumer and Worker Protection (“DCWP”) issuing a first set of proposed rules in the fall of 2022.  In response to public commentary and after further deliberation, DCWP issued revisions to its proposed rules for Local Law 144 on December 15, 2022, which provide additional clarity regarding certain provisions of the law. For example:

  • AEDTs are tools “derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issue simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making.” The proposed rules explain that AEDTs meet the “substantially assist or replace discretionary decision making” standard if they are: (1) the sole factor in an employment decision, (2) more heavily weighted among several other factors, or (3) used to overrule conclusions derived from other factors.  Some public commenters noted that this proposed standard is more restrictive than lawmakers had intended and could subject fewer AEDTs to the law’s requirements.

  • The law makes clear that bias audits for AEDTs must be conducted by “independent auditors.”  The proposed rules state that an independent auditor must be “capable of exercising objective and impartial judgment” and cannot: (i) be involved in using, developing, or distributing the AEDT; (ii) have an employment relationship with an employer or employment agency that uses the AEDT, or with a vendor that developed or distributes it; or (iii) have a direct or material indirect financial interest in such an employer, agency, or vendor.

  • The proposed rules state that employers using the same AEDT can rely on a single bias audit covering multiple employers, so long as each employer using the AEDT provides historical data to the independent auditor or can explain why such data is not available.  “Historical data” in this context means “data collected during an employer[’s] use of an AEDT to assess candidates for employment or employees for promotion.”

  • Regarding accommodation, notice, and publication requirements: while employers using AEDTs must provide notices allowing individuals to “request an alternative selection process or accommodation” if one is available, the new law does not create new accommodation rights.

There are still open questions as to the jurisdictional reach of the law as well.  Local Law 144 defines “employment decision” to mean screening “candidates for employment or employees for promotion within [New York City].”  The law further notes that employers are only required to provide the required notices to employees or candidates “who reside[] in [New York City].”  DCWP’s proposed rules do not address the multitude of scenarios in which non-NYC employers target NYC residents for hiring or promotions, or the circumstances under which those employers could find themselves covered by the law.  Conversely, it is unclear whether NYC employers that target non-NYC candidates or employees must comply with the law’s requirements for those individuals, though a plain reading of the law suggests they might fall outside its reach.  While it is unclear whether DCWP will issue a third round of proposed rules, these questions may only be answered once the law is enforced.  Importantly, DCWP has agreed to postpone enforcement of the law until April 15, 2023, even though the law technically took effect on January 1, 2023.

Using Local Law 144 as a Tool to Advance DE&I Objectives

A key legislative purpose behind the required bias audit is to ensure that AEDTs are being utilized in a manner that does not violate New York City’s Human Rights Law—the broad anti-discrimination ordinance that applies to almost all NYC employers.  Although employers are now required to conduct these bias audits if they want to utilize AEDTs, these audits have the potential to substantially assist employers in attaining diversity, equity and inclusion objectives by ensuring that the AEDTs they use do not create or perpetuate bias or workforce disparities. 

Of note, the bias audit affords employers an opportunity to review and scrutinize how a given AEDT works in practice and to make corrections based on the results of a given audit.  These corrections may be essential to avoiding potential violations of the City’s Human Rights Law (i.e., claims of disparate impact or disparate treatment discrimination), but they can also ensure that the AEDT is genuinely useful to the employer.  When auditing a given AEDT each year, employers should also review the metrics and data they are using to confirm that the AEDT is properly aligned with the employer’s goals and objectives and is in fact advancing diversity in decision-making processes.  These tools are, after all, intended to assist employers in making employment decisions.  Thus, employers should treat the bias audit requirement as an opportunity not only to advance their own DE&I goals, but also to ensure that AEDTs are working as intended by helping employers select qualified, diverse candidates through better, more informed choices.
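
To illustrate the kind of metric a bias audit typically examines, the sketch below is a hypothetical Python example, not a statement of what Local Law 144 or the DCWP’s proposed rules require.  It computes selection rates by demographic category and an impact ratio against the category with the highest rate; the category labels and outcomes are invented, and the 0.8 benchmark reflects the EEOC’s familiar “four-fifths” rule of thumb for identifying potential adverse impact.

```python
# Hypothetical illustration: selection rates and impact ratios for an AEDT's
# outcomes, the kind of disparate-impact metric a bias audit examines.
# The category labels and outcomes below are invented for this sketch.
from collections import Counter

# Each record: (demographic category, whether the AEDT advanced the candidate)
outcomes = [
    ("Category A", True), ("Category A", True), ("Category A", False), ("Category A", True),
    ("Category B", True), ("Category B", False), ("Category B", False), ("Category B", False),
]

totals = Counter(category for category, _ in outcomes)
advanced = Counter(category for category, was_advanced in outcomes if was_advanced)

# Selection rate per category: candidates advanced / candidates assessed
rates = {category: advanced[category] / totals[category] for category in totals}

# Impact ratio: each category's selection rate divided by the highest selection rate.
# A ratio below 0.8 is a common red flag under the EEOC's "four-fifths" rule of thumb.
highest_rate = max(rates.values())
for category, rate in rates.items():
    ratio = rate / highest_rate
    flag = "  <-- review" if ratio < 0.8 else ""
    print(f"{category}: selection rate {rate:.2f}, impact ratio {ratio:.2f}{flag}")
```

A metric like this does not by itself establish or rule out discrimination, but tracking it over successive audits can show whether a tool is moving the employer toward, or away from, its DE&I objectives.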

What Should Employers Do Now?

The guidance in this area continues to evolve.  Government enforcement agencies, including the EEOC, have taken note of these trends and have eyed workplace AI tools with increased scrutiny.  For instance, as noted in the EEOC’s January 2023 Draft Strategic Enforcement Plan for 2023-2027, the agency has identified workplace discrimination via AI tools as a key enforcement objective.  The EEOC’s January 31st hearing highlighted issues such as the complexities involved in developing trustworthy AI and related standards, the need for transparency and compliance assistance guidance, a desire for EEOC certifications for audits, and small business safe harbors.  The EEOC indicated that it will issue new guidance to clarify how employers can meet their obligations under applicable law, but that it would like input from software developers, vendors, and employers and will provide a robust comment period before issuing regulatory guidance.  In 2022, the EEOC also published guidance regarding the Americans with Disabilities Act (ADA), cautioning employers that, absent proper safeguards, individuals with disabilities may be disproportionately impacted by the increased use of AI tools in employment decisions.  The Biden Administration also recently released a “Blueprint for an AI Bill of Rights,” which, among other things, advocates for the development of sufficient protections against algorithmic discrimination, including technology bias audits.  The European Union has also proposed the Artificial Intelligence Act, which could influence the development of similar U.S. laws.

Employers must heed these developments and work with their trusted advisors to audit their use of AEDTs and organize their workplace data in a manner that will facilitate legal compliance.   NYC employers that are already using automated technologies in their employment decision-making processes, or plan to implement them in the workplace in the future, should consider taking the following actions:

  • Review NYC Local Law 144 and the proposed rules to understand new compliance obligations, and monitor changes in the law and related agency guidance.

  • Assess which categories of automated tools and technologies the employer uses in its workplace decision-making processes, and determine with counsel whether these tools are legally compliant.

  • Incorporate counsel in the bias audit process to ensure attorney-client privilege protection where possible. 

  • Organize job descriptions and workplace demographic data in a manner that facilitates proper training of algorithmic tools (e.g., by ensuring that the “success” variables input into these tools are not proxies for impermissible discrimination), and periodically test tools to confirm they do not create a disparate impact (see the sketch following this list).

  • Adhere to applicable legal requirements for requests for accommodation throughout these processes.

  • Devise prudent recordkeeping procedures to demonstrate compliance with applicable laws and NYC Local Law 144’s requirements.

  • Work with third-party vendors, including AEDT vendors, to ensure their compliance with the new law and make any desired updates to service agreements.

  • Train Human Resources professionals and other managers involved in hiring and employment decision-making processes to ensure they are familiar with the new legal requirements and can address any issues that may arise during implementation.

  • If the employer is utilizing AEDTs, ensure that notices are effectively provided to candidates and employees, where appropriate.
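
On the proxy-variable and periodic-testing point above, the following is a hypothetical sketch of one simple way an employer’s team might flag a “success” variable that tracks protected-class membership too closely.  The feature values, group flags, and review threshold are invented, and any real testing protocol should be designed with counsel and the independent auditor.

```python
# Hypothetical sketch only: checking whether a facially neutral "success" feature
# used by an AEDT differs sharply across groups, which may signal that it is
# acting as a proxy for a protected characteristic.  Data and the 0.25 review
# threshold are invented for illustration.
import statistics

# Each record: (feature value the tool relies on, protected-group membership flag)
records = [
    (0.90, True), (0.80, True), (0.85, True),
    (0.40, False), (0.50, False), (0.45, False),
]

in_group = [value for value, flag in records if flag]
out_group = [value for value, flag in records if not flag]

gap = statistics.mean(in_group) - statistics.mean(out_group)
print(f"Mean feature gap between groups: {gap:.2f}")

# A large, persistent gap suggests the feature may encode group membership and
# warrants closer review before it is used to rank or screen candidates.
if abs(gap) > 0.25:
    print("Flag this feature for review with counsel and the independent auditor.")
```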

Parting Words

In recent years, our workplaces have evolved to meet certain challenges and new realities.  Employers continue to navigate remote and hybrid work arrangements, talent wars, generational shifts, data privacy concerns for workers, and an increasingly pointed focus on diversity, equity and inclusion initiatives and ESG (Environmental, Social, and Governance) considerations in the workplace.  The proliferation of AI and various automated employment decision tools is just the most recent workplace development.  Employers now utilize technologies to increase worker productivity, improve team communication, and streamline services, and increasingly, these technologies are used to aid recruitment, interviewing, hiring, and retention processes.  AI and related technologies can be a boon to employers in these areas, but they are not issue-free.  Against the backdrop of the EEOC’s focus on this issue, the developing legal infrastructure governing the use of AI and AEDTs, and an ever-increasing body of knowledge concerning how these tools work, it is incumbent upon employers to deploy these tools thoughtfully and with intent.  Taking a holistic approach to integrating these tools into workplace decision-making, and to their ongoing use, is imperative.

 
