The Adverse Impacts of AI in Employment Procedures Under Title VII

Technological advances have provided employers with a variety of algorithmic decision-making tools that may assist them in making employment decisions, including recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral. In response, the EEOC issued guidance on May 12, 2023, addressing employers' use of artificial intelligence (AI) in the hiring process. The guidance builds on the EEOC's longstanding positions on employment selection procedures.

Title VII of the Civil Rights Act of 1964 generally prohibits employers from using neutral tests or selection procedures that have the effect of disproportionately excluding persons based on race, color, religion, sex, or national origin, if the tests or selection procedures are not "job related for the position in question and consistent with business necessity." This is known as "disparate impact" or "adverse impact" discrimination.

In 1978, the EEOC adopted the Uniform Guidelines on Employee Selection Procedures under Title VII.[1] Under these guidelines, employers must take care to ensure that their employment practices, including algorithmic decision-making tools, do not have a disparate impact on the basis of race, color, religion, sex, or national origin, i.e., do not have the effect of excluding minority candidates from employment even though the tools are facially neutral. Where a disparate impact exists, an employer can show the selection procedure is consistent with business necessity by showing it is necessary to the safe and efficient performance of the job. The selection procedure should therefore be tied to the skills needed to perform the job successfully.

The EEOC's guidelines are applicable to algorithmic decision-making tools when they are used to make or inform decisions about whether to hire, promote, or terminate applicants or current employees. Similar to traditional selection procedures, if use of an algorithmic decision-making tool has an adverse impact on individuals of a particular race, color, religion, sex, or national origin, or on individuals with a particular combination of such characteristics, then use of the tool will violate Title VII unless the employer can show that such use is "job related and consistent with business necessity."

In most cases, employers will still be liable under Title VII for their use of algorithmic decision-making tools even if the tools are designed or administered by an outside software vendor. Similarly, employers may be held responsible for the actions of their agents, which may include software vendors. Before engaging a vendor or supplier to provide an algorithm-driven hiring tool, employers should inquire about the steps taken by the vendor/supplier to evaluate whether use of the tool causes a substantially lower selection rate for individuals with a characteristic protected by Title VII. If an employer discovers that the use of an algorithmic decision-making tool has or would have an adverse impact, it must take steps to reduce the impact or select a different tool to avoid violating Title VII.
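The "substantially lower selection rate" question is typically framed in terms of comparative selection rates. As a purely illustrative aid, not legal advice, the sketch below uses hypothetical applicant and selection counts to show how selection rates might be compared against the four-fifths rule of thumb drawn from the Uniform Guidelines; a ratio below 0.8 is generally treated as a signal to investigate further, not as conclusive proof of adverse impact.

```python
# Illustrative only: hypothetical applicant/selection counts for two groups.
# The 0.8 threshold is the "four-fifths rule" of thumb from the Uniform
# Guidelines (29 C.F.R. § 1607.4(D)); falling below it is a signal to look
# closer, not a legal conclusion.

outcomes = {
    "group_a": {"applied": 200, "selected": 60},
    "group_b": {"applied": 150, "selected": 30},
}

# Selection rate = selected / applied for each group.
rates = {g: c["selected"] / c["applied"] for g, c in outcomes.items()}
highest_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest_rate
    flag = "review for adverse impact" if impact_ratio < 0.8 else "no flag"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")
```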

Employers have already implemented different types of software that incorporate algorithmic decision-making at different stages of the employment process. These include, but are not limited to, resume scanners, employee monitoring software, virtual assistants or chatbots, video interviewing software, and testing software that provides "job fit" scores. The EEOC encourages employers to conduct self-analyses on an ongoing basis to determine whether their employment practices, including algorithmic decision-making tools, have a statistically significant adverse effect on a basis prohibited under Title VII.
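Such ongoing self-analyses are often operationalized as periodic statistical checks of selection-rate differences. The sketch below, again using hypothetical counts, runs a simple two-proportion z-test to ask whether a gap in selection rates is larger than chance alone would explain; the choice of test, grouping, and threshold are assumptions for illustration and are not prescribed by the EEOC guidance.

```python
# Illustrative only: a simple two-proportion z-test on hypothetical quarterly
# numbers from an algorithmic screening tool. This is one possible check an
# employer might run as part of an ongoing self-analysis; it does not by
# itself establish or rule out a Title VII violation.
from math import sqrt, erf

def two_proportion_z(selected_a, applied_a, selected_b, applied_b):
    """Return (z statistic, two-sided p-value) for the difference in selection rates."""
    rate_a = selected_a / applied_a
    rate_b = selected_b / applied_b
    pooled = (selected_a + selected_b) / (applied_a + applied_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / applied_a + 1 / applied_b))
    z = (rate_a - rate_b) / std_err
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, standard normal
    return z, p_value

# Hypothetical counts: group A selected 60 of 200 applicants, group B 30 of 150.
z, p = two_proportion_z(60, 200, 30, 150)
print(f"z = {z:.2f}, two-sided p = {p:.3f}")  # small p: the gap is unlikely to be chance alone
```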


[1] See 29 C.F.R. part 1607. The Guidelines were adopted simultaneously by other federal agencies under their authorities. See Uniform Guidelines on Employee Selection Procedures, 43 Fed. Reg. 38,290 (Aug. 25, 1978) (adopted by the Office of Federal Contract Compliance Programs at 41 C.F.R. part 60-3, by the Civil Service Commission at 5 C.F.R. § 300.103(c), and by the Department of Justice at 28 C.F.R. § 50.14). 

© 2023 Miller, Canfield, Paddock and Stone PLC. National Law Review, Volume XIII, Number 145

About the Authors

Deja M. Davis, Associate, Employment and Labor, Miller, Canfield, Paddock and Stone, P.L.C., Detroit, Michigan

Deja Davis is an associate in Miller Canfield's Employment and Labor Group. She earned her J.D. with honors from the University of Detroit Mercy School of Law, where she was the Moot Court Executive Director of External Competitions, president of the Black Law Students Association, and Student Chair Member of the Diversity Committee. She earned her B.S. at Pennsylvania State University, where she was a Dean's List honoree and All-Academic Big Ten honoree as a member of the track and field team.

Deja previously served as a law clerk for Lakeshore...

313-496-7950
Richard W. Warren, Principal, Labor and Employment, Miller, Canfield, Paddock and Stone, P.L.C., Detroit, Michigan

Richard Warren defends employers facing all types of wrongful discharge lawsuits, including discrimination, retaliation and harassment claims filed by employees. 

He also has a wealth of experience handling non-competition litigation and defending employers faced with complex and class action cases, including age discrimination, race discrimination and retiree health benefit claims. 

In order to help companies avoid employment lawsuits, Richard also spends a significant amount of time counseling management and human resource professionals in areas such as hiring, layoffs,...

313-496-7932