June 24, 2019


Big Data Analytics in Hiring

While the phrase has different meanings depending on the context, “big data” typically refers to data that is so large in volume that computers, rather than traditional methods of analysis, are necessary to understand it. “Big data analytics,” a phrase often used to refer to both the data itself and its computerized analysis, encompasses the volume of data, the speed at which it is collected, the types of data involved, and how best to decipher it. Marketing departments have long used big data analytics to target potential customers with pinpoint accuracy. Human resources (“HR”) departments increasingly consider whether and how to incorporate big data tools into their hiring processes.

The promise offered by big data analytics, and certainly the vision sold by many of the vendors that specialize in selling big data tools for application in the HR context, includes better outreach to potential applicants, increased efficiency in the hiring process, fewer person-hours spent combing through resumes, and the selection of more qualified and better-matched candidates. The market includes a variety of analytical tools for these purposes, such as algorithms that scan resumes to match candidates to jobs by simulating human hiring tendencies, measure candidates on personality traits deemed critical for success in the job, and assess the cognitive abilities of each candidate against those of high-performing incumbents. Vendors market their big data tools as predictive algorithms that will allow their clients to hire the right people by using data that maps the applicant’s profile onto the company’s available openings. Ultimately, by hiring the “right” people, companies will improve productivity, increase retention, and spend fewer resources on employee selection.

Many of these big data tools use artificial intelligence (“AI”) or machine learning to help select attributes and candidates for hiring. Machine learning takes the baseline algorithms that make selection decisions and improves upon them by learning from “mistakes.” For example, a job role might change organically such that an old job description no longer captures the skills an applicant needs, but an AI algorithm trained to mine the data of current employees in the role might identify character traits that help “define” the skills needed to succeed in the role. By taking these character attributes of current employees into account as a machine learns, hiring decisions potentially improve as the selection algorithm changes.
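
By way of illustration only, the sketch below (written in Python with the scikit-learn library) shows the basic mechanics of a selection model trained on attributes of current employees and then used to score an applicant. The attributes, the “high performer” label, and the applicant profile are all hypothetical, and the sketch is not a representation of how any particular vendor’s product actually works.

    # Minimal, hypothetical sketch: training a selection model on attributes of
    # current employees and using it to score a new applicant. Feature names,
    # values, and the "high performer" label are invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Rows are current employees; columns are hypothetical numeric attributes
    # (e.g., years of experience, assessment score, tenure in prior roles).
    X_employees = np.array([
        [5, 82, 3.1],
        [2, 90, 1.4],
        [8, 75, 4.0],
        [1, 60, 0.8],
    ])
    y_high_performer = np.array([1, 1, 1, 0])  # 1 = rated a high performer

    model = LogisticRegression().fit(X_employees, y_high_performer)

    # The model "defines" success only through patterns in the incumbent data,
    # which is how historical bias can be absorbed into the algorithm.
    applicant = np.array([[3, 85, 2.0]])
    print(model.predict_proba(applicant)[:, 1])  # predicted probability of success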

Before blindly adopting big data analytics, however, employers must be aware of the potential risks. For example, an employer cannot easily “look under the hood” to see precisely how the selection algorithm is operating, partially because vendors consider the algorithm to be proprietary and confidential, and partially because the vendors themselves do not know exactly how the algorithm has changed as a result of machine learning. Without the ability to assess what the selection algorithm is doing, employers may have difficulty determining which factors, if any, are a potential source of bias. Additionally, in the event of litigation involving an AI algorithm’s selection criteria, the employer may be unable to produce in discovery sufficient evidence of the decision-making process. Indeed, the algorithm that the employer is required to defend might differ from the version that was used at the time of the hiring decision, and oftentimes even the vendor or data scientist who created the algorithm cannot explain what it is doing.

One can argue that big data analytics can lend consistency to the hiring process, reducing the subjectivity in selection decisions and potentially limiting the likelihood of a disparate treatment discrimination claim. Nevertheless, employers should be careful that the algorithm does not incorporate intentionally discriminatory factors. Moreover, employers should be aware that the increased consistency and objectivity also increases the potential for disparate impact claims. If the AI-influenced decision results in a statistically significant adverse impact on a group of candidates possessing one or more protected characteristics, employers may be more vulnerable to class or collective action allegations.

Big data analytics also presents special challenges related to its impact on persons with disabilities. Where a person’s ability to use the technology constitutes an impediment to a proper assessment, the analytical tool may lead to claims of discrimination. Further, federal law precludes an employer from obtaining information about a candidate’s medical history or condition before making a hiring decision. To the extent a big data tool collects information about medical history or causes candidates to disclose such information at an inappropriate time, the tool may violate discrimination law.

While a complete machine takeover of the hiring process remains unlikely, big data analytics continues to be an attractive tool to assist HR departments. To that end, employers should consider the following practical steps to safeguard against machine learning run amok in the hiring process:

  • Conduct a thorough due diligence of the vendor and its product(s), ask to view the algorithm and its different permutations, and seek indemnification to limit liability in the selection process.
  • Conduct a periodic statistical sampling of the AI-selected applicant pool and candidates through an adverse impact analysis (an illustrative calculation follows this list).
  • Implement appropriate data security measures, such as determining how relevant data will be hosted and identifying a core group of individuals within HR who will have access to that data.
  • Understand document retention obligations so as to properly comply with Equal Employment Opportunity Commission (“EEOC”) guidance, U.S. Department of Labor (“DOL”) regulations, and state law.
  • Determine what will be done with the data and how it can be accessed if and when the agreement with the vendor ends or litigation arises.
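
As to the adverse impact analysis referenced in the second bullet, one common starting point is the “four-fifths” rule from the EEOC’s Uniform Guidelines on Employee Selection Procedures, under which a selection rate for a protected group that is less than 80 percent of the rate for the highest-selected group is generally regarded as evidence of adverse impact, typically paired with a test of statistical significance. The sketch below, written in Python with hypothetical applicant counts, is illustrative only; a real analysis should be designed with counsel and a qualified statistician.

    # Hypothetical adverse impact check on one cycle of AI-screened applicants,
    # using the four-fifths rule and Fisher's exact test for significance.
    from scipy.stats import fisher_exact

    # Counts of applicants advanced or screened out by the tool (hypothetical).
    comparison_group = {"advanced": 48, "rejected": 52}
    protected_group = {"advanced": 30, "rejected": 70}

    rate_comparison = comparison_group["advanced"] / sum(comparison_group.values())
    rate_protected = protected_group["advanced"] / sum(protected_group.values())
    impact_ratio = rate_protected / rate_comparison

    print(f"Selection rates: {rate_comparison:.2f} vs. {rate_protected:.2f}")
    print(f"Impact ratio: {impact_ratio:.2f}")

    # Four-fifths rule: a ratio below 0.80 is generally treated as evidence of
    # adverse impact and warrants closer review.
    if impact_ratio < 0.8:
        print("Impact ratio below 0.80: flag the selection procedure for review.")

    # Fisher's exact test: is the difference in selection rates statistically
    # significant, or plausibly due to chance?
    _, p_value = fisher_exact([
        [comparison_group["advanced"], comparison_group["rejected"]],
        [protected_group["advanced"], protected_group["rejected"]],
    ])
    print(f"p-value: {p_value:.3f}")

A statistically significant disparity of the kind such a test can surface is precisely what fuels the disparate impact exposure discussed above.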

These steps are just a few of the considerations that employers should take into account when evaluating big data tools. For ultimate success, employers should be sure to involve all stakeholders, including business managers, HR, and legal counsel, in determining whether to adopt these tools.

©2019 Epstein Becker & Green, P.C. All rights reserved.

About this Author

Adam S. Forman
Member

ADAM S. FORMAN is a Member of the Firm in the Employment, Labor, and Workforce Management practice, based in Chicago and Detroit (Metro). As noted in the 2015 edition of Chambers USA, Mr. Forman “is a renowned expert in social media issues relating to the workplace” and also “focuses on litigation, training and preventive advice on the employment side.” A frequent writer and national lecturer on issues related to technology in the workplace, such as social media, Internet, and privacy issues facing employers, Mr. Forman is often interviewed by...

312-499-1468
Nathaniel M. Glasser
Member

NATHANIEL M. GLASSER is a Member of the Firm in the Labor and Employment practice, in the Washington, DC, office of Epstein Becker Green. His practice focuses on the representation of leading companies and firms, including publishing and media companies, financial services institutions, and law firms, in all areas of labor and employment relations.

Mr. Glasser’s experience includes:

  • Defending clients in employment litigation, from single-plaintiff to class action disputes, brought in federal court, state court, and arbitration tribunals involving claims of unlawful discrimination, harassment, retaliation, breach of contract, defamation, alleged violation of the FLSA and state wage and hour laws, and whistleblowing

  • Representing clients facing charges at the U.S. Equal Employment Opportunity Commission, the U.S. Department of Labor, the District of Columbia Commission on Human Rights, the New York State Division of Human Rights, the New York City Commission on Human Rights, and other administrative agencies at the federal, state, and local levels

202-861-1863
Matthew Savage Aibel
Associate

MATTHEW SAVAGE AIBEL is an Associate in the Litigation and Employment, Labor & Workforce Management practices, in the New York office of Epstein Becker Green.

Mr. Aibel:

  • Assists in the representation of clients in complex commercial litigation, business disputes, and breach-of-contract matters

  • Provides assistance with litigation matters involving the breach of non-competition and non-solicitation agreements, the misappropriation of trade secrets, and...

212-351-4814