Artificial Intelligence in Business: Will AI Solve Your Recruiting Headache… Or Create a Bigger One?
In a new series of blog posts, we will discuss Artificial Intelligence (AI) and the benefits and challenges it presents to businesses and employers. Our first post of the series explores AI and its impact on employment decisions.
AI recruiting software presents considerable benefits to employers: it can save time and money by helping employers efficiently find, filter, and select potential candidates. Developers claim that AI software ensures employers will find the best candidates by helping them shorten the recruitment process, interface with candidates, streamline back-and-forth communications and scheduling, and even eliminate a human recruiter's implicit biases. Because of these obvious benefits, many employers now rely on AI recruiting software to modernize their recruitment processes. However, that same software can expose employers to liability for discriminatory hiring practices.
For example, although AI recruiting software might be able to eliminate some implicit biases held by human recruiters, an AI can replace those biases with its own through a process called machine learning.
When an engineer designs an AI program, the engineer uses learning algorithms so that the AI remains accurate as it receives new data; this process is called machine learning. Machine learning is essential for any AI to be effective: each time the AI receives data, it interprets the patterns and relationships in that data and "learns" from them.
However, depending on the algorithms used, the data sets they are trained on, the complexity of the data fed into them, and any inadvertently introduced human biases, an AI can develop biases of its own, called machine biases.
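To make this concrete, here is a minimal, purely hypothetical sketch (invented data and an invented scoring rule, not any real recruiting product) of how a model that "learns" keyword weights from past hiring decisions can absorb a bias hidden in that history. In the toy data below, past hires happen to share an irrelevant trait ("rugby"), so the learned weights favor it over an equally irrelevant trait ("choir"):

```python
# Toy illustration of machine bias (hypothetical data, not a real product):
# a scorer that learns keyword weights from past hiring decisions.
from collections import Counter

# Hypothetical historical data: resumes as keyword lists, plus whether the
# candidate was hired. The past decisions happen to favor "rugby", an
# irrelevant trait, over "choir", an equally irrelevant trait.
history = [
    (["python", "sql", "rugby"], True),
    (["java", "rugby"], True),
    (["python", "sql", "choir"], False),
    (["java", "sql", "choir"], False),
]

def learn_weights(history):
    """Weight each keyword by how often it appears in hired vs. rejected resumes."""
    hired, rejected = Counter(), Counter()
    for keywords, was_hired in history:
        (hired if was_hired else rejected).update(keywords)
    return {word: hired[word] - rejected[word] for word in hired | rejected}

def score(resume, weights):
    """Sum the learned weights of the keywords on a resume."""
    return sum(weights.get(word, 0) for word in resume)

weights = learn_weights(history)
# Two candidates with identical qualifications; only the irrelevant trait differs.
print(score(["python", "sql", "rugby"], weights))  # scores higher
print(score(["python", "sql", "choir"], weights))  # scores lower
```

The model never sees a protected characteristic directly, yet it still ranks the two otherwise identical candidates differently, because the pattern it "learned" simply mirrors whatever bias was embedded in the historical decisions it was trained on.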
When AI recruiting software develops machine biases, an employer using that software could be unknowingly injecting those biases into its own recruiting process.
Because of the lack of transparency around the training data used to calibrate AI recruiting software, it is important for each employer to understand the impact the software may be having on its hiring process and to create a plan to reduce its exposure to liability.
Before using AI recruiting software, each employer should investigate the software and algorithms it uses, implement proactive policies to monitor the impact that the recruiting software might have on its hiring process, and create an action plan to anticipate possible sources of liability.
Employers who already use AI recruiting software should take immediate steps to identify and eliminate unanticipated sources of liability by conducting routine assessments of their policies and procedures, updating their software regularly, reviewing the data the software collects and uses, and consulting an attorney to better understand their exposure under employment law.