Striking a Balance: Managing the Workplace with Data-Driven Solutions
Tuesday, July 18, 2017

Most of us encounter the use of analytics in our everyday lives and give little thought to it. Have you ever applied for a credit card or loan and been asked to provide a list of your outstanding financial obligations? Or perhaps you applied for health insurance and were required to provide a summary of your health history. Providers request this information to help determine whether you are creditworthy or insurable, based on an analysis of others with similar histories. Welcome to the world of analytics.

But what about the use of analytics to manage the workplace? Imagine being able to predict which of several hundred job applicants are most likely to be successful on the job. Or being able to predict which employees are most likely to leave the organization in the future, or worse, file a charge. Analytics can be used to assess employee engagement, and it can even be used to optimize employee development initiatives.

Leveraging workplace analytics in this way may help companies streamline processes, saving time and money. But are there risks? Reports from both the Federal Trade Commission and the White House have warned of the risk of making biased decisions based on analytics. Last fall, the Equal Employment Opportunity Commission even held a public meeting on the use of big data in employment, during which it examined the risks and benefits of big data analytics in the workplace.

Despite the risks, properly designed analytics platforms can yield a host of benefits and may significantly lessen the likelihood of liability. Of course, algorithms used by employers to make decisions could be tainted by bias; for example, race and gender could be incorporated into an algorithm used by company officials to determine who should be hired or promoted. Even if race and gender are not explicitly included, an algorithm could result in the unintentional disproportionate exclusion of a particular race or gender group, that is, disparate impact. But these concerns also exist absent the use of algorithms. Humans, by their very nature, bring unintentional biases reflecting their life experiences and intuition to everyday decisions. Humans may also bring inconsistency to the decision-making process. Properly designed analytics platforms based on neutral data science are highly consistent and efficient.

Indeed, algorithms should not be designed to explicitly incorporate protected characteristics such as race or gender, and employers must monitor their analytics use for evidence of disparate impact. The most effective of these platforms provide guidance, not answers; employers should never rely on them as the sole basis for a decision.
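
To see what this kind of monitoring can look like in practice, the short sketch below, written in Python, applies the EEOC's "four-fifths rule" of thumb: a group whose selection rate falls below 80 percent of the most-favored group's rate is flagged for closer review. The sample data, function names, and threshold handling are illustrative assumptions, not part of any particular analytics platform.

from collections import defaultdict

def selection_rates(records):
    # records: iterable of (group, hired) pairs, one per applicant,
    # where hired is True if the applicant was selected.
    applicants = defaultdict(int)
    hires = defaultdict(int)
    for group, hired in records:
        applicants[group] += 1
        if hired:
            hires[group] += 1
    return {g: hires[g] / applicants[g] for g in applicants}

def adverse_impact_flags(records, threshold=0.8):
    # Flag any group whose selection rate is below `threshold` (80% by
    # default) of the highest group's rate -- the four-fifths rule of
    # thumb for possible adverse impact.
    rates = selection_rates(records)
    top = max(rates.values())
    return {g: (rate / top if top else 1.0) < threshold
            for g, rate in rates.items()}

# Hypothetical applicant data, for illustration only.
data = [("A", True), ("A", True), ("A", False), ("A", False),
        ("B", True), ("B", False), ("B", False), ("B", False)]
print(selection_rates(data))       # {'A': 0.5, 'B': 0.25}
print(adverse_impact_flags(data))  # {'A': False, 'B': True} -> group B flagged

A flagged result is not itself proof of disparate impact; it is a signal that the underlying selection process, algorithmic or human, warrants closer scrutiny.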
