Exploring Intentional Bias in the Marketing of Consumer Products
Tuesday, February 16, 2021

Advances in artificial intelligence (“AI”) continue to present exciting opportunities to transform decision-making and targeted marketing within the world of consumer products. While AI has been touted for its capabilities in creating fairer, more inclusive systems, including with respect to lending and creditworthiness, AI models can also embed human and societal biases in a way that can result in unintended, and potentially unlawful, downstream effects.

When we talk about bias in AI, we mostly focus on unintentional bias. What about intentional bias? The following hypothetical illustrates the problem as it relates to the marketing of consumer products.

In targeted advertising, an algorithm learns all sorts of things about a person from social media and other online sources, and then targets ads to that person based on the data collected. Let’s say that the algorithm targets ads to African Americans. By “intentional” we don’t mean to suggest that the software developer has racist or otherwise nefarious objectives relating to African Americans. Rather, we mean that the developer simply intends to make use of whatever information is out there to target ads to that particular population, even if that data is race itself or data that correlates with race, such as ZIP code. This raises a number of interesting questions.
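To make the mechanics concrete, here is a minimal sketch (synthetic data, hypothetical feature names, and an off-the-shelf logistic regression; none of it drawn from any real campaign) of how a targeting model that is never given race as an input can still end up serving an ad along racial lines once it is trained on a correlated proxy such as ZIP code.

    # Minimal sketch with synthetic data: the model never sees race, only a
    # ZIP-code grouping that (by assumption here) correlates with race.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    race = rng.integers(0, 2, size=n)  # 1 = member of the group being targeted
    # Hypothetical proxy: the ZIP-code group matches race 80% of the time.
    zip_group = np.where(rng.random(n) < 0.8, race, 1 - race)

    # Historical "clicked on the rent-to-own ad" labels that happen to track the proxy.
    clicked = (rng.random(n) < np.where(zip_group == 1, 0.30, 0.10)).astype(int)

    # Train on the proxy alone -- race is not a feature.
    model = LogisticRegression().fit(zip_group.reshape(-1, 1), clicked)

    # Yet the predicted ad-serving scores split cleanly along racial lines.
    scores = model.predict_proba(zip_group.reshape(-1, 1))[:, 1]
    for flag, label in [(1, "targeted group"), (0, "everyone else")]:
        print(f"mean ad-serving score, {label}: {scores[race == flag].mean():.3f}")

The particular model is beside the point; any learner trained on a feature that closely tracks race will reproduce that correlation in deciding who sees the ad.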

Setting aside certain situations involving bona fide occupational qualifications (for those familiar with employment law), would this be permissible legally? What if the product is a line of hair care products or a particular genre of music? What about rent-to-own furniture, targeted on the basis of data suggesting that African Americans purchase such furniture at above-average rates? Taking this scenario a step further, what if it is well documented that rent-to-own arrangements are a significant contributing factor to poverty among African Americans?

Bias can also be introduced through the way the data are collected or selected for use. What if the data, collected from predominantly African American ZIP codes, suggest that African Americans typically are willing to pay higher rental rates, and the advertisements directed to African Americans therefore include those higher rates? Could the companies promoting these advertisements on the basis of those statistical correlations be subject to liability for predatory or discriminatory lending practices? Do we still need human judgment to make sure that AI-supported decision-making is fair?
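As a rough illustration of how the collection problem alone can drive the advertised price, the sketch below (purely synthetic numbers, assumed only for the example) gives both groups the same underlying willingness to pay; the only difference is which outlets the rate observations were gathered from, yet a naive model that averages what it was handed recommends a higher advertised rate in those ZIP codes.

    # Purely synthetic illustration of collection bias: both groups are assumed
    # to have the same true willingness to pay ($100/week).  The observations
    # gathered in the predominantly African American ZIP codes happen to come
    # from outlets with higher sticker prices, so they look like higher
    # willingness to pay.
    import numpy as np

    rng = np.random.default_rng(1)
    true_rate = 100.0                      # assumed identical for everyone

    obs_targeted_zips = rng.normal(true_rate * 1.25, 5, size=500)
    obs_other_zips = rng.normal(true_rate, 5, size=500)

    # A naive pricing model averages whatever it was handed, per ZIP-code group,
    # so the skew in how the data were collected flows straight into the ad.
    print(f"rate advertised in the targeted ZIP codes: ${obs_targeted_zips.mean():.2f}")
    print(f"rate advertised elsewhere:                 ${obs_other_zips.mean():.2f}")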

These are among the questions that we’ll explore in our upcoming panel on targeted advertising, and we invite you to join us.

