October 14, 2019


Machine Learning and Sua Sponte Discrimination

It violates the Fair Housing Act to advertise in ways that deny particular segments of the housing market information about housing opportunities. It also violates New York law. But what happens when the advertising medium itself discriminates on its own? We may be about to find out.

New York Governor Andrew Cuomo recently directed the New York Department of Financial Services to investigate reports of housing providers harnessing Facebook’s abilities to target users with precision. Leveraging the unnerving amount of information that Facebook has on its users, advertisers are able to select who sees their ads (and who is blocked from seeing their ads) based on any number of protected characteristics, including race, religion, sexual orientation, and disability. When a regulated landlord makes advertising or leasing choices based on these characteristics, it’s clearly illegal.

However, we are faced with a reality where computer programs built on machine learning algorithms may discriminate on their own. Facebook allegedly uses machine learning to predict users’ responses to a given ad and may create groupings that track protected classes. As a result, an advertiser may have no intent to discriminate, yet Facebook’s machine learning may block ads from a group of users composed exclusively of members of a specific protected class, displaying the ads only to prospects in the majority ethnic, racial, or religious group.
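To see how this can happen without anyone choosing a protected characteristic, consider a toy simulation. Everything below is invented for illustration and has no connection to Facebook’s actual systems: a targeting rule never sees the protected attribute, only a ZIP code that happens to correlate with it, and the resulting audiences still split sharply along protected-class lines.

```python
import random

random.seed(0)

# Hypothetical simulation only; no real user data or advertising model.
# Each user has a protected attribute and a ZIP code correlated with it.
def make_users(n=10_000):
    users = []
    for _ in range(n):
        protected = random.random() < 0.3  # 30% belong to the protected class
        # ZIP "A" is 90% protected-class residents; ZIP "B" is 10%.
        if protected:
            zip_code = "A" if random.random() < 0.9 else "B"
        else:
            zip_code = "A" if random.random() < 0.1 else "B"
        users.append((protected, zip_code))
    return users

# A "learned" targeting rule standing in for a response-prediction model.
# It never sees the protected attribute -- only the ZIP code proxy.
def show_ad(zip_code):
    return zip_code == "B"  # the model "predicts" ZIP B responds better

users = make_users()
shown = [p for p, z in users if show_ad(z)]
hidden = [p for p, z in users if not show_ad(z)]

print(f"Protected share among users shown the ad: {sum(shown) / len(shown):.0%}")
print(f"Protected share among users never shown:  {sum(hidden) / len(hidden):.0%}")
```

Even though the rule conditions only on geography, the excluded audience is overwhelmingly protected-class members, which is exactly the pattern a regulator would scrutinize.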

But what can be done? Does our increasing reliance on AI doom us to a more discriminatory society? Maybe not. It turns out there’s reason for hope—that AI and machine learning, deployed correctly, may lead us to a more just society. In May 2019, several Berkeley professors published an academic paper comparing fintech lending to face-to-face lending. The paper reached two conclusions: (i) fintech lending results in one-third less price discrimination than face-to-face lending, and (ii) fintech lending does not discriminate in accept/reject decisions.

Two lessons are clear from the Facebook fiasco: (1) companies like Facebook that deploy artificial intelligence must understand how decisions are being made and carefully design the decisioning process to avoid claims of discrimination, and (2) attorneys representing companies that advertise on Facebook must be involved not just in reviewing the ads, but also in overseeing which groups receive them.
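One concrete way counsel could screen ad delivery is the "four-fifths rule" from the EEOC’s Uniform Guidelines on Employee Selection Procedures. That rule comes from the employment context, and applying it to ad delivery is only an illustrative analogy; the delivery counts below are invented for the example.

```python
# Illustrative audit sketch using the EEOC four-fifths rule as a rough
# screen for disparate impact. All numbers are hypothetical.
def selection_rate(shown, total):
    """Fraction of a group that actually received the ad."""
    return shown / total

def four_fifths_check(rate_protected, rate_other):
    """Flag if the protected group's rate falls below 80% of the other group's."""
    ratio = rate_protected / rate_other
    return ratio, ratio < 0.8

# Invented delivery counts: 120 of 1,000 protected-class users saw the ad,
# versus 400 of 1,000 other users.
ratio, flagged = four_fifths_check(selection_rate(120, 1000),
                                   selection_rate(400, 1000))
print(f"impact ratio = {ratio:.2f}, flagged = {flagged}")
```

An impact ratio this far below 0.8 would not prove illegal discrimination, but it is the kind of red flag that should trigger a closer review of how the audience was selected.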

Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

About this Author

Tom Kierner
Associate, Womble Bond Dickinson (Atlanta): FinTech, IP, Data Privacy, Payment Systems

Tom Kierner is a transactional attorney with a background in payment systems and financial regulations. He is a member of the firm’s FinTech and IP Transaction teams in Atlanta.

Tom advises his clients on the dynamic regulatory and legal landscape for FinTech and payments companies. He also assists his clients in negotiating and drafting agreements with banks, processors, and other service providers.

He has experience handling data privacy matters on behalf of clients, including managing data breach responses. He also has experience responding to inquiries and enforcement...

404-888-7409