Recent FTC Report Shows Increase in Dark Patterns
Thursday, November 30, 2023

It is impossible to use the internet without encountering digital advertising. Sometimes ads can be useful, sharing new information about products that can help us. But sometimes they are more than just an annoyance—they threaten our online privacy and our wallets. There are certain rules and regulations that advertisers must follow in order to be in compliance with federal law. Advertising is not the free-for-all that it may seem. When advertisers attempt to take advantage of consumers by flouting these regulations, they can be held accountable through legal action.

Recent FTC reports show that there has been a marked rise in sophisticated “dark patterns,” or design practices meant to trick consumers into subscriptions, payments, or giving up their privacy. Dark patterns are different from sophisticated user design or particularly effective advertising. They are intentionally deceptive practices that violate Federal Trade Commission (FTC) standards.

If you have information about dark patterns in advertising, speak to a consumer protection attorney. False advertising and the use of dark patterns can be difficult to spot, but they are illegal. Speaking up may help prevent others from being taken advantage of, and hold deceitful advertisers accountable with the full might of the law.

What is a Dark Pattern?

A dark pattern is part of a larger design methodology that attempts to trick users into doing something. Dark patterns are also known as “manipulative design.” They prey on weaknesses in the human psyche and exploit habits to which we have become accustomed. They are an attempt to manipulate human decision making and skew our choices in favor of a company’s profit motive. The phrase “dark pattern” was coined by London-based UX designer Harry Brignull, who defined it as “a user interface that has been carefully crafted to trick users into doing things that are not in their interest and usually at their expense.”

Dark patterns are particularly prevalent in e-commerce, where deceptive practices are weaponized against tech users of all ages. Both tech-savvy teenagers and older internet users can fall victim to dark patterns. One of the central goals of computer engineers, designers, and statisticians is getting users to direct their attention toward their products, even when doing so is not in the consumer’s true interest. Dark patterns differ from regular advertising, however, because they erect barriers and deploy deceptive design to make people pay attention to, and pay for, goods and services they did not originally want.

What Are Dark Patterns in Advertising?

A dark pattern in advertising might prey on consumers’ FOMO, or “fear of missing out.” A website might advertise a sale that ends in a few hours, or a limited release of a good or service. If these limitations are false, or if the website then pushes the consumer toward another, more expensive equivalent as the next best option, it may be an example of a dark pattern. Advertisers might also hide additional fees and surcharges until the final payment screen, showing a lower price throughout the rest of the online interaction. The dark pattern at play here banks on the fact that a consumer will click “check out” without double-checking to see that the price has suddenly jumped at the last moment.

What Are Dark Patterns in Marketing?

Dark patterns in marketing might reel consumers in with offers of free trial periods, and then make the process of opting out nearly impossible. Terms and conditions might be laid out in difficult-to-read font, obscured by colors or shapes, or be buried after paragraphs of text. Dark patterns in marketing exist to mislead a consumer by promising a different product or experience than what they can actually receive.

What Are Dark Patterns in UX?

Dark patterns in UX are only limited by designers’ imaginations. They may look like highlighting certain choices with bright colors or pop-up screens, while obscuring opt-out areas or critical information about hidden costs. Dark patterns may be as simple as reorienting the layout of a traditional screen, so that the “sign up” button is where the “opt out” option is usually kept.

One example of a dark pattern in UX is designing an ad meant to be seen on a phone that looks like it has a smudge in the center of it. When smartphone users go to wipe away the “smudge” on their screens, they are taken immediately to the website of the advertiser instead.

Laws Against Dark Patterns

There are several laws against dark patterns. The Deceptive Experiences To Online Users Reduction (DETOUR) Act was proposed as recently as July 2023. Currently, the Bureau of Consumer Protection has several active policies aimed at preventing consumers from being stuck with subscription services that impose unnecessarily difficult opt-out procedures or misleading offers. Other advertising laws, such as the Consumer Financial Protection Act (CFPA), Section 5 of the FTC Act, the Restore Online Shoppers’ Confidence Act (ROSCA), the Electronic Fund Transfer Act (EFTA), and the Telemarketing Sales Rule (TSR), protect users against unfair, abusive, or deceptive marketing.

The Consumer Financial Protection Act (CFPA) is especially concerned with the use of “negative options,” or marketing that treats a consumer’s failure to act as consent. For instance, if a consumer does not submit a cancellation request, they can continue to be charged for a company’s services under a “negative option.” While negative options, such as offering a free trial period with a paid subscription service behind it, are not inherently illegal, the use of dark patterns can make it too difficult for users to reasonably cancel a service, miring them in a cycle of recurring payments and arcane refund processes.

The use of dark patterns is also explicitly outlawed in the state of California under the California Privacy Rights Act (CPRA). This state law plainly states that “agreement obtained through use of dark patterns does not constitute consent.”

Who Enforces Dark Pattern Laws?

The Federal Trade Commission (FTC) enforces advertising laws in cases of dark patterns. Violations of consumer protection laws are investigated when the Commission has “reason to believe” that a violation of the FTC Act has occurred. By imposing civil penalties such as fines against businesses that violate dark pattern laws, the FTC can hold companies accountable and disincentivize future fraud. The amount assessed against a company is calculated per violation, and the per-violation rates are adjusted for inflation.

FTC Reports Recent Increase in Dark Patterns

The FTC is determined to shed new light on dark patterns by bringing enforcement actions against companies that take advantage of consumers through advertising and UX design. “Our report shows how more and more companies are using digital dark patterns to trick people into buying products and giving away their personal information,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection.

The recent release from the federal agency showed that dark patterns are particularly prevalent in areas such as e-commerce, cookie consent banners, children’s apps, and subscription sales. The FTC warns consumers against new and unscrupulous web design meant to make it harder for consumers to spot that they are consenting to the sale of their personal data, making in-app purchases, or paying hidden fees.

Examples of Dark Patterns Included in the FTC’s Report

While dark patterns take a variety of forms, the FTC’s latest release warns against the following most common examples of dark patterns:

Disguised Ads

Advertisements on social media are beginning to resemble genuine content. The FTC warns that this aesthetic choice is deliberate, made as a way to trick consumers into interacting with posts without realizing they are ads. Additionally, many websites that seem to offer unbiased reviews or independent editorial content may actually be compensated for their rankings.

Finally, the FTC has recently taken action against an email marketing campaign that sold work-from-home services to consumers, using misleading subject lines that implied its products were endorsed by celebrities such as Warren Buffett or Suze Orman, or that communications were initiated by trusted sources like CNN or Fox News. The FTC has issued full refunds to affected consumers in the case, totaling more than $284,000.

Difficult-to-Cancel Subscriptions

Recurring payments are subject to increased scrutiny by the FTC as dark patterns make cancellation nearly impossible for consumers. In one case against ABCmouse, consumers who decided to cancel their subscriptions to the online learning site were forced to click through several pages of promotions and links, all of which directed them away from the cancellation page. This was after promising “Easy Cancellation” to consumers. This kind of purposefully misleading design structure is called a “roach motel,” meaning that it is easy to get into and difficult to get out of.

Buried Terms

Junk fees, product limitations, and other important terms of service need to be visible and accessible to consumers. When dark patterns are deployed to conceal necessary information or additional charges, action can be taken by the FTC in order to refund consumers and hold the company accountable.

In one recent case against LendingClub, the FTC returned more than $10 million to consumers after they were charged surprise fees for their loans. After promising no fees for its services, LendingClub hid additional information about fees behind tooltip buttons on its website, and in between larger, more prominent text. Other websites may sneak additional products into online shopping carts, or lock consumers into contracts by suppressing information until after purchase.

Tricks to Obtain Data

Data privacy has become a hot button issue in recent years, as access to consumer data is placed at a premium by advertisers and companies. Meanwhile, more data is aggregated online than ever before, creating swaths of personal information that can be used to market products more aggressively to users. Dark patterns may seem as if they give users a choice about their data sharing options, but actually steer them away from personal privacy settings. Some dark patterns may present as pre-checked boxes, or opt consumers in to personal data collection on pages that are buried deep within a settings menu.

Where Do I Report Dark Patterns?

If you believe a dark pattern is in violation of Consumer Protection Law, you can report the violation to the Attorney General of your state, or to the Federal Trade Commission tip line. You can also reach out to a consumer protection lawyer if you suspect a company is using dark patterns in their marketing and advertising. 

Dark Patterns Class Action Lawsuits

A class action lawsuit offers multiple people who have been harmed by misleading or deceptive practices the opportunity to file a shared complaint in court. Each plaintiff, or wronged party, shares in the outcome, which can include refunds as well as awarded damages.

Class action lawsuits are a practical and powerful method for individuals to stand up to large corporations. While one person who has been wrongfully charged fees by an app or had their data sold by a company might not have the incentive to file a lawsuit, many individuals standing together can create a compelling case and hold a corporation accountable.

Amazon Prime Class Action Lawsuit

Recently, a number of Amazon consumers have filed a class action lawsuit alleging that the e-commerce giant purposefully hid Prime cancellation terms behind a maze of dark patterns. The lawsuit alleges that a company design effort known as “Project Iliad” was largely responsible for reducing cancellations by 14%, as “fewer members managed to reach the final cancellation page.” In order to cancel Amazon Prime memberships, consumers had to click through at least three separate pages, each time confirming their intention, and facing multiple “confirm-shaming” tactics meant to scare them away from following through. The cancellation buttons were renamed on each page from “End My Membership” to “Cancel My Benefits,” with multiple options along the way to “Remind Me Later” and postpone the decision. By the end of the third page, users remained subscribed to Amazon Prime, despite expressing their intention to quit multiple times previously, unless they found one final “End Now” button.

This kind of purposeful complexity is an example of dark patterns design. Additionally, each individual Amazon Prime user likely experienced some degree of confusion, frustration, and lost money from the way that the company acted, but the situation remained unaddressed until a class action lawsuit was filed. Collective action can hold companies accountable for deceptive practices that the individual consumer might be forced to let slide.

Dark Patterns Settlements

Recent dark patterns settlements have helped consumers recover money and send a message to corporations that this kind of behavior is unacceptable. Many dark patterns prey upon children and teens by encouraging them to make in-app purchases.

Epic Games Settlement

In one $245 million settlement, the FTC alleged that Epic Games, the owner of the popular game Fortnite, used a series of digital design tricks to allow underage users to charge virtual in-app purchases like costumes, dance moves, and piñatas to their parents’ stored credit card information. When parents complained, Epic locked their accounts and removed access to the already-purchased content.

Even adult users could easily be misled by the game’s design. One of the dark patterns identified in the lawsuit was the placement of the preview button for in-game merchandise directly next to the purchase option. If players tapped too close to the purchase choice, something all too easy to do on a small smartphone screen, the amount was automatically deducted from the player’s account with no separate confirmation window. In other instances, the button mappings for the preview and purchase options were swapped when the game was played with a PlayStation controller, misleading consumers into making purchases without realizing it.

The $245 million enforcement action will be used to repay Fortnite gamers, and is the largest penalty to date for violating FTC rules. The previous record was held by a $170 million settlement in 2019 against Google LLC and its subsidiary YouTube, for violating the Children’s Online Privacy Protection Act Rule (COPPA).

Publishers Clearing House Settlement

Another dark patterns settlement was announced in June of 2023 against Publishers Clearing House. The direct marketing company was accused of suggesting to consumers that making a purchase would increase their chance of winning its magazine subscription sweepstakes, or that a purchase was necessary to enter at all. The company also was accused of charging hidden fees, misleading customers about how their data was being used, and sending marketing emails with subject lines such as “High Priority Doc. W-34 Issued” designed to mimic time-sensitive tax documents and increase engagement. The FTC noted that many of the people taken advantage of by these scams were older Americans. The $18.5 million settlement will go to repay those who suffered losses due to the company’s deceitful actions.

Dark Patterns FAQs

Dark patterns are everywhere in digital design and e-commerce. The following are some frequently asked questions about these practices, and how to hold companies accountable for using them.

How many types of dark patterns are there?

The first UX designer to use the phrase “dark patterns,” Harry Brignull, identified 12 commonly used techniques that fall under this classification. However, new dark patterns are limited only by designers’ imaginations. As former Facebook data lead Jeff Hammerbacher once said, “the best minds of my generation are thinking about how to make people click ads, and that really sucks.”

What are the most common types of dark patterns?

The following are the 12 most common types of dark patterns:

  • Friend spam: The networking platform LinkedIn settled a class action lawsuit for $13 million over this dark design concept. The social media service, like many others, requested access to users’ contact lists or email accounts during sign-up. It then spammed those contacts with messages designed to look as if they came from the friend personally. This kind of friend spam is a misleading design that serves not the user’s interests, but an attempt to mine users’ data for additional profit.
  • Disguised ads: Disguised ads are frequently designed to look like posts on social media, or like regular content on a website, in order to lure users into engaging with them. HubSpot shares that up to 34% of users report clicking on ads by mistake because they looked like part of the webpage they were attempting to use.
  • Forced continuity: Creating unnecessarily difficult barriers to “opt out” of services is an example of forced continuity. Examples may be scores of confirmation boxes or hiding “cancel” buttons with misleading colors, text, or purposeful glitches.
  • Confirmshaming: Many websites use confirmshaming to make rejecting their services seem unreasonable. Confirmshaming is a design technique to prey on human desire to not let others down, or to make them reconsider their actions. For instance, say you are presented with the option to sign up for an email list. The options you are offered are “yes” or “no, I don’t like interesting things.” This is an example of confirmshaming in web design, making it seem as if only one option is reasonable.
  • Hidden costs: Ticketmaster has recently come under fire for its additional hidden fees, also known as drip pricing, which are only revealed on the checkout page and not when users browse the initial cost of tickets. However, Ticketmaster is far from the only platform to hide additional costs behind layers of dark patterns in the hope that consumers either won’t check the final total or will decide to make the purchase anyway, having come so far. Airlines, e-tailers, delivery platforms, and more also use this dark pattern to drive consumer spending.
  • Bait and switch: Designers can take advantage of built-in expectations to manipulate users. For instance, one infamous Microsoft update programmed the “x” button, usually used to cancel or opt out of a web page, as a “confirm” button when installing their latest software. This misled users who thought they were closing the page into confirming the download.
  • Privacy Zuckering: Named for Facebook Founder Mark Zuckerberg, this dark pattern allows companies to harvest unexpected amounts of data from their users, which can then be resold or used to market products to them.
  • Roach motel: When it’s easy to sign up for a service, but nearly impossible to get out, this design strategy is known as a roach motel.
  • Misdirection: Text or visuals may be designed to cast doubt on consumer freedom, or disguise options that are more favorable to the user. For instance, “cancel” buttons may be replaced by “pause my membership,” or “remind me later” options.
  • Trick questions: WinRed, Donald Trump’s campaign company, was accused of utilizing dark patterns in 2020 by including a pre-checked box for recurring donations that visitors had to manually uncheck after reading the fine print. When users didn’t spot the so-called “money bomb” hidden between lines of bolded text and capital letters, they found themselves unknowingly funneling thousands of dollars to the political campaign monthly. The Trump campaign eventually refunded at least $122 million in unintended recurring payments from supporters, due to the deceptive design of the campaign.
  • Triggering FOMO: When a company lists “only a few left” or adds countdown clocks to their webpages to try to convince you to buy now, this is an example of dark patterns preying upon the shared fear of missing out.
  • Triggering fear: A company may attempt to convince users that rejecting their services will lead to negative outcomes. For instance, Facebook was accused of adding text that told users disabling the “facial recognition” feature could lead to other users impersonating them to steal their information.

Why are dark patterns illegal?

Dark patterns are an example of deceptive consumer advertising. They promise services that are not actually delivered, and remove consumer choice. Subscription traps, drip pricing, outright scams, and false advertising all fall under the purview of “unfair and deceptive practices,” as laid out in Section 5 of the FTC Act. Other laws, such as the Restore Online Shoppers’ Confidence Act (ROSCA), have been passed specifically to address developments in e-commerce scams and protect consumer financial information post-purchase.

 
