September 25, 2021

Volume XI, Number 268


Some Interesting CDA Section 230 Developments: A Novel FCRA Victory, a Negligent Design Exception and a Startling New State Law

In the past month, there have been some notable developments surrounding Section 230 of the Communications Decency Act (“CDA” or “Section 230”) beyond the ongoing debate in Congress over the potential for legislative reform. These include a novel application of the CDA in an FCRA online privacy case (Henderson v. The Source for Public Data, No. 20-294 (E.D. Va. May 19, 2021)), the denial of CDA immunity in a case involving an alleged design defect in a social media app (Lemmon v. Snap Inc., No. 20-55295 (9th Cir. May 4, 2021)), and the uncertainties surrounding a new Florida law that attempts to regulate content moderation decisions and user policies of large online platforms.

Florida’s SB 7072

On May 24, 2021, the Florida governor signed a bill (SB 7072) that, among other things: prohibits large “social media platforms” (i.e., those with over 100 million global monthly users) from willfully deplatforming Florida political candidates, bars any action to shadowban or deplatform a “journalistic enterprise” based on the content of its publications, and prohibits deplatforming or limiting access to a user’s posting without first giving notice and offering a rationale for the ban. The law also requires large social media platforms to give Florida users notice of changes to site terms and to maintain other transparency standards about their content moderation policies (including transparency into their algorithmic handling of content and a right of users to opt out of certain algorithmic prioritization of content in their feeds). The law also offers a private right of action to users over certain unfair moderation decisions and procedures (and similar enforcement or administrative authority to the state).

Needless to say, the law has already been challenged in court, as many of its provisions may violate the First Amendment by mandating what a private online operator may publish or withdraw from its site. Courts have routinely ruled that online platforms are not transformed into state actors subject to First Amendment constraints solely because they provide a forum for speech. Similarly, many of the provisions that potentially make a platform liable for certain user content moderation decisions or policies, at first glance, seemingly run contrary to and may be preempted by the CDA, which grants broad immunity to internet service providers for all claims stemming from their publication of information created by third parties and also provides immunity for good faith content filtering decisions related to objectionable content. 47 U.S.C. §230(c)(1), (2)(a). Thus, under Section 230, any activity that can be reduced to deciding whether to publish or exclude material that third parties seek to post online is protected by the CDA. Moreover, regarding preemption, the CDA also expressly provides that “no cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. §230(e)(3).

With a legal challenge pending, it remains to be seen whether SB 7072 will come into force on its effective date of July 1, 2021, or be enjoined, in whole or in part, by a court. Regardless of the outcome, some parts of the new Florida law mirror provisions contained in CDA reform bills introduced in Congress. As we’ve discussed in a prior post, legislators appear to want greater transparency in moderation decisions by online platforms and more “due process”-like user rights regarding content removal and account termination decisions.

Recent CDA Decisions

Lemmon v. Snap Inc., No. 20-55295 (9th Cir. May 4, 2021)

The Snap case was brought by the surviving parents of two boys (“Plaintiffs”) who were involved in a tragic high-speed car accident. Snapchat is a mobile app that allows users to take ephemeral photos and videos, also known as “snaps,” and share them with friends. A Snapchat filter is essentially an overlay that can be superimposed over a photo or video taken on Snapchat, and might include geotags, the time, something fanciful, or, in this instance, the real-life speed of the user. The complaint alleged that Snap, Inc. (“Snap”) incentivized young drivers like their sons to drive at dangerous speeds when using its negligently designed Speed Filter within the Snapchat app. No claims were made over the photos or videos produced by the deceased boys using the app. Apparently, shortly before the crash, one of the boys in the car opened Snapchat to document how fast they were going.

The plaintiffs’ suit alleged that Snapchat encouraged dangerous speeding because it knew or should have known that many teenage and young adult users of the app wanted to use Snapchat to capture a mobile photo or video showing them exceeding 100 MPH and then share that snap with their friends. The plaintiffs also documented other similar fatal accidents purportedly linked to the Speed Filter and other lawsuits advancing similar claims.

In examining the plaintiffs’ amended complaint, the district court granted Snap’s motion to dismiss and held that the CDA barred the plaintiffs’ negligent design claim because it sought to treat Snap as the publisher or speaker of third party content. (Lemmon v. Snap, Inc., No. 19-4504 (C.D. Cal. Feb. 25, 2020)).

On appeal, the Ninth Circuit reversed, holding that because the plaintiffs’ claim neither treats Snap as a “publisher or speaker” nor relies on “information provided by another information content provider,” Snap is not entitled to CDA immunity on the plaintiffs’ distinct negligent design claim. (Lemmon v. Snap Inc., No. 20-55295 (9th Cir. May 4, 2021)). The panel found that the plaintiffs’ negligent design lawsuit did not treat Snap as a publisher or speaker because the plaintiffs’ claim turned on the design of Snapchat and Snap’s role as a products manufacturer (i.e., a claim that Snap negligently designed a product (Snapchat) with a defect (the interplay between Snapchat’s in-app reward system and its Speed Filter)). According to the appeals court, the duty to design a reasonably safe product was fully independent of Snap’s role in moderating or publishing third-party content.

“Snap ‘acted as the ‘publisher or speaker’ of user content by’ transmitting [one of the boy’s] snap, ‘and that action could be described as a ‘but-for’ cause of [the boys’] injuries.’ This is unsurprising: Snap ‘is an internet publishing business. Without publishing user content, it would not exist.’ But though publishing content is ‘a but-for cause of just about everything’ Snap is involved in, that does not mean that the Parents’ claim, specifically, seeks to hold Snap responsible in its capacity as a ‘publisher or speaker.’ The duty to design a reasonably safe product is fully independent of Snap’s role in monitoring or publishing third-party content.” [citation omitted]

In limiting its exception to CDA immunity, the Ninth Circuit further clarified that the plaintiffs do not fault Snap for publishing that photo message before the accident (as that is, according to the court, simply evidence that Snapchat allegedly had a causal effect on the accident). Similarly, the court noted that plaintiffs would not be able to hold Snap liable for publishing other Snapchat user content showing other high-speed dangerous behavior that may have encouraged the boys in this case, as such claims would treat Snapchat as a publisher of third party content.  Put another way, the court stressed that online operators entitled to CDA protection continue to face the prospect of liability for providing “neutral tools,” so long as litigants’ claims “do not blame them for the content that third parties generate with those tools.”

CDA immunity was also unavailable in this case because the plaintiffs’ negligent design claim did not turn on “information provided by another information content provider.” The panel noted that the plaintiffs’ negligent design claim rested on Snap’s own acts and stands independently of the content that Snapchat’s users create with the Speed Filter.

In sum, the Ninth Circuit rejected Snap’s argument that the plaintiffs’ negligent design claim was merely another ill-fated attempt at a “CDA workaround.” The court distinguished the instant case from other creative pleading attempts to bypass CDA immunity in past disputes that depended on third party content, including the Dyroff case (rejecting an argument that a site’s notification and recommendation functions amounted to the development of content) and the Kimzey case (holding that an interactive computer service that “classifies” user characteristics and displays a “star rating system” aggregating consumer reviews is not thereby transformed into a developer of the underlying user-generated information).

Appeals courts do not often carve out an exception to CDA immunity, making this ruling notable. In creating this seemingly narrow exception, the court was careful in its reasoning to ensure that the plaintiffs’ amended claims did not inherently implicate third party content. Most claims of this type, however, are inextricably tied to the publishing of third party content (and hence subject to CDA immunity) – yet, new technologies and applications may challenge this assumption, and we will keep an eye on how the Lemmon ruling affects future suits.

It should be noted that even though the court denied Snap CDA immunity in this instance, the ruling does not necessarily mean that the plaintiffs will prevail on the merits, since they would still have to prove their negligent design claim, a products liability tort, and sufficiently show causation and a requisite duty. Indeed, this obstacle proved insurmountable in a similar lawsuit involving a fatal car crash involving the Speed Filter, where a Georgia appellate court found that the CDA did not shield Snapchat from the injury claims because the plaintiffs were seeking to hold Snapchat liable for its own conduct, principally for the creation of the Speed Filter and its failure to warn users. (Maynard v. Snapchat Inc., 346 Ga. App. 131 (2018)). The litigation victory was short-lived, however, as in further proceedings the same court dismissed the plaintiffs’ negligence claim, finding that under the facts of the case, Snap did not owe a duty to the plaintiffs to alter its product design to prevent the injuries allegedly caused by the driver while she was using the Speed Filter. (Maynard v. Snapchat Inc., No. A201218 (Ga. App. Oct. 30, 2020) (“Put simply, Georgia law does not impose a general duty to prevent people from committing torts while misusing a manufacturer’s product. […] The Maynards allege that Snapchat’s design contains an inherent incentive to engage in risky behavior, but they only point to the attractiveness of the product itself, not to any specific reward system or status ranking predicated on misusing it while driving or generating higher speeds.”)). One wonders if the California Lemmon litigation will suffer a similar fate on the merits, even if the Ninth Circuit recognized an important, but narrow, exception to CDA immunity.

Henderson v. The Source for Public Data, No. 20-294 (E.D. Va. May 19, 2021)

The Henderson case concerns a novel application of CDA immunity to federal Fair Credit Reporting Act (FCRA) claims. In the case, plaintiffs alleged that The Source for Public Data, L.P. and others (“Defendants”) violated the FCRA by including inaccurate criminal information in so-called background check reports the defendants produced and offered for sale on their website, publicdata.com. Plaintiffs alleged that defendants purchased and otherwise acquired criminal history and related data about individuals from various public sources and compiled the data into reports offered for sale to users, but that the records regarding the plaintiffs were inaccurate (or included criminal histories belonging to other persons with the same name), thereby harming the plaintiffs’ employment and rental housing prospects. As a result, the plaintiffs brought various claims under the FCRA based on the allegedly false and inaccurate information. The defendants raised Section 230 as a defense to the claims. [Note: It is not clear from the opinion whether the defendants’ publicdata.com website, which bills itself as providing “affordable access to public records,” is a covered credit reporting agency under the FCRA, as the court limited its analysis to Section 230 issues.]

In dismissing the complaint, the court first found that FCRA claims are not exempt from the CDA, as the statutory exemptions contained in 47 U.S.C § 230(e), which include federal criminal statutes, IP law and federal communications privacy law, among others, do not list the FCRA. Citing the landmark Fourth Circuit Zeran decision, the court stated that Section 230 creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user. In its CDA analysis, the court held that the defendants were entitled to immunity. The court found that the defendants’ site qualifies as an “interactive computer service” because it does not produce the content of the reports at issue, even if the site paid a third party for the raw content and edited it in the traditional capacity of a publisher or distributor: “Although Plaintiffs allege that Defendants manipulate and sort the content in a background check report, there [is] no explicit allegation that Defendants materially contribute to or create the content themselves.”

Other online public record websites and “people finders” have been involved in privacy-related litigation in the past (and have even been granted CDA immunity over state claims). And while the Henderson decision is not substantively rich in its legal analysis of the CDA and FCRA issues, it will be interesting to see how this ruling plays out, and how other people search sites or more established FCRA-covered entities that use third party or public content might plead the CDA in defense of future FCRA lawsuits. Interestingly, a peek at the pleadings in the well-known Spokeo litigation – which involved allegations that Spokeo, a people search portal, published an inaccurate report about the plaintiff on its website and violated certain procedural requirements under the FCRA – reveals that Spokeo asserted a CDA Section 230 defense in its First Amended Answer in the case. While the parties actively litigated the issue of whether the plaintiff pled an injury-in-fact sufficient for Article III standing (an issue that went all the way to the Supreme Court and back to the Ninth Circuit), the case was settled before trial and without consideration of the CDA issue.

Beyond the courtroom, it’s possible that this ruling will influence the ongoing CDA reform debate, as legislators who already have reservations about the scope of CDA protection may look askance at the Henderson ruling and seek to add the FCRA as a statutory exemption to the CDA in a future reform bill.  We will have to wait and see.

© 2021 Proskauer Rose LLP. National Law Review, Volume XI, Number 161

About this Author

Jeffrey D. Neuburger, Partner

Jeffrey Neuburger is co-head of Proskauer’s Technology, Media & Telecommunications Group, head of the Firm’s Blockchain Group and a member of the Firm’s Privacy & Cybersecurity Group.

Jeff’s practice focuses on technology, media and intellectual property-related transactions, counseling and dispute resolution. That expertise, combined with his professional experience at General Electric and academic experience in computer science, makes him a leader in the field.


212-969-3075