October 27, 2020

Volume X, Number 301


DOJ Submits CDA Reform Proposal to Congress to Curtail Protections for Online Platforms

Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“Section 230” or the “CDA”), enacted in 1996, is generally viewed as the most important statute supporting the growth of Internet commerce. The key provision of the CDA, Section 230(c)(1), only 26 words long, simply states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This one sentence has been the source of bedrock service provider immunity for third-party content made available through a provider’s infrastructure, thus enabling the growth of a vigorous online ecosystem. Without such immunity, providers would have to face what the Ninth Circuit once termed “death by ten thousand duck-bites” in fending off claims that they promoted or tacitly assented to objectionable third-party content.

The brevity of this provision of the CDA is deceptive, however. The CDA – and the immunity it conveys – is controversial, and those 26 words have been the subject of hundreds, if not thousands, of lawsuits. Critics of the CDA point to the proliferation of hate speech, revenge porn, defamatory material, disinformation and other objectionable content; in many cases, the sites hosting such third-party content (knowingly or unknowingly) are protected by the broad scope of the CDA. Other objections stem merely from unhappiness with the content of the speech, even though in many cases it is true, such as comments critical of individuals, their businesses or their interests. Litigants upset about such content have sought various CDA workarounds over the past two decades in mostly unsuccessful attempts to bypass the immunity and reach the underlying service providers.

The back-and-forth debate around the scope and effects of the CDA and the broad discretion afforded online providers regarding content hosting and moderation decisions is not new. However, it was brought into new focus when the President, vexed at the way some of his more controversial posts were being treated by certain social media platforms, issued a May 28, 2020 Executive Order for the purpose of curtailing legal protections for online providers. The goal was to remedy what the White House believed was the online platforms’ “deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.”

The Executive Order – which is currently being challenged in court as unconstitutional – directed several federal agencies to undertake certain legislative and regulatory efforts toward CDA reform. Consequently, in June 2020 the DOJ stated “that the time is ripe to realign the scope of Section 230 with the realities of the modern internet” and released a 28-page document with its preliminary recommendations for reform of Section 230. A month later, the Commerce Department submitted a petition requesting that the FCC write rules to limit the scope of CDA immunity and potentially place additional compliance requirements on many providers that host third-party content. Then, on September 23, 2020, the DOJ announced that it had sent its legislative proposal for amending the CDA to Congress. The DOJ, in its cover letter to Congress, summed up the need for reform: “The proposed legislation accordingly seeks to align the scope of Section 230 immunities with the realities of the modern internet while ensuring that the internet remains a place for free and vibrant discussion.”

The Proposal

Below is a brief summary of some of the DOJ’s key proposed changes to CDA Section 230:

Expansion of the Definition of “Information Content Provider”: One of the most meaningful changes to the CDA is buried at the end of the DOJ proposal, just a single, seemingly innocuous sentence added to the “Definitions” section concerning what is an “Information Content Provider” (Proposed Subsection (g)(3)):

“Being responsible in whole or in part for the creation or development of information includes, but is not limited to, instances in which a person or entity solicits, comments upon, funds, or affirmatively and substantively contributes to, modifies, or alters information provided by another person or entity.”

Under the CDA, as it has been interpreted over the years, service providers exercising traditional editorial functions are not “information content providers” and are immune from publisher or distributor liability for content made available by third party information content providers.  In numerous suits, litigants have unsuccessfully sought to portray online service providers as whole or partial “information content providers” of objectionable content (thus stripping them of immunity) for soliciting, amplifying, or encouraging posted content, or for adding commentary to third-party content. This new sentence could cover various editorial actions that are a routine part of online publishing today, dramatically re-labeling service providers as “information content providers” and thereby removing immunity from such providers. This, of course, would be contrary to well-established CDA jurisprudence.

Content moderation decisions: Under the DOJ proposal, decisions to “restrict access to or availability of material” would be excluded from §230(c)(1) publisher immunity. Instead, a service provider would have to satisfy the standards of the so-called “Good Samaritan” provision of the CDA, as that provision is amended by the DOJ proposal. The Good Samaritan provision, as amended by the DOJ proposal, would grant immunity to interactive computer service providers that act in “good faith” (as defined further in the DOJ proposal) to “restrict access to or availability of material that the provider or user has an objectively reasonable belief is obscene, lewd, lascivious, filthy, excessively violent, promoting terrorism or violent extremism, harassing, promoting self-harm, or unlawful [material], whether or not such material is constitutionally protected.”

By shifting immunity for content deletion decisions to the Good Samaritan section, as amended, the proposal places restrictions on editorial discretion to remove content. Moreover, the DOJ proposal’s new definition of “good faith” (Subsection 230(g)(5)), as applied to the removal of content, would include requirements that users be given a factual reason for content removal and a reasonable opportunity to respond. Reminiscent of the “notice and takedown” structure of the DMCA, a process that has not proven to be wholly effective, this provision, if enacted, would add administrative overhead and would be burdensome to smaller providers, and perhaps untenable for larger platforms that use automated tools to remove millions of objectionable posts per year.

Other Provisions

  • Terms of Use as a Barometer: The DOJ proposal would grant providers certain aspects of the CDA safe harbor for content-moderation decisions taken in “good faith” and consistent with the platform’s terms of service.

  • Bad Samaritan Carve-Out: Under the DOJ proposal, CDA publisher immunity would not apply to any criminal prosecution under state law or any state or federal civil action brought against a provider if the provider “acted purposefully with the conscious object to promote, solicit, or facilitate material or activity by another information content provider that the service provider knew or had reason to believe would violate Federal criminal law, if knowingly disseminated or engaged in.” The proposal would also carve out immunity for state and federal civil actions if the provider had knowledge of the content’s illegality and refused to take it down.

  • Judicial Determinations: The proposal would take away immunity from providers who fail to take down content “within a reasonable time after receiving notice of a final judgment” from a U.S. court that such material was defamatory or “unlawful in any respect.”

  • Notice Mechanisms: The proposal would require providers to have an easy-to-use tool for users to report defamatory or other illegal content to the provider (even conditioning CDA immunity on the provider having a tool for reporting content that violates federal criminal law).

  • Enforcement Carve-Outs: The proposal would exempt from immunity specific categories of claims that address egregious content, including child exploitation and sexual abuse, terrorism, and cyber-stalking. It would also provide that immunity does not limit civil actions brought under federal antitrust laws. The reforms would also expressly state that CDA immunity would not apply to civil enforcement actions brought by the federal government.

Given the frenzied state of affairs during this election year, it is highly unlikely this proposal will be taken up in the next few months, yet it could be seriously debated after the election. The drumbeat calling for some regulation of the excesses of the internet has been steady for several years, and the CDA has been the subject of criticism from politicians on both sides of the aisle. Action is particularly likely considering the presence of not only this proposal but all the other CDA reform bills introduced this year in the Senate (e.g., the PACT Act, the EARN IT Act, and most recently, on September 29, 2020, the bipartisan-supported See Something, Say Something Act), as well as the October 1, 2020 decision by the Senate Commerce Committee to subpoena the CEOs of some of the major social media services to testify about issues surrounding CDA immunity. As demonstrated in 2018, when Congress overwhelmingly passed a targeted CDA amendment, FOSTA, a narrow CDA reform measure is always a possibility.

Regardless of which CDA proposal is up for debate, it is important to remember that there are no easy answers when it comes to tinkering with the CDA; without thoughtful debate and a delicate hand, it is difficult to regulate objectionable content online without affecting the vibrant internet, not to mention pass constitutional muster (note: Section 230 was part of the broader Communications Decency Act, whose attempt to regulate indecent material on the web was struck down by the Supreme Court in 1997, although Section 230 itself survived).

The attempt to overhaul the CDA is likely to start the legislative sausage-making process in Congress. Putting aside what the text of any reform bill would look like, the only certain result from any politically-negotiated amendment to the CDA is an entirely new generation of service provider liability litigation.

© 2020 Proskauer Rose LLP. National Law Review, Volume X, Number 275

About this Author

Jeffrey D Neuburger, Proskauer Rose Law Firm, Technology Attorney
Partner

Jeffrey Neuburger is co-head of Proskauer’s Technology, Media & Telecommunications Group, head of the Firm’s Blockchain Group and a member of the Firm’s Privacy & Cybersecurity Group.

Jeff’s practice focuses on technology, media and intellectual property-related transactions, counseling and dispute resolution. That expertise, combined with his professional experience at General Electric and academic experience in computer science, makes him a leader in the field.

As one of the architects of the technology law...

212-969-3075