Anonymous ID: c09cea Sept. 26, 2020, 11:19 a.m. No.10799059   >>9091 >>9094

>>10798938

  1. DoJ Press Release

September 23, 2020

The Justice Department Unveils Proposed Section 230 Legislation

https://www.justice.gov/opa/pr/justice-department-unveils-proposed-section-230-legislation

 

  2. Line-by-line proposed changes to the statute

https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996

Anonymous ID: c09cea Sept. 26, 2020, 11:22 a.m. No.10799091

>>10799059

From link

>2. Line-by-line proposed changes to the statute

>https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996

 

Areas Ripe For Section 230 Reform

 

The Department identified four areas ripe for reform:

 

  1. Incentivizing Online Platforms to Address Illicit Content

The first category of potential reforms is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation.

 

a. Bad Samaritan Carve-Out. First, the Department proposes denying Section 230 immunity to truly bad actors. The title of Section 230’s immunity provision—“Protection for ‘Good Samaritan’ Blocking and Screening of Offensive Material”—makes clear that Section 230 immunity is meant to incentivize and protect responsible online platforms. It therefore makes little sense to immunize from civil liability an online platform that purposefully facilitates or solicits third-party content or activity that would violate federal criminal law.

 

b. Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking. Second, the Department proposes exempting from immunity specific categories of claims that address particularly egregious content, including (1) child exploitation and sexual abuse, (2) terrorism, and (3) cyber-stalking. These targeted carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress in causes of action far afield from the original purpose of the statute.

 

c. Case-Specific Carve-Outs for Actual Knowledge or Court Judgments. Third, the Department supports reforms to make clear that Section 230 immunity does not apply in a specific case where a platform had actual knowledge or notice that the third-party content at issue violated federal criminal law, or where the platform was provided with a court judgment that the content is unlawful in any respect.

 

  2. Clarifying Federal Government Enforcement Capabilities to Address Unlawful Content

A second category of reforms would increase the ability of the government to protect citizens from harmful and illicit conduct. These reforms would make clear that the immunity provided by Section 230 does not apply to civil enforcement actions brought by the federal government. Civil enforcement by the federal government is an important complement to criminal prosecution.

 

  3. Promoting Competition

A third reform proposal is to clarify that federal antitrust claims are not covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.

 

  4. Promoting Open Discourse and Greater Transparency

A fourth category of potential reforms is intended to clarify the text and original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users.

 

a. Replace Vague Terminology in (c)(2). First, the Department supports replacing the vague catch-all “otherwise objectionable” language in Section 230(c)(2) with “unlawful” and “promotes terrorism.” This reform would focus the broad blanket immunity for content moderation decisions on the core objective of Section 230—to reduce online content harmful to children—while limiting a platform’s ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it “objectionable.”

 

b. Provide Definition of Good Faith. Second, the Department proposes adding a statutory definition of “good faith,” which would limit immunity for content moderation decisions to those done in accordance with plain and particular terms of service and accompanied by a reasonable explanation, unless such notice would impede law enforcement or risk imminent harm to others. Clarifying the meaning of “good faith” should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.

 

c. Explicitly Overrule Stratton Oakmont to Avoid Moderator’s Dilemma. Third, the Department proposes clarifying that a platform’s removal of content pursuant to Section 230(c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker for all other content on its service.