Press Release

Justice Department, Sen. Hawley Proposals Could Limit Efforts to Fight Online Harms

June 17, 2020

Today, reports surfaced that the Department of Justice and Senator Josh Hawley (R-MO) have separate proposals to amend Section 230 of the Communications Decency Act, a law that shields digital platforms from liability for user content they carry, while also allowing platforms to take down content they find objectionable. While we await specific legislative language, some reported details are troubling.

The following can be attributed to John Bergmayer, Legal Director at Public Knowledge:

“While there is room to debate how platforms can better address illegal and harmful content, some details of these proposals seem intended to directly regulate the content moderation choices that platforms make in ways that are flatly unconstitutional.

“Platforms are, and should be, free to make editorial decisions about what content to allow on their services, and to apply their subjective judgment as to what content is ‘objectionable,’ and what content is not. 

“The government of course has a role in ensuring that unlawful content is taken down, and in limiting the harms caused by content that platforms distribute. But this cannot be a cover for overriding a platform’s editorial choices, however much particular policymakers might disagree with them.

“We welcome proposals designed to address legitimate online harms and to give users more rights of redress, including the right of users to challenge takedowns they believe are mistaken. Measures that would increase transparency or, in some areas such as public health information, promote consistency across platforms, may also be valuable. But these efforts cannot be overshadowed by attempts to override the editorial decisions that platforms must make every day, which could subject platforms to lawsuits for taking down harmful, abusive, and misleading content.

“Speech regulations of this kind are, if anything, more likely to lead platforms to take a ‘hands off’ approach to content moderation, stymieing efforts to get platforms to do more to combat online misinformation, fraud, criminality, and abuse.

“People who find the editorial and content choices of major platforms objectionable should support measures that empower users, such as interoperability and competition rules that allow users, not the government, to decide what kind of platform they want to use.”

View our blog post series on Section 230 to learn more about this important law that allows user-generated content sites, like digital platforms, to operate.