Principles to Protect Free Expression on the Internet

    Section 230 of the Communications Act has been dubbed the “twenty-six words” that created the interactive free expression of the internet:

    No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. 

    Yet in the past year or so, we have read and analyzed (and written ourselves) words and words and WORDS about Section 230 of the Communications Act of 1934, 47 U.S.C. § 230 (which, yes, is more than just those 26 words). This simple piece of legislation provides immunity from liability as a speaker or publisher for providers and users of an “interactive computer service” who host and moderate information provided by third-party users. It applies to major platforms like Twitter and YouTube, newspapers with comment sections, business review sites like Yelp, and every other online service or website that accepts material from users.

    It’s important to note, given the widespread misunderstandings and misrepresentations of the law, that whether or not a platform or website is acting as a publisher — that is, taking an active role in moderating and selecting what material users see, including taking down user posts — has no bearing on whether Section 230 applies. If you want, you can go ahead and say that Facebook is the “publisher” of everything found on Facebook — the law merely says that it can’t be sued over this material for things like defamation. A newspaper is potentially liable for everything it publishes, as is a broadcaster for everything it airs. But some sites accept so much user material that vetting all of it beforehand to the degree necessary to eliminate liability is likely impossible. As of 2019, for instance, it was reported that 500 hours of video are uploaded to YouTube…per minute. By contrast, Fox News, currently being sued for defamation by voting machine company Smartmatic, merely airs 24 hours of content per day.

    Section 230 also shields platforms and websites from lawsuits over their good faith moderation practices — even for claims that do not seek to hold a platform liable as a “publisher” or “speaker.” This law has taken on an outsize role in the public dialogue about the role of digital technology and Big Tech, as much for what it does as for what people think (or merely claim) it does.

    There were, at last count, 23 recent legislative proposals to reform Section 230, many of them fundamentally at cross purposes. Republicans claim censorship of and bias against conservatives and want less content moderation by platforms; Democrats seek protection for the voices of marginalized communities and from the harms of disinformation, and want more content moderation. The vast majority of these proposals, in our view, have unacceptable unintended consequences, whether for the innovation and benefits of the internet as we know it or for our ability to freely express ourselves online. We are particularly passionate about the latter.

    Content moderation is not neutral. By definition, moderating content requires you to choose what is allowed and what is not allowed. As with mainstream broadcast and legacy media, social media platform companies decide what is allowed to be posted on their sites and users decide what content they do or don’t consume on these platforms. By and large, the government does not decide who can or cannot speak on those platforms, in accordance with the First Amendment. The difference with social media, as compared to legacy and broadcast media, is that social media is interactive and boasts limitless “channels.” This structure promotes greater opportunity for free expression by all voices, including the most marginalized ones, and encourages a variety of platform options in the marketplace, enabling a user to choose what platform to interact on.

    As some online platforms become dominant through various means and the cost of being excluded from a dominant platform becomes high for a user, the stakes for content moderation are raised. It is fair to demand due process, transparency, and consistency of treatment by a platform in its content moderation practices. At the same time, free expression is harmed, not helped, by proposals that seek to limit the content moderation choices of major platforms, because enabling free speech, at times, requires fostering an environment where all voices can be heard, and where hateful, abusive, misleading, and other harmful speech does not drive users away. Ultimately, it is the internet as a whole, not any single private platform, that must provide a “forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.”

    One solution to the problem of content moderation must therefore lie in competition policy, and in creating the opportunity for diverse platforms to exist, with different policies catering to different audiences. Other solutions can involve user empowerment, not just to switch platforms, but in giving users more control over the content they are exposed to. Regulation and scrutiny of dominant platforms remain essential, of course, and unfortunately we do not yet have a sector-specific regulator focused on curbing abuses in the digital economy. One example of regulation that could be developed with a digital regulator would be oversight and auditing of the algorithms that drive people to specific content. Regulation of platforms must be consistent with free speech principles, both in allowing platforms editorial discretion and in recognizing that moderation and curation serve the broader goal of free expression.
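
    To make the idea of algorithmic oversight concrete, below is a minimal sketch in Python of one kind of check an auditor might run against a platform’s recommendation logs: comparing how often each category of content is recommended against that category’s baseline share of all content. The field names, the baseline figures, and the 2x flagging threshold are all illustrative assumptions, not a real platform API or an actual regulatory standard.

        from collections import Counter

        def amplification_report(impressions, baseline, threshold=2.0):
            """Flag content categories that a recommendation algorithm
            amplifies beyond `threshold` times their baseline share.

            impressions: recommendation log entries, each a dict with a
                "category" key (a hypothetical log format).
            baseline: each category's share of all available content.
            """
            counts = Counter(item["category"] for item in impressions)
            total = sum(counts.values())
            flagged = {}
            for category, count in counts.items():
                share = count / total
                base = baseline.get(category, 0.0)
                if base > 0 and share / base >= threshold:
                    flagged[category] = round(share / base, 2)
            return flagged

        # Example: political content is 10% of posts but 40% of
        # recommendations, a 4x amplification that the audit flags.
        logs = [{"category": "politics"}] * 40 + [{"category": "sports"}] * 60
        print(amplification_report(logs, {"politics": 0.10, "sports": 0.90}))
        # {'politics': 4.0}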

    Public Knowledge hopes to advance the dialogue by introducing a set of principles — guardrails, if you will — for lawmakers and others interested in developing or evaluating proposals to alter Section 230. During the last Congress, we saw bills ranging from really bad ones, which could lead to excessive takedowns of user speech or leave harmful content up, to bills that could create real benefits by injecting greater transparency and supporting marginalized voices through clearer processes. Some of the problems that 230 reform proposals seek to address are really competition problems, privacy problems, or other issues better addressed elsewhere. These principles are not intended to suggest that policymakers view all of tech and media policy through a Section 230 lens. Rather, as these 230 proposals are likely to keep coming, the principles illustrate the values we apply when evaluating them.

    These principles should be considered in the context of Public Knowledge’s overall consumer-focused technology policy agenda for internet platform accountability. We develop and advocate for a range of policies that promote competition and protect consumers, including antitrust enforcement, more assertive competition policy, national privacy regulation, greater consumer choice, and approaches to mitigate the harms of disinformation.

    Section 230 Principles

    1. Clear Due Process and Transparency: Users should have a clear idea about what content is or is not allowed on the platform, why their content was taken down, and how to avail themselves of a transparent and equitable appeals process with the platform.
    2. Protecting the Voices of Marginalized Communities: Members of marginalized communities are often subjected to harassment online, which in many cases means these voices are less likely to engage in the kind of speech that Section 230 was meant to protect in the first place. Any Section 230 reform must consider the effect it could have on these voices.
    3. Streamlined Content Moderation Process: Content moderation processes should be clear and concise, and should not impose an overly legalistic process on moderation decisions.
    4. One Size Does Not Fit All: Outright repeal of Section 230 would undermine the very thing we need most to challenge the dominance of the largest platforms — new market entrants. Policymakers can encourage market entry and promote platform competition by applying Section 230 reforms only to larger platforms or by providing some accommodation for smaller ones.
    5. Section 230 and Business Activity: Section 230 does not protect business activities from sensible business regulation, including business activities that stem in some way from user-generated content. Most courts have already reached this conclusion, but it is an area to watch that may require legislative clarification.
    6. Pay to Play: Section 230 was designed to protect user speech, not advertising-based business models. Platforms do not need to be shielded by Section 230 for content they have accepted payment to publish.
    7. Conduct, Not Content: Section 230 has allowed platforms to give voice to many different political issues and movements, like Black Lives Matter, the Christian Coalition, the Arab Spring, and #MeToo. Focusing reform on platform conduct rather than on content allows future speech to flourish while ensuring that platforms adhere to certain guidelines.
    8. Promote User Choices: Policymakers can empower users to move to other platform options, or to create new ones, by requiring interoperability of platforms. This would reduce barriers to data flows, promoting user choice online as well as a user’s ability to speak legally on alternative platforms. (A minimal sketch of what portable account data could look like follows this list.)
    9. Establish That Any Section 230 Reforms Meant To Address Alleged Harms Actually Have the Ability To Do So: Some reform proposals seek to revoke Section 230 liability protections for platforms without adequately establishing that doing so addresses the very harm lawmakers are trying to prevent. Lawmakers should address the root of the problem and not merely view every problem as a Section 230 problem.
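
    As a rough illustration of the interoperability idea in principle 8, here is a minimal Python sketch of a neutral export format that could let a user carry their account data to a competing platform. The schema, field names, and example handles are assumptions for illustration only; they are not drawn from any existing interoperability standard or platform API.

        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class Post:
            post_id: str
            created_at: str  # ISO 8601 timestamp
            text: str

        def export_account(handle: str, posts: list, follows: list) -> str:
            """Serialize an account into a portable JSON document that a
            competing platform could parse and import."""
            return json.dumps({
                "schema_version": "1.0",  # lets importers handle format changes
                "handle": handle,
                "posts": [asdict(p) for p in posts],
                "follows": follows,       # handles the user follows
            }, indent=2)

        # Example: export one post and one followed account.
        print(export_account(
            "alice@example.social",
            [Post("1", "2021-02-01T12:00:00Z", "Hello, world")],
            ["bob@example.social"],
        ))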

    As policymakers discuss Section 230, we hope they will commit to these principles to ensure that any proposed reforms do not unnecessarily harm the internet’s ability, demonstrated over the last 25 years, to let more people speak, organize, and be heard. At Public Knowledge, we will measure proposals against these principles to help the public assess whether any of them harm free expression unnecessarily.

    For more of Public Knowledge’s perspective and insights on Section 230, including its history, its importance, and our perspective on past reform efforts, see the Section 230 issue page on our website. To commit to our principles or ask your legislators to do so, visit publicknowledge.org/ProtectSection230.