Viewpoint Diversity Requires Media Policy, Not Editorial Regulation

June 19, 2019
This is the third blog post in a series about Section 230 of the Communications Decency Act. You can view the full series here.
Many conservatives feel that major online platforms discriminate against them. But their proposed policy solutions, which usually involve modifications to Section 230 of the Communications Decency Act, cannot have the effect that they want. However, policies adapted from traditional media policy might help ensure that users of all political viewpoints have the ability to freely communicate online.
First, as to the alleged censorship and discrimination. I have no doubt that conservatives feel it is happening, although I have seen no evidence to suggest that it is. It is of course true that various “comedians” and “provocateurs” have found themselves at odds with platforms, which are increasingly at least attempting to enforce their terms of service and to remove “supremacist” content and the users who post it. I assume this is not the discrimination that prominent conservatives are concerned with, which is more likely just their observation of the arbitrary and inconsistent (and sometimes mistaken) application of policies that affect users of all political persuasions. Even if it were true, the solution prominent conservatives continue to put forward—viewpoint-based tweaks to Section 230 of the Communications Decency Act, or platform liability for user content more broadly—won’t and cannot have the effect they desire. Senator Josh Hawley’s proposal is the latest example. This legislation would condition the applicability of Section 230 on a supermajority of the Federal Trade Commission certifying that major platforms are acting in a politically neutral way. While this particular proposal is flawed, we share Senator Hawley’s concern with the power of major technology platforms and are glad to see this sort of energy from the right. But there are better ways to prevent corporate gatekeepers from controlling political discourse.
First, it’s important to correct the record. Section 230 was never some form of “sweetheart deal” where platforms get a “complete exemption from traditional publisher liability in exchange for providing a forum free of political censorship.” Section 230 had nothing to do with viewpoints at all—its purpose was to allow platforms to moderate content without incurring liability. Anyone who thinks this was a bad policy and that 230 should have been conditioned on neutrality can make that argument, but if anything, one of the most frequent complaints about 230 is that it allows, but does not require, platforms to engage in reasonable moderation.
As it stands, platforms are free to take whatever political stance they want, and this has no effect on liability or the availability of Section 230. If a Trump-supporting forum wants to ban all anti-Trump content, it can, and it’s still protected by Section 230. And if Jack Dorsey wakes up tomorrow and decides to crack down on rose emoji Twitter because he thinks that the Democratic Socialists of America, after they seize power, will make him eat three square meals a day, then Twitter can do that, too. Whatever political viewpoint platforms want to suppress or promote, they can under current law, and this has no effect on liability standards and was never supposed to.
In light of how the role and importance of online platforms has changed, Section 230 probably does need to be modified. See here for more about that. However, to the extent that viewpoint diversity is an issue, tweaks to content liability rules are the wrong area to focus on.
There are constitutional issues with using 230 to address viewpoint issues as well. Public Knowledge has long argued that the government has the power to direct ISPs and other infrastructure companies to act as common carriers, which removes their ability to act as “editors” of what they carry. This does not infringe their First Amendment rights because, when it comes to carrying others’ speech, their rights are limited. Infrastructure companies are conduits. Utilities, even. However, online platforms—search engines, social media networks, and user-generated content sites—are not, and forcing them to act like they are would lead to bad outcomes. Senator Hawley’s proposal is therefore likely unconstitutional at the outset because it is viewpoint-based. The government cannot require that platforms act “neutrally” with respect to political viewpoints, and it cannot enact different standards of liability for platforms that are perceived to be unfair to any political viewpoints. It is quite clear that even Trump-appointed justices on the Supreme Court would rule this way.
But, while that is a rather major issue to set to one side, let’s do it anyway. Let’s assume that Senator Hawley’s current proposal were enacted, and withstood legal challenge. It simply would not have the effect that he seems to want. Platforms that were under some sort of “neutrality” obligation would become either unmoderated cesspits or boring Pleasantvilles. In the first case, users who feel like they are being discriminated against would have won a Pyrrhic victory, since their content would be available, on sites that no one wants to use. Nearly any content has some sort of political valence, or could be said to, and a site with a “neutrality” obligation would likely be filled with porn, scams, and abuse in short order, making it difficult for people, regardless of their politics, to find the content they really want. Having content merely available on the internet somewhere cannot be the goal, because that’s possible now, for all kinds of content that Facebook or YouTube would remove in a second. The goal seems to be to make sure that content is available on sites with a lot of users. But a neutrality obligation that results in more of an online free-for-all won’t accomplish that.
Under the Pleasantville scenario, platforms would engage in even more moderation in order to comply with the law, but in a more heavy-handed way, banning political discussion or uncomfortable topics entirely. They would be “neutral,” because they would lack debate on contentious topics entirely. They might even simply block user uploads, becoming traditional media sites instead, and shutting off important venues for ordinary users. In either case users who feel put upon by current platforms still won’t have the platform they want. Lots of other people won’t either.
One ironic aspect of this debate is that it is the left which has traditionally had concerns about the suppression of viewpoints by major corporate actors. And it is certainly true that ideas such as the fairness doctrine have had support among liberals, despite conservatives using such rules to their own advantage as well. But the primary approach of the left hasn’t been to somehow require by law that NBC News, the Wall Street Journal, or Sinclair Broadcasting give airtime or column inches to dissenting views, or to put the New York Times editorial board under public administration. Instead the idea is to diffuse the power of corporate media to begin with. Dissenting viewpoints should not have to beg for crumbs from the master’s table. They should have their own well-stocked pantries, kitchens, and dinettes.
If the content moderation decisions of a major platform, or the editorial choices of a major media company, become issues of broad concern, the response of public policy should not be to try to regulate those choices. It should be to make those choices matter less. No one, and no handful of private companies should be arbiters of public discourse.
This means, to start with, we need greater antitrust scrutiny of major platforms and media companies. However, antitrust is fundamentally an economic doctrine. And media policy is not just about economics. The world’s most efficient and competitive media and online marketplace may still shut out minority and dissenting voices. Or, if you prefer, “woke capitalism,” in its pursuit of profits, will leave traditional conservatives behind. In either case, the “marketplace of ideas” (terrible metaphor) is not always well-served by the actual marketplace.
Instead, platforms need their own form of media policy. (Regular media need a good dose of media policy again, as well.) Traditional media policy is not about directly dictating the editorial lines of media companies (though there are some non-editorial content-related guidelines, such as requirements for public, educational, and governmental (PEG) access to cable networks, or surrounding children’s programming). It’s about ensuring that a sufficient diversity of media outlets exists, such that a diversity of viewpoints, ideally, is a natural consequence. This means restrictions on how big any one media outlet can get through mergers or geographic expansion, even if this comes at the cost of economic “efficiency.” This means ensuring that minorities, women, religious organizations, and community-based broadcasters can access the airwaves, even if otherwise they’d be outbid by the Comcasts, AT&Ts, and Sinclairs of the world.
The question conservatives should be asking is not how to regulate the editorial choices of private corporations to their liking. Instead, they should be wondering how to restructure the marketplace to eliminate chokepoints and gatekeepers. How to apply media policy online is an open question—online services are not geographically limited the way that broadcasters, cable companies, and newspapers traditionally are, so it’s harder to use geographic restrictions to constrain their scale and reach. That said, there is an emerging consensus on a broad set of ideas.
First, limit mergers. Among other things, major online platforms should not be permitted to buy out rivals, or potential rivals. Many of these mergers can be blocked under traditional antitrust law, rightly applied. But if they can’t, we need to improve antitrust law through legislation.
Second, mandate data portability. It should be as easy as possible for users and content creators to switch from one platform to another, taking everything—both data they have submitted to platforms, and data that platforms have created—with them.
Third, mandate interoperability and standardized communication protocols where appropriate. Having a diversity of networks and platforms doesn’t mean that users should have to pick the one that all their friends and family use, or that they should have to maintain dozens of accounts on different services. We have a single, global telephone network, where customers of one company, in one country, can call customers of another company in another country, all because of publicly- and privately-managed sets of technical standards. People with Outlook email accounts can email people with Gmail. I use FastMail with my own domains and never have trouble emailing anyone else. Something along these lines is necessary for some kinds of online services. The implementation details matter of course, but a general goal could be a federated network of independent services, not a single centralized service under centralized control.
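The email analogy can be made concrete. In a federated design, the only things two independently operated services need to share are an addressing convention and a common protocol; everything else stays under each operator’s control. Here is a minimal, hypothetical sketch in Python (the server names and the discovery table are invented for illustration; real systems use DNS or similar lookup protocols for this step):

```python
# Sketch of email-style federated addressing: any independent server can
# route a message to a user on any other server just by parsing a
# "user@domain" handle. No single company owns the network.

def parse_handle(handle: str) -> tuple[str, str]:
    """Split a federated handle into (user, home_domain)."""
    user, _, domain = handle.partition("@")
    if not user or not domain:
        raise ValueError(f"not a federated handle: {handle!r}")
    return user, domain

# Stand-in for DNS / server discovery: each domain maps to an
# independently operated service (hypothetical names).
SERVERS = {
    "example.social": "independent server A",
    "example.net": "independent server B",
}

def route(handle: str) -> str:
    """Decide which independent server should receive a message."""
    user, domain = parse_handle(handle)
    server = SERVERS.get(domain)
    if server is None:
        raise LookupError(f"no known server for {domain}")
    return f"deliver to {user!r} via {server}"
```

The design point is that the routing logic never needs to know who runs each server, only how to find it—which is what lets a FastMail user email a Gmail user, and what a federated social network would need as well.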
Fourth, subject dominant platforms to heightened regulatory obligations and sector-specific regulation, as Harold Feld recently argued in his book, The Case for the Digital Platform Act. Some kinds of rules should apply to platforms whatever their size—such as privacy protections. But it also makes sense to require more of platforms that have achieved a certain size and scale. Such requirements can include due process, where we require that platforms set out their policies and follow them, and give users the chance to challenge mistakes and oversights. They can include transparency, where platforms are required to make public the decisions they make, and why they make them. (These requirements don’t limit platforms from making editorial choices; they simply require that platforms be open about what they are.) They may also include economic nondiscrimination rules, where dominant platforms are not permitted to favor their own services over those of competitors, to an extent beyond what antitrust would already prevent.
There are more ideas out there, and this post is long enough. So let me conclude by pointing out that it is refreshing in a sense to have conservatives in Congress who care about the problems that corporate power, big tech, and media concentration can pose. But the solution should not be to simply pass a law demanding, “be more fair to us.” Instead, they should challenge the assumptions that allow those concentrations of power to arise in the first place. (Also, don’t violate the Constitution.)
Image credit: Flickr user Mark Morgan Trinidad B.
About John Bergmayer
John Bergmayer is Legal Director at Public Knowledge, specializing in telecommunications, media, internet, and intellectual property issues. He advocates for the public interest before courts and policymakers, and works to make sure that all stakeholders — including ordinary citizens, artists, and technological innovators — have a say in shaping emerging digital policies.