Today, Public Knowledge and the Roosevelt Institute launched “The Case for the Digital Platform Act,” a new e-book by Public Knowledge Senior Vice President Harold Feld, with a foreword by former Federal Communications Commission Chairman Tom Wheeler. The e-book serves as a guide for addressing the challenges posed by the power of digital platforms.
Last month, Facebook announced a draft charter for a future Oversight Board for Content Decisions. Once implemented, the board, composed of independent experts, would serve as the final reviewer of important and disputed content moderation cases on the platform.
Today, YouTube announced that it would begin reducing recommendations of “borderline content” -- material that stops short of violating the company’s community guidelines but may still be harmful -- as well as content that could misinform users. According to YouTube, this includes “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
The promotion of diverse viewpoints has been a cornerstone of United States media policy for the last 100 years. In November 2018, Facebook CEO Mark Zuckerberg published an article describing the algorithm Facebook will use to disincentivize hate speech. Although Zuckerberg’s proposal is a laudable step for content moderation, it may neglect the value of exposing people to diverse views and competing sources of news. As we debate moderation issues, platforms should consider not only the prohibition of hate speech but also the affirmative exposure of users to broader ideas and perspectives. The Federal Communications Commission’s implementation of the diversity principle on radio and TV, explored below, offers valuable lessons here.