Facebook announced yesterday that it expects to pay a fine of up to $5 billion over accusations that the company violated its 2011 consent decree with the Federal Trade Commission on consumer privacy on the social media platform. The company also said that there can be no assurances as to the terms on which the investigation will be resolved.
Facebook CEO Mark Zuckerberg recently published an op-ed in the Washington Post outlining a role for government and regulation in four specific policy areas that continue to concern users of Facebook and of digital platforms more broadly. In two areas (privacy and political advertising), Zuckerberg reiterates Facebook’s agreement with previous legislative proposals, including parts of the European Union’s General Data Protection Regulation (GDPR) and (although not named) concepts from the Honest Ads Act introduced by Senators Amy Klobuchar, Mark Warner, and the late John McCain. In addition to these two topics, Zuckerberg also moves toward responding to calls from the public interest community for stronger moderation of hateful content and for meaningful data portability to promote competition in a market that trends toward dominant platforms. While some may view yet another Facebook op-ed cynically, I believe this one should be welcomed.
Last week, thanks to investigative reporting, we learned that Facebook discovered in January that it had been storing millions of users’ passwords in plain text, making them fully readable by thousands of its employees. Facebook has acknowledged that this was a serious security error and privacy breach on its part, as its systems, ideally, “are designed to mask passwords using techniques that make them unreadable,” and promised that it “will be notifying everyone whose passwords we have found were stored in this way.” There is no evidence that any of the thousands of employees with access to these unencrypted passwords actually accessed them, but Facebook’s decision to remain mum for months reveals an important lesson for the overarching privacy and security policy debate. Importantly, data security incidents are a widespread problem that goes well beyond Facebook.
Last month, Facebook announced a draft charter for a future Oversight Board for Content Decisions. When implemented, the board, composed of independent experts, would serve as the reviewer of last resort for important and disputed content moderation cases on the platform.
The promotion of diverse viewpoints has been a cornerstone of United States media policy for the last 100 years. In November 2018, Facebook CEO Mark Zuckerberg published an article describing the algorithm Facebook would use to disincentivize hate speech. Although Zuckerberg’s proposal is a laudable step for content moderation, it may neglect the value of exposing people to diverse views and competing sources of news. As we debate moderation issues, platforms should consider not only the prohibition of hate speech but also affirmative exposure to broader ideas and perspectives. The Federal Communications Commission’s implementation of the diversity principle in radio and TV, explored below, offers valuable lessons here.