Items tagged "Content Moderation"

Press Release

Public Knowledge Rejects DOJ Proposal to Amend Section 230

September 23, 2020 Content Moderation, digital platforms, Disinformation, DOJ, misinformation, Section 230

Today, the U.S. Department of Justice issued recommendations to Congress for amending Section 230 of the Communications Decency Act. This law states that platforms are not liable for third-party content that they carry, and can take down objectionable content without fear of lawsuits.

Post

Is #StopHateForProfit the Big Reckoning for Facebook? A Former Marketing Leader Isn’t So Sure

July 9, 2020 Content Moderation

Over the past couple of weeks, a who’s who of over 500 brands, including Verizon, Coca-Cola, Unilever, Levi’s, Starbucks, Reebok, Adidas, and Ford, announced they would withdraw their advertising from Facebook for the month of July or longer. Many were responding to the #StopHateForProfit campaign, organized by a coalition of civil rights and public interest […]

Post

Due Process for Content Moderation Doesn’t Mean “Only Do Things I Agree With”

June 29, 2020 Content Moderation, Legislation, Platform Regulation, Section 230, Section 230 Series

There’s a common theme in many proposals to amend Section 230 of the Communications Decency Act — the idea that companies need to just follow their terms of service consistently and fairly. Of course, I agree. Who doesn’t? As I detailed in a paper in 2018, I believe that dominant platforms should give their users […]

Post

Breaking Down and Taking Down Trump’s Latest Proposed Executive Order Spanking Social Media

June 4, 2020 Content Moderation, Disinformation, FCC, First Amendment, Free Expression, Freedom of Expression, FTC, Legal Analysis, Litigation, misinformation, Net Neutrality, Platform Regulation, Section 230

Bashing social media for supposed liberal bias has become pretty standard fare for some conservative pundits and politicians. This remains true despite zero evidence of any kind of bias by social media companies against conservative content or Republican politicians. In fairness, Democratic political leaders have made similar accusations. For example, Speaker Nancy Pelosi (D-CA) has […]

Post

The Pandemic Proves We Need A “Superfund” to Clean Up Misinformation on the Internet

May 11, 2020 Communications & Pandemic Series, Content Moderation, Disinformation, Internet Superfund, Local Journalism, misinformation, Platform Regulation

This blog post is part of a series on communications policies Public Knowledge recommends in response to the pandemic. You can read more of our proposals here and view the full series here. “When the next pandemic strikes, we’ll be fighting it on two fronts. The first is the one you immediately think about: understanding the disease, researching […]

Press Release

Public Knowledge Stands with Civil Rights Groups by Refusing Facebook Funding

June 9, 2020 civil rights, Content Moderation, Facebook, Public Knowledge, Section 230

Today, Public Knowledge announces that it will not accept funding from Facebook for any of the organization’s programs or initiatives. The decision follows a June 1 meeting between Facebook CEO Mark Zuckerberg and civil rights leaders to discuss the company’s choice to leave up, without moderation, comments made by President Trump, including one in which he posted, “when the looting starts, the shooting starts,” in reference to protests over George Floyd’s death. Twitter, meanwhile, labeled the same content with a disclaimer that it “glorified violence.”

Post

Moderating Race on Platforms

January 29, 2020 Content Moderation, Platform Regulation, Section 230, Section 230 Series

In the early fall of 2019, Ryan Williams was driving out of a garage with his wife and child when he was allegedly called a racial epithet by his white neighbor and the neighbor’s daughter. When Williams got out of his car, the neighbor called the police, and as the police arrived, Williams, like many […]

Post

ISPs Should Not Be Copyright Cops

December 9, 2019 Content Liability, Content Moderation, Copyright, Copyright Reform, DMCA, ISPs, Online Copyright

The internet era ushered in a new way for people around the world to access creative works with the click of a mouse or the tap of a finger. We all know that consumer demand outpaced the business models of entertainment companies, and music, movies, and other copyrighted works were, and still are, often accessed […]

Post

Could the FCC Regulate Social Media Under Section 230? No.

August 14, 2019 Content Liability, Content Moderation, FCC, Platform Regulation, Section 230

Last week, Politico reported that the White House was considering a potential “Executive Order” (EO) to address the ongoing-yet-unproven allegations of pro-liberal, anti-conservative bias by giant Silicon Valley companies such as Facebook, Twitter, and Google. (To the extent that there is rigorous research by AI experts, it shows that social media sites are more likely to flag posts by self-identified African Americans as “hate speech” than identical wording used by whites.) Subsequent reports by CNN and The Verge have provided more detail. Putting the two together, it appears that the Executive Order would require the Federal Communications Commission to create rules limiting the ability of digital platforms to “remove or suppress content,” as well as prohibiting “anticompetitive, unfair or deceptive” practices around content moderation. The EO would also require the Federal Trade Commission to somehow open a docket and take complaints about supposed political bias (something it does not, at present, do, or have the capacity to do, but I will save that hobby horse for another time).

Post

Our Thoughts on Facebook’s Oversight Board for Content Decisions

February 11, 2019 Content Moderation, Facebook, Platform Regulation

Last month, Facebook announced a draft charter for a future Oversight Board for Content Decisions. Once implemented, the Oversight Board, composed of independent experts, would serve as the final reviewer of important and disputed content moderation cases on the platform.
