Items tagged "Content Moderation"

Post

How the House Antitrust Bills Preserve Platforms’ Editorial Discretion and Spur Consumer Choice

August 11, 2021 Big Tech, Big Tech Bills, Content Moderation, interoperability, Legislation, Nondiscrimination

In late June, the House Judiciary Committee voted to approve, on a bipartisan basis, a six-part package of legislation designed to restrict dominant digital platforms from leveraging their power to disadvantage competitors or promote their own lines of business unfairly. We hope the bills will eventually have counterparts or companions in the Senate. Our purpose […]

Press Release

PACT Act Would Shine Light on Platforms’ Content Moderation Decisions To Benefit Consumers

March 17, 2021 Content Moderation, digital platforms, principles, scorecard, Section 230

Today, Senators Brian Schatz (D-HI) and John Thune (R-SD) reintroduced the “Platform Accountability and Consumer Transparency (PACT) Act” with the stated purpose of reforming Section 230 to “strengthen transparency in the process online platforms use to moderate content.” The bill seeks to achieve this by “hold[ing] those companies accountable for content that violates their own policies or is illegal.”

Press Release

Public Knowledge Rejects DOJ Proposal to Amend Section 230

September 23, 2020 Content Moderation, digital platforms, Disinformation, DOJ, misinformation, Section 230

Today, the U.S. Department of Justice issued recommendations to Congress for amending Section 230 of the Communications Act. This law states that platforms are not liable for third-party content that they carry, and can take down objectionable content without fear of lawsuits.

Video

Video: Free Expression Forum – How Section 230 Uplifts Marginalized Voices

September 4, 2020 Content Moderation, Free Expression, Free Expression Forum, Free Speech, Freedom of Expression, marginalized communities, Section 230, Webinar Series

The Free Expression Forum at Public Knowledge is an ongoing series of dialogues about the importance of free expression online to artists, entrepreneurs, and content creators. It highlights how policy decisions impact this important value and how the community of diverse online voices must stand up to preserve it as technology develops. This first forum […]

Press Release

Public Knowledge Stands with Civil Rights Groups by Refusing Facebook Funding

June 9, 2020 civil rights, Content Moderation, Facebook, Public Knowledge, Section 230

Today, Public Knowledge announces that it will not accept funding from Facebook for any of the organization’s programs or initiatives. The decision follows a June 1 meeting between Facebook CEO Mark Zuckerberg and civil rights leaders to discuss the company’s choice to leave up, without moderation, posts by President Trump, including one stating, “when the looting starts, the shooting starts,” in reference to protests over George Floyd’s death. Twitter, meanwhile, labeled the same content with a disclaimer that it “glorified violence.”

Press Release

Public Knowledge Criticizes White House Effort to Control Online Speech

May 15, 2019 Content Moderation, due process, Platform Regulation

Today, the White House launched a tool to enable digital platform users to report alleged instances of bias on technology platforms like Facebook and Twitter.

Press Release

Public Knowledge and Roosevelt Institute Launch Guidebook to Regulating Digital Platforms

May 8, 2019 consumer choice, Content Moderation, Platform Competition, Platform Regulation, Privacy

Today, Public Knowledge and the Roosevelt Institute launched “The Case for the Digital Platform Act,” a new e-book by Public Knowledge Senior Vice President Harold Feld with a foreword by former Federal Communications Commission Chairman Tom Wheeler. The e-book operates as a guide for addressing the challenges posed by the power of digital platforms.

Press Release

Public Knowledge Welcomes YouTube Recommendation Changes Targeting “Borderline Content” and Misinformation

January 25, 2019 Competition, Competition Policy, Content Moderation, YouTube

Today, YouTube announced that it would begin reducing recommendations of “borderline content” — materials that stop short of violating the company’s community guidelines but still may be harmful — and content that could misinform users. According to YouTube, this would include “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

Post

How an Internet Superfund Could Clean Up Vaccine Disinformation That Threatens Our COVID-19 Recovery

February 12, 2021 Communications & Pandemic Series, Communications and the Pandemic, Congress, Content Moderation, Disinformation, Fact-Checking, Internet Superfund, misinformation, Platform Regulation, Superfund for the Internet

Have you heard? The COVID-19 pandemic has entered a new phase. No, it’s not the new, more contagious “super strains” of the virus, or the beginning of the vaccine rollout. It’s the shift in focus of the disinformation peddlers, from the virus itself (remember the “infodemic”?) and the 2020 election to the ramping up of […]

Post

Principles to Protect Free Expression on the Internet

February 11, 2021 Content Moderation, Section 230, Section 230 Series

Section 230 of the Communications Act has been dubbed the “twenty-six words” that created the interactive free expression of the internet: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Yet in the past year or so, […]
