Mark Zuckerberg proposes limited Section 230 reforms ahead of Congressional hearing

Opening statements by Mark Zuckerberg, Sundar Pichai and Jack Dorsey were published ahead of Thursday’s disinformation hearing in the House – and show the three CEOs taking on an unusually sensitive issue for technology platforms. All three statements are worth reading, with Dorsey focusing on internal tools like Birdwatch and Pichai warning about the dangers of a total repeal of Section 230 of the Communications Decency Act.

But the most detailed proposal came from Zuckerberg, who spoke at length about his preferred changes to Section 230. Instead of repealing the law entirely – as President Biden suggested during the campaign – Zuckerberg’s proposal would make Section 230 protections conditional on companies maintaining systems to remove illegal content.

As Zuckerberg describes it in the statement:

We believe that Congress should consider making platforms’ intermediary liability protection for certain types of illegal content conditional on companies’ ability to comply with best practices to combat the spread of that content. Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place to identify and remove illegal content. Platforms should not be held liable if a particular piece of content evades detection – which would be impractical for platforms with billions of posts per day – but they should be required to have adequate systems in place to address illegal content.

Standards for maintaining 230 protections could be set by third parties, Zuckerberg continues, and would exclude demands around encryption and privacy “that deserve a hearing in their own right.” This distinguishes Zuckerberg’s proposal from previously proposed Section 230 bills, such as the EARN IT Act, which ties the protections to a long-sought encryption backdoor. Zuckerberg’s proposal is closer to the PACT Act, which conditions the protections on transparency disclosures and other measures but focuses less on the removal of illegal content.

Generally speaking, it is unusual for companies to propose rules for how they would like to be regulated – but it is less unusual for Zuckerberg, who has already written at length about his preferred rules for data portability and content moderation.

This is the most detailed Section 230 proposal Zuckerberg has ever put forward, and one that would require few material changes from Facebook itself. Facebook already has significant systems in place to identify and remove illegal or otherwise objectionable content. Still, the proposal could address some of the most urgent objections to Section 230, which generally focus on smaller sites devoted entirely to malicious activity.

The problem is particularly urgent for groups like the National Center for Missing and Exploited Children (NCMEC), which struggle with websites that do not scan for or moderate child abuse imagery.

“There are many companies, especially some of the very large companies, that engage in really tremendous voluntary measures,” Yiota Souras of NCMEC told The Verge earlier this year. “But there are many companies that do not, and there is no legal requirement for them to use any type of detection or screening.”

The hearing is scheduled to begin at 12 pm ET on Thursday, March 25.