Tech’s liability shield under fire: 26 words and what’s at stake

Democrats and Republicans in Congress are taking aim at a controversial law that shields internet platforms, including Facebook Inc. and Twitter Inc., from lawsuits over content posted by users.

The measure – just 26 words, known as Section 230 – now faces its biggest challenge since it was included in the Communications Decency Act of 1996. Calls to revise it grew in the months leading up to the November election and intensified after the deadly attack on Congress by loyalists of then-President Donald Trump.

Trump and his Republican allies say Section 230 gives companies leeway to censor conservative speech, a claim he repeated on Sunday at a conservative gathering in Florida. Democrats accuse the same internet platforms of failing to contain disinformation and hate speech, arguing that Trump’s posts alleging electoral fraud fueled the Capitol insurrection on January 6.

Even some on Wall Street are pointing fingers at the shield after the market turmoil caused by a horde of retail investors using online chat forums to target stocks such as GameStop Corp.

Read more: Section 230 was supposed to make the internet a better place. It failed

While industry lobbyists have called for a cautious approach, a House panel has already summoned the chief executives of Facebook, Alphabet Inc.’s Google and Twitter to testify at a virtual hearing on March 25 about misinformation and disinformation on their platforms. Facebook CEO Mark Zuckerberg has called for more internet regulation and said the company is open to reforming Section 230.

Still, even with Democrats in control of Congress, any bill would need bipartisan support in the Senate to clear the 60-vote threshold required to advance legislation. That means lawmakers will have to negotiate and make concessions at a time when they are deeply divided.

New measures to reshape Section 230 are expected in the coming weeks. Here is a guide to the proposals on the table:

Video: What Section 230 means for the modern internet and the law’s uncertain future.

Hate speech and civil rights

SAFE TECH ACT: The Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act was the first Section 230 bill introduced in the Senate this year. Introduced on February 5 by Democratic Senators Mark Warner of Virginia, Mazie Hirono of Hawaii and Amy Klobuchar of Minnesota, the bill has no Republican support so far.

HIGHLIGHTS: The legislation would hold technology companies liable for content in four categories: civil rights, international human rights, antitrust, and stalking, harassment or intimidation. It would also clarify that companies can be held liable in wrongful-death actions, meaning families could sue platforms that may have contributed to a person’s death.

The measure would drastically change the underlying law by limiting companies’ liability protections, treating them as publishers of any paid content on their platforms. That includes the advertising that generates huge profits for Google, Twitter and Facebook. It would narrow the liability protection to cover only “speech” by third parties, rather than the broader term “information” used in the original law. It would also allow victims to seek court orders when a company fails to address material that is “likely to cause irreparable harm.”

SUPPORT AND OPPOSITION: The NAACP Legal Defense and Educational Fund and the Anti-Defamation League support the bill.

NetChoice, which represents big tech companies such as Facebook and Google, opposes the bill, saying it would gut Section 230.

“The bill would not only curb freedom of expression on the internet, but would also revoke Section 230 protections for all e-commerce marketplaces,” such as Etsy Inc., said Carl Szabo, the group’s vice president and general counsel, in a statement. “Small sellers across the country would lose access to customers around the world at a time when entrepreneurs most need that access.”

WHAT’S NEXT: Representative Yvette Clarke, a Democrat from New York, is working on a more narrowly focused bill known as the Civil Rights Modernization Act. It would amend Section 230 to ensure that federal civil rights laws apply to targeted advertising by technology companies, in an effort to curb the spread of hate speech online. Clarke said in an interview that she wants to examine how platforms facilitate civil rights violations and ensure that they crack down on hate speech “so that it doesn’t go so far as to harm the American people or American institutions.” She plans to introduce the measure in the coming weeks.

Democratic Representatives Anna Eshoo of California and Tom Malinowski of New Jersey are planning to reintroduce the Protecting Americans from Dangerous Algorithms Act. The bill would remove a platform’s liability shield if its algorithms are used to amplify or recommend content that incites hate speech, violence or acts of terrorism. “These companies have shown that they will not do the right thing on their own,” Eshoo told Bloomberg.

Content moderation and consumer rights

PACT ACT: The bipartisan Platform Accountability and Consumer Transparency (PACT) Act was introduced in the Senate in June 2020. Senators Brian Schatz, a Democrat from Hawaii, and John Thune, a Republican from South Dakota, co-sponsored the bill.

HIGHLIGHTS: The bill would require “major online platforms” to remove content within 24 hours of being notified of a court determination that the content is illegal. Companies would have to issue quarterly reports with data on content that has been removed, demonetized or deprioritized. It would also allow consumers to appeal content-moderation decisions. The legislation would allow the US Department of Justice, the Federal Trade Commission and state attorneys general to bring civil actions over online activity.

SUPPORT AND OPPOSITION: The bill is supported by the Alliance for Safe Online Pharmacies, which works to combat illegal online pharmacies. NetChoice and the digital rights group Electronic Frontier Foundation oppose it.

The Internet Association, which represents companies including Amazon.com Inc., Google and Facebook, said it appreciates the bill’s effort to promote transparency and accountability in content moderation, but raised concerns about the broad reporting requirements and said the measure should be narrowed to exclude smaller internet companies. The group said the highly detailed requirements would be “extremely costly.”

WHAT’S NEXT: The PACT Act is expected to be reintroduced later this month, according to a person familiar with the matter.

In the House, Representative Jan Schakowsky, a Democrat from Illinois, is expected to introduce the Online Consumer Protection Act within weeks. As chair of a House Energy and Commerce subcommittee that oversees consumer protection issues, Schakowsky would lead any effort to reform how Section 230 affects consumer safety. Her bill, which was released in draft form last year, would remove liability protections if platforms violated their terms of service, and would allow FTC enforcement and consumer lawsuits.

The bill would require social media companies and online marketplaces to create consumer protection policies that define when content can be blocked, removed or modified. The policies would also have to describe how a user will be notified if content is removed and how to appeal the removal. Schakowsky said her bill would ensure “that consumer rights in the physical world extend to the virtual world.”

Read more: Big Tech terms of service examined under Democrat plan

Child Exploitation

EARN IT ACT: The bipartisan Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act was introduced in the Senate in March 2020 and was approved by the Senate Judiciary Committee. Senators Richard Blumenthal, a Connecticut Democrat, and Lindsey Graham, a Republican from South Carolina, introduced the bill last year.

HIGHLIGHTS: The bill would expose companies to state civil and criminal actions, as well as federal civil claims, if they advertise, promote, present, distribute or solicit child sexual abuse material. The legislation would also establish a National Commission on Online Child Sexual Exploitation Prevention, which would develop voluntary best practices for the industry. An amendment in the last Congress removed original language that conditioned liability protection on companies adopting those best practices.

SUPPORT AND OPPOSITION: The bill is supported by groups representing sex-trafficking survivors, as well as the National Center for Missing & Exploited Children and the National Center on Sexual Exploitation.

The Internet Association said it supports the goal of ending online child exploitation, but said the bill would create damaging inconsistencies with state laws, and that it plans to work with lawmakers to improve the measure.

WHAT’S NEXT: The bill is expected to be reintroduced in this Congress, according to a spokesman for Blumenthal. Senator Dick Durbin of Illinois supported the bill, and as the new chairman of the Judiciary Committee he could shepherd it through this Congress.

Read more: Senate panel eliminates technology liability protection for child abuse content

Industry opposition

The Internet Association said that Section 230 strikes a “careful balance” between protecting companies from lawsuits and encouraging them to proactively remove hateful and extremist speech online. Removing the protections would discourage companies from moderating any content at all, for fear of being sued, the group says.

The group also claims that legislation often fails to keep up with the changing nature of the Internet and that costly legal requirements can cause start-ups to close their doors.

“The imposition of excessively prescriptive and costly requirements through legislation or regulations will have a negative impact on the Internet ecosystem,” the trade group told Congress in testimony last year.

Still, many companies recognize that some change to the measure is inevitable and are prepared to work with lawmakers to help shape proposals, partly in the interest of heading off more draconian measures.

Read more: Technology trade groups open to liability for illegal posts

To contact the author of this story:
Rebecca Kern in Washington at [email protected]

To contact the editors responsible for this story:
Sara Forden at [email protected]
Zachary Sherwood

© 2021 Bloomberg LP All rights reserved. Used with permission.
