U.S. Sen. John Thune (R-S.D.), chairman of the Subcommittee on Communications, Technology, Innovation, and the Internet, today led a hearing titled, “The PACT Act and Section 230: The Impact of the Law that Helped Create the Internet and an Examination of Proposed Reforms for Today’s Online World,” to examine online platforms’ content moderation practices and to discuss what legislative measures can be taken to ensure consumers are protected and empowered while on the internet. During the hearing, Thune questioned experts on transparency requirements that could be used to protect consumers online.
“The reality is that the platforms have a strong incentive to exercise control over the content each of us sees, because if they can present us with content that will keep us engaged on the platform, we will stay on the platform longer,” said Thune. “Moderation is an important function that platforms must provide in order to deliver a valuable experience to their users. Unfortunately, it’s hard for users to get good information about how content is moderated. The Internet has evolved significantly since Section 230 was enacted. Long gone are the days of the online bulletin boards. Today, internet platforms have sophisticated content moderation tools, algorithms, and recommendation engines to promote content and connect users, all optimized toward keeping every user engaged on the platform. The platforms have monetized these systems through targeted advertising and related businesses and have consequently become some of the largest companies in the world. Moreover, these platforms have become essential to our daily lives, as many Americans live, work, and communicate increasingly online.”
On June 24, 2020, Thune joined Sen. Brian Schatz (D-Hawaii), ranking member of the subcommittee, in introducing the Platform Accountability and Consumer Transparency (PACT) Act, bipartisan legislation to update Section 230 of the Communications Decency Act. The PACT Act would strengthen transparency in the process online platforms use to moderate content and hold those companies accountable for content that violates their own policies or is illegal.
Thune’s full remarks (as prepared for delivery):
“I want to thank everyone for being here today, both virtually and in-person.
“We are here to examine the legacy of Section 230 of the Communications Decency Act, which was enacted into law 24 years ago, and to discuss a proposal Senator Schatz and I have introduced to reform Section 230, known as the Platform Accountability and Consumer Transparency Act, or PACT Act.
“Section 230 was written to protect internet platforms both large and small from being held liable for user-generated content while also enabling these platforms to take an active role in moderating such content. The sweeping nature of these protections, coupled with expansive readings by the courts, has come to mean that, with few exceptions, internet platforms are not liable for the comments, pictures, and videos that their users and subscribers post, no matter how harmful.
“As one of our witnesses here today has written in what he calls his ‘biography’ of Section 230, the law’s proposal and passage flew under the radar back in 1996, receiving virtually no opposition or media coverage. Today, however, Section 230 is the subject of intense debate and media scrutiny, to the extent that both the President of the United States, and his likely competitor in this fall’s election, have called for the complete repeal of Section 230.
“One of the many factors that have sparked the intense debate about Section 230 is that internet platforms have actively cultivated the notion that they are merely providing the technology for people to communicate and share their thoughts and ideas. Therefore, until relatively recently, the platforms largely concealed, or at the very least failed to disclose, their moderation and curation systems in order to sustain this fiction of being a ‘neutral platform’ for all ideas. Content moderation has been, and largely continues to be, a black box, which has led many users to suspect bias and discrimination.
“The reality is that the platforms have a strong incentive to exercise control over the content each of us sees, because if they can present us with content that will keep us engaged on the platform, we will stay on the platform longer. Moderation is an important function that platforms must provide in order to deliver a valuable experience to their users. Unfortunately, it’s hard for users to get good information about how content is moderated.
“The Internet has evolved significantly since Section 230 was enacted. Long gone are the days of the online bulletin boards. Today, internet platforms have sophisticated content moderation tools, algorithms, and recommendation engines to promote content and connect users, all optimized toward keeping every user engaged on the platform. The platforms have monetized these systems through targeted advertising and related businesses and have consequently become some of the largest companies in the world. Moreover, these platforms have become essential to our daily lives, as many Americans live, work, and communicate increasingly online.
“That is why it is important to recognize that the benefits of Section 230 for companies have come with tradeoffs for consumers. As the Department of Justice has noted in its recommendations to reform Section 230, broad Section 230 immunity can pose challenges for federal agencies in civil enforcement matters. It is questionable whether Section 230 was intended to allow companies to invoke immunity against the federal government when it acts to protect American consumers in the civil enforcement context. This has contributed to the creation of a different set of rules for enforcing consumer protections against online companies than against those in the offline world.
“In addition, disparate complaint intake and transparency reporting practices among internet companies have limited consumers’ ability to address and correct harms that occur online.
“And as Americans conduct more and more of their activities online, the net outcome is an increasingly less protected and more vulnerable consumer.
“The Internet of 1996 is a far cry from the Internet of 2020. As Americans exist increasingly online, a trend now being accelerated by the COVID-19 pandemic and illustrated by the fact that each of our witnesses is attending virtually today, re-evaluating Section 230 within today’s context will ensure its protections continue to balance the interests of both consumers and companies.
“Against this backdrop, the bill Senator Schatz and I have introduced would update Section 230 to enable greater transparency and accountability for users without damaging the foundational economic, innovative, and entrepreneurial benefits that allowed the internet to flourish in the first place.
“The PACT Act would require companies that moderate content to provide a clear and easily accessible user policy that explains how, when, and why user-generated content might be removed.
“It would also require these online platforms to create a defined complaint system that processes reports and notifies users of moderation decisions within 14 days.
“Our legislation would require large technology companies to have a toll-free customer service phone line with live customer support to take customer complaints.
“This requirement is geared toward consumers who are less familiar with technology, and those in marginalized communities who may not have readily available access to technology, but who want or need to talk to a real person about a complaint about content on a service or platform.
“The PACT Act would also hold platforms accountable for their content moderation practices by requiring them to submit quarterly reports to the Federal Trade Commission outlining material they’ve removed from their sites or chosen to deemphasize.
“In addition, the PACT Act would make clear that the immunity provided by Section 230 does not apply to civil enforcement actions brought by the federal government. The PACT Act would also make clear that Section 230 does not apply where a platform is provided with a court order finding that content is unlawful. Both of these provisions are also recommendations that the Department of Justice put forward in its recent review of Section 230.
“At its core, Section 230 reform is about balancing the consumer’s need for transparency and accountability against internet companies’ need for flexibility and autonomy.
“I believe the PACT Act strikes the right balance, and I am committed to achieving a meaningful, bipartisan approach to Section 230 reform that can be enacted into law sooner rather than later.
“However, I recognize that the Internet is complex, and any meaningful regulation must consider various perspectives from diverse groups in academia, civil society, and industry.
“Consequently, we have brought together today a very distinguished panel, and I am confident the conversation will help ensure we are reforming Section 230 in the right way. Each of our witnesses has deep expertise in both the original intent of Section 230 and how it has been interpreted by the courts over the years.
“Today, we are joined by Former Representative Chris Cox, the co-author of Section 230 of the Communications Decency Act; Jeff Kosseff, Assistant Professor of Cyber Science at the United States Naval Academy; Elizabeth Banker, Deputy General Counsel of the Internet Association; and Olivier Sylvain, Professor of Law at Fordham School of Law.
“Thank you each for your participation on this important topic.
“I now recognize Ranking Member Schatz for any opening remarks he may have.”