It’s Time to Expose Big Tech Bias

By Sen. John Thune

March 27, 2021

This Congress, I have the privilege of serving as the ranking member of the Subcommittee on Communications, Media, and Broadband. Among many other important issues, this assignment allows me to continue my work to hold social media platforms accountable, particularly in the news and information space. Consumers have become increasingly troubled by the way social media platforms use their information and by how these sites decide what news and information we see. And a growing number of anecdotes suggest that some social media platforms are moderating content in a biased or political way.

Currently, content moderation on social media platforms is governed by Section 230 of the Communications Decency Act, which was enacted into law 25 years ago, long before the advent of social media. Section 230 provides internet sites that host user-generated content – sites like YouTube or Twitter or Facebook – with immunity from liability for the content users post on their sites. For example, if somebody posts a video on YouTube that contains illegal content, YouTube isn't held legally responsible for that content.

Today, federal law does not require that social media sites be at all accountable to consumers for their content moderation decisions. That's why I introduced the bipartisan Platform Accountability and Consumer Transparency Act, or the PACT Act. My bill would preserve the benefits of Section 230 – like the growth of the internet and the widespread dissemination of free speech it has enabled – while increasing accountability and consumer transparency around content moderation.

Content moderation is certainly not all bad. The problem is that content moderation has been – and largely continues to be – a black box, with consumers having little or no idea how the information they see has been shaped by the sites they’re visiting.

The PACT Act would address this problem by increasing transparency around the content moderation process. Until relatively recently, sites like Facebook and Twitter would remove a user's post without explanation and without an appeals process. And even as platforms start to clean up their act with regard to transparency and due process, it's still hard for users to get good information about how content is moderated.

Under the PACT Act, if a site chooses to remove your post, it has to tell you why it made that decision, and it must explain how your post violated the site’s terms of use. The PACT Act would also require sites to have an appeals process – so if Facebook, for example, removes one of your posts, it would not only have to tell you why, but it would have to provide a way for you to appeal that decision. 

We’ve seen increased concern lately about news articles being removed from social media sites. Under the PACT Act, a newspaper whose article was posted on Facebook or Twitter and then removed by one of those platforms could challenge Facebook or Twitter, which would have to provide a reason for removing the article and allow the newspaper to appeal the decision.

My legislation would also help us develop the data necessary to demonstrate whether social media platforms are removing content in a biased or political fashion. The PACT Act requires detailed transparency reports every six months from large social media platforms like Twitter and Facebook, which will help provide the data needed to determine whether and where biased moderation exists.

My bill is a serious, bipartisan approach to the issue of Section 230 reform. It would go a long way toward making social media platforms more accountable to consumers and increasing transparency around the content moderation process.