Last year, Biden called for Section 230 of the Communications Decency Act to be revoked. The law, which Biden voted for in 1996, protects interactive computer services from being held liable for the vast majority of third-party content. In recent years, the law has become a popular target of conservative lawmakers, who allege that the largest social media companies implement content moderation policies that stifle conservative speech. Biden and some other members of the Democratic Party have different concerns.

Biden cited online misinformation as motivating his call for Section 230 repeal: “There is no editorial impact at all on Facebook. None. None whatsoever. It’s irresponsible. It’s totally irresponsible.” Bruce Reed, Biden’s chief of staff from 2011 to 2013 and current deputy White House chief of staff, has expressed concerns about Section 230, noting that the law “hurts our kids and is doing possibly irreparable damage to our democracy.”

Harris supported the most recent amendment to Section 230, the Stop Enabling Sex Traffickers Act (SESTA) and Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) package. SESTA/FOSTA added content associated with sex trafficking to the list of Section 230 exceptions. While no doubt well-intentioned, the law resulted in harmful unintended consequences.

When considering Section 230 amendments as a means to address the harms associated with online speech, White House officials and their allies in Congress should proceed with caution. Section 230 may look like a tempting target for amendment, but changes to Section 230 will not eliminate harmful content and may entrench market incumbents.

The Capitol Storming

Concerns about misinformation are bound to be more pronounced in the wake of the January 6, 2021 assault on the Capitol. During attempts by Trump supporters to disrupt Congress’ certification of the Electoral College vote, members of the mob that broke into the Capitol shared images and video of the intrusion widely on social media. The shared content revealed that those who stormed the Capitol included believers in the QAnon conspiracy theory, white supremacists, and many others.

Such content appears to vindicate the concerns from Reed and other Biden allies who believe that Section 230 is doing harm to American democracy. After all, online speech appears to be radicalizing millions of Americans, some of whom have demonstrated a willingness to engage in violence.

Before determining whether Section 230 amendments are the best means to address such speech, we should keep in mind that racist and conspiratorial content is protected by the First Amendment. Social media networks that cater to or are tolerant of such content are open to accusations of being socially irresponsible, but they are not running afoul of the First Amendment or Section 230.

Many of the largest social media companies have faced criticism over their treatment of conspiratorial and politically extremist content for years. Shortly after the Capitol storming, Facebook, Twitter, and YouTube took steps to remove a wide range of content, including accounts run by former President Trump. For many, these content moderation decisions were too little, too late.

Section 230 and Extremist Content

Legislative attempts to condition Section 230 protections on tackling extremist speech will quickly run into a stubborn constitutional barrier. The First Amendment prevents the government from compelling a private company to favor one category of legal speech over another. Accordingly, a Section 230 amendment that makes the law’s liability protection contingent on the moderation of legal but awful speech would be unlikely to survive a constitutional challenge.

Social media firms are well within their rights to remove extremist speech, and Section 230 prevents them from being sued when they choose to do so. Likewise, social media firms are free to host conspiracy theories as well as extremist content, though other companies may decline to do business with them for doing so. Not long after the attack on the Capitol, AWS severed ties with Parler, a social media site used by many of the Capitol insurgents. Google and Apple removed Parler from the Android and iOS app stores.

Shortly after the Capitol attack, the most popular social media firms took action against content associated with the incident and claims of election fraud. This content moderation, along with the actions AWS, Google, and Apple took against Parler, prompted a surge in new users for the encrypted chat services Signal and Telegram.

The move to encrypted services is not a surprise. Encrypted services allow users to communicate with one another without the service or law enforcement being privy to the communications. But these services are not ideal social media replacements. Social media platforms are designed so that users can express themselves to a wide audience. Encrypted services do allow for group chats, but they cannot replicate the social media experience of Facebook, Twitter, and YouTube.

Unintended Consequences

Lawmakers could amend Section 230 in such a way that it would encourage social media platforms to be more aggressive when it comes to moderating content. As I have noted before, Boston University Law School’s Danielle Citron and the Brookings Institution’s Ben Wittes have suggested making Section 230 protections contingent on platforms taking “reasonable steps” to address unlawful uses of their services. In such an environment, social media firms would probably be willing to embrace false positives in an attempt to remain in compliance.

But as I wrote not long after the attack on the Capitol:

If such an amendment were enacted, interactive computer services would have an incentive to embrace false positives in order to ensure that they don’t run afoul of Section 230. Awful but lawful speech could be stifled because sites hosting third-party content would seek to avoid bankruptcy via a tsunami of lawsuits.

Some might ask, “What’s wrong with services having an incentive to err on the side of caution when it comes to borderline illegal speech?” The answer is that such an environment is likely to be anti-competitive, with powerful market incumbents best positioned to adapt to how courts and lawmakers interpret “reasonable steps.” While concerns about online political extremism are likely to prompt lawmakers to seek carrots and sticks for social media companies, we should keep in mind that Section 230 amendments could ultimately entrench the very companies so many are criticizing.

Response to Extremist Speech

It would be inappropriate for the president and lawmakers to look to Section 230 amendments when thinking about how to address the spread of misinformation, conspiracy theories, and extremist political content. Almost all such content is legal under the First Amendment. The most popular social media sites have taken action to address harmful content. But even if these moderation efforts were 100 percent effective, such content would find a home on encrypted services or on social media sites designed with such content in mind.

The rise of extremist political content is troubling, especially when it leads to violence. But amending Section 230 to address this content is likely to run into constitutional barriers or to strengthen the positions of market incumbents.

Commentary by Matthew Feeney. Originally published at Cato At Liberty.
