Section 230 Lets Tech Fix Content Moderation Issues. Congress Should Respect That

from the soft-law-works dept

Congress is on the brink of destroying the internet as we know it.

Bipartisanship in Congress is rare, but odd alliances have formed in the Capitol against Section 230, the law that shields online platforms from liability for user-posted content and is in large part responsible for the incredible growth and diversity of the internet. Republicans accuse Facebook and Twitter of censoring conservative users on their platforms. Democrats accuse the same companies of not doing enough to remove extremist or false content. While both sides agree that Section 230 has to go, they're at war with each other over who will drive regulatory efforts on content moderation. In the end, it won't really matter who wins. Either way, the spoils of this war will be a gutted Section 230 or its outright repeal. That's bad news for everyone.

Before they ruin the internet entirely, Democrats and Republicans should take a step back and let industry standards catch up with the times.

Repealing Section 230 because of actors like Facebook and Twitter would harm countless other websites that haven't done anything wrong, catching smaller companies in the crossfire. On the other hand, too many new restrictions would cripple the competitive edge our tech sector has over the rest of the world. In either case, only the largest companies like Facebook and Twitter would survive, while small businesses (like a family restaurant in Steubenville, Ohio, whose social media presence is driven entirely by customer reviews) would suffer and likely close.

This doesn’t mean that nothing should be done. Something should be done, and soft law is the way.

Soft law is not "law" in the usual sense. The term refers to the diverse tools that private or government bodies use to guide how industries develop. Common examples include industry standards created by public-private partnerships, the U.S. Green Building Council's LEED rating system, and the Centers for Disease Control and Prevention's guidance on treating COVID-19. What makes soft law unique is that, instead of coming primarily from government regulators, it can come from anywhere. And instead of setting strict rules, it focuses on methods for reaching ideal outcomes. It is "soft" because participants will interpret it differently, and no one is fined for going their own way. Soft law provides guidance while encouraging innovation in how industry goals are met. In this way, it beats the rigidity of hard law.

Soft law is already heavily used in artificial intelligence and automated vehicles, so legislators, regulators, and private companies advocating this approach would have strong precedent to point to as Section 230 talks continue. Moreover, this wouldn't be the first time we tried to govern the internet with soft law. The early internet was "regulated" by the Clinton administration through the Framework for Global Electronic Commerce, which established the principles by which the federal government would approach internet activity and how it expected the private sector to act. Most importantly, it stated that "governments should recognize the unique qualities of the Internet. The genius and explosive success of the Internet can be attributed in part to its decentralized nature and to its tradition of bottom-up governance."

As legislators look to revise internet regulation, it's essential they preserve the bottom-up governance that made the internet such an explosive success. To that end, rather than prescribing a one-size-fits-all set of rules for content moderation, the government should encourage companies to develop their own standards, make those standards publicly accessible, and hold up the companies doing so as models for the industry at large.

A great example of one such model is Facebook's Oversight Board, which recently announced its first round of decisions in cases appealing the company's moderation calls. The board, composed of former prime ministers, think tank leaders, and legal scholars, deliberated and overturned Facebook's decision in four of the five cases. Facebook released a statement saying it would abide by the decisions and work to create clearer content moderation policies. Facebook's approach is innovative for tech giants of its size, but smaller companies require different standards for their audiences. Nonprofits like Wikipedia handle this with an openly editable system in which volunteer administrators collaborate on content disputes. Smaller companies like AllTrails open moderation to their entire user base, letting users suggest new trail maps and edit existing ones based on community feedback.

Government needs to understand that what works for Facebook won't work for everyone else, and targeting Section 230 to fix every content moderation problem is the wrong approach. The key insight from Facebook's Oversight Board, Wikipedia's volunteer administrators, and AllTrails' public moderation is that they all accomplish the same goal in very different ways. That's the essence of soft law. Protected by Section 230, and without an overarching government agency or document forcing them toward a prescribed standard, companies can develop innovative methods of content moderation all on their own.

Some argue that self-regulation is a nothingburger, little more than a facade shielding companies from taking any real responsibility for content posted on their sites. But that's not true. Leaving content moderation to the companies themselves makes them accountable to the public, and by now we should all know just how compelling the public can be. Last June, for instance, public perception of Facebook's ability to make good content moderation decisions was overwhelmingly negative, with about 80% saying they didn't trust "Big Tech" but trusting the government even less. It's no coincidence that Facebook launched its Oversight Board that summer. Other examples of companies voluntarily imposing standards to meet the public's demand for accountability include Reddit's annual "Transparency Report," which lets the public see what content is being removed and why. The report is part of Reddit's interpretation of the Santa Clara Principles, a soft law effort led by the Electronic Frontier Foundation, the ACLU, and several other nonprofits. Following these principles allows the public to hold companies accountable to their own promises, addressing a major issue of customer trust while maintaining the integrity of Section 230.

Section 230 gave entrepreneurs the protection and flexibility to explore new directions in tech, leading to some of the greatest economic and technological advancements in US history. Instead of gutting the law that made the internet what it is today, regulators should respect soft law alternatives coming from the private sector and encourage companies to find what works, helping the users and businesses that rely on platforms Section 230 currently protects. Innovation is what will win the war for the web. We'll only have a free internet as long as we can keep it.

Luke is an economics graduate student at George Mason University focusing on entrepreneurship, health, and innovative technology. You can follow him on Twitter @LiberLuke.



Filed Under: congress, content moderation, section 230, soft law


Reader Comments



    bhull242 (profile), 10 Feb 2021 @ 2:40pm

    Re: Re:

    Yeah, considering the fact that Democrats and Republicans have completely different and incompatible goals (Democrats want intermediaries liable for third-party content they leave up and for not moderating sufficiently, but not liable for successfully removing objectionable content; Republicans want immunity for third-party content but want less moderation by companies and, thus, want to remove immunity for moderation decisions), I can’t really see any way to accomplish both goals. Especially since most current congresspeople are not willing to abolish §230 but just want to amend it.

