The EARN IT Act Creates A New Moderator's Dilemma

from the moderate-perfectly-or-else dept

Last month, a bipartisan group of U.S. senators unveiled the much-discussed EARN IT Act, which would require tech platforms to comply with recommended best practices designed to combat the spread of child sexual abuse material (CSAM) or lose their Section 230 protections. While the goal is commendable, the bill would cause significant problems.

Most notably, the legislation would create a Commission, led by the Attorney General, with the authority to draw up a list of recommended best practices. Many have rightly explained that AG Barr will likely use this new authority to effectively prohibit end-to-end encryption. Less discussed, however, is the recklessness standard the bill adopts. The bill would drastically reduce free speech online because it eliminates the traditional moderator’s dilemma and replaces it with a new one: either comply with the recommended best practices, or open the legal floodgates.

Prior to the passage of the Communications Decency Act in 1996, under common-law intermediary liability, platforms could be held liable for user content only if they had knowledge of the infringing content, while platforms that actively moderated risked being treated as publishers liable for everything their users posted, the result in Stratton Oakmont v. Prodigy. This meant that a platform that couldn’t survive litigation costs could simply choose not to moderate at all. While not always a desirable outcome, this did provide legal certainty for smaller companies and start-ups that they wouldn’t be litigated into bankruptcy. That dilemma was eventually resolved by Section 230’s protections, which spare companies from having to make that choice.

However, the EARN IT Act changes that equation in two key ways. First, it amends Section 230 to allow civil suits and state criminal prosecutions against companies that do not adhere to the recommended best practices. Second, for the underlying federal crime (which Section 230 doesn’t affect), the bill would change the scienter requirement from actual knowledge to recklessness. What does this mean in practice? Currently, under federal law, a platform must have actual knowledge of CSAM on its service before any legal requirement kicks in. So if, for example, a user posts material that could be considered CSAM but the platform is not aware of it, the platform can’t be guilty of illegally transporting CSAM. Platforms must remove and report content when it is identified to them, but they are not liable for any and all content on the website. A recklessness standard turns this dynamic on its head.

What counts as “reckless” is ultimately up to the jurisdiction, but the Model Penal Code gives a general idea of what it entails: a person acts recklessly when he or she “consciously disregards a substantial and unjustifiable risk that the material element exists or will result from his conduct.” Worse, the bill opens a platform’s actions to civil suits. Federal criminal enforcement normally targets the truly bad actors, and companies that comply with reporting requirements are generally immune from liability. With these changes, however, if a user posts material that could potentially be considered CSAM, despite no knowledge on the part of the platform, civil litigants could argue that the company’s moderation and detection practices, or lack thereof, constituted a conscious disregard of the risk that CSAM would be shared by users.

When the law introduces ambiguity into liability, companies tend to err on the side of caution. In this case, that means removing potentially infringing content to ensure they cannot be brought before a court. For example, in the copyright context, a Digital Millennium Copyright Act safe harbor exists for internet service providers (ISPs) that “reasonably implement” policies for terminating repeat infringers in “appropriate circumstances.” Yet courts have refused to apply that safe harbor when a company didn’t terminate enough subscribers, and this uncertainty about whether the safe harbor applies will undoubtedly lead ISPs to act on more complaints to ensure they cannot be liable for the infringement. The recklessness standard invites the same line-drawing problems: Is it “reckless” for a company not to investigate postings from an IP address if other postings from that IP address were CSAM? What if the IP address belongs to a public library with hundreds of daily users?

This ambiguity will likely force platforms to moderate user content aggressively and over-remove legitimate content to ensure they cannot be held liable. Large firms, which have the resources to moderate more heavily and can survive an increase in lawsuits, may pour the majority of their moderation resources into CSAM detection out of an abundance of caution, leaving fewer resources to target and remove other problematic content such as terrorist recruitment or hate speech. Mid-sized firms may end up over-removing any user content that features a child, or limiting posting to trusted sources, to insulate themselves from lawsuits that could cripple the business. And small firms, which likely can’t survive an increase in litigation, could ban user content entirely, ensuring that nothing appears on the website without vetting. These consequences, and the general burden on the First Amendment, are exactly the type of harms that drove courts to adopt a knowledge standard for online intermediary liability, ensuring that the free flow of information was not unduly limited.

Yet the EARN IT Act ignores this. Instead, the bill assumes that companies will simply adhere to the best practices, retain Section 230 immunity, and thereby avoid these bad outcomes. After all, who wouldn’t want to comply with best practices? In reality, the bill could force companies to choose between vital privacy protections like end-to-end encryption and ruinous litigation. There are better ways to combat the spread of CSAM online that don’t require platforms to strip key privacy features from users.

As it stands now, the EARN IT Act solves the moderator’s dilemma by creating a new one: comply, or else.

Jeffrey Westling is a technology and innovation policy fellow at the R Street Institute, a free-market think tank based in Washington, D.C.



Comments on “The EARN IT Act Creates A New Moderator's Dilemma”

This comment has been deemed insightful by the community.
Anonymous Coward says:

As it stands now, the EARN IT Act solves the moderator’s dilemma by creating a new one: comply, or else.

Shouldn’t that be “comply, or refuse to accept user content, even if it means shutting up shop”? How do sites like GitLab or Thingiverse survive if they have to examine everything posted, including the contents of zip files, etc.?

This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
Stephen T. Stone (profile) says:

“My fellow senators, we must be seen doing something!”

“But what if the thing we do ends up destroying the Internet?”

“Good! Without the Internet, those assholes on Twitter can’t call me names for trying to destroy the Internet.”

“I don’t think—”

“Nor should you.”

This comment has been deemed insightful by the community.
Koby (profile) says:

Corporatism at its Finest

Large corporations love regulations like this because they are the only ones who can comply with them. One of the awesome features of the internet is that small companies can compete against large ones on the merits. In 2001, Google could compete against Microsoft and win. Large corporations hate that, so they will support things like the EARN IT Act. Only the big boys like Facebook and Google will survive.

If you CSAMthing says:

Re: Re: Corporatism at its Finest

coordinated by Hollywood

Proof or GTFO!

Just kidding: we know how that propaganda monstrosity works.

They distribute the child pornography, then they try to get it back, much like the good old days of J. Edgar Hoover’s prime, blackmailing gays over dick pics and love letters, but with the new and improved anti-hetero gay mafia at the helm, flipping the script.

Anonymous Coward says:

Re: Has the Senate Judiciary Committee voted on it yet

Right now COVID-19 is sucking all the air out of the room, plus the Senate has left for a month-long recess, due to return on April 23rd, though that may not happen and the recess could extend into May.

The bill hasn’t garnered a lot of co-sponsors yet, aside from the ones who are usually behind bills like these, but once this pandemic passes and things return to semi-normalcy, it will be there waiting.

Anonymous Anonymous Coward (profile) says:

Re: Re:

That is an interesting question. Even though ‘enacted’ by a law, could those best practices be challenged as not being law? When presented in court, can the defense go on and on about how those so-called ‘best practices’ are not in fact law and therefore cannot be applied as law? How about the defense showing that a ‘best practice’ is not in fact a best practice? Or showing that the ‘best practices’ imposed by a single official such as the Attorney General, who actually knows nothing about ‘best practices’, are gifts to the organizations that paid to get his boss elected?

I could go on, but unlike regulatory agencies who are required to have comment periods prior to rule making, it appears that these ‘best practices’ will be imposed by fiat.

Anonymous Coward says:

Re: Re:

No, but it would be much harder to get these recommended best practices passed as laws, as the people wouldn’t stand for it. But if they pass a bill designed to combat CP that allows non-elected persons to decide what can and cannot be done, it’s less likely that people will notice they are losing rights.

Anonymous Coward says:

Re: Re: Re:

Well, that was changed in the introduced version, which states that once the commission settles on a set of "Best Practices", they must be drafted as a bill and passed in the normal, albeit truncated, manner, akin to Trade Promotion Authority when negotiating trade deals.

So given that these "best practices" will be codified in an official law of Congress, I’d say they could be.

That One Guy (profile) says:

Re: Re:

Ah, but that’s where the gross dishonesty and weasel words come into play (with appropriate apologies to actual weasels for being lumped in with politicians). For you see, it’s not that sites are punished if they don’t follow the ‘best practices’; they simply aren’t EARNing the privilege of 230 protections, hence the bill’s name.

Thanks to the constant refrain spinning 230 protections as an extra privilege that online platforms get versus those poor offline ones (rather than the truth that 230 simply codifies that online platforms have the same protections as offline ones), you can be sure it will be argued that no punishment is being handed out, sites are simply being treated equally, and if they want to EARN that special protection back, all they have to do is follow a few simple rules, which of course will be trivial if they really want to follow them.

This comment has been deemed insightful by the community.
Igualmente69 (profile) says:

Are construction companies liable for criminals using roads to drive to their targets, or to transport contraband? What about the auto manufacturers who build the cars that the criminals use on said roads? What about the gas stations that provide fuel for said vehicles? What about the oil companies that sell gas to the various stations? What about the geologists who sell their services to help the oil companies locate resources to extract? What about the colleges that educate the geologists? What about literally everyone else on Earth, who in some way indirectly affects all of these processes?

That One Guy (profile) says:

Re: Re:

And that is the kicker and the direct refutation of the idea that sites should have to earn 230 protection: for any offline company, it is already well understood that you can’t sue the platform/company for what a third party uses its service/product for; the only thing 230 does is make clear that the same protection against liability applies to online platforms as well.

This comment has been deemed insightful by the community.
Sok Puppette says:

Hmm. It actually may be worse than that, because it appears to apply beyond what you’d think of as "platforms".

The recklessness and "best practices" requirements are applied to all providers of "interactive computer services". The definition of "interactive computer service" is imported by reference from 230. That definition is:

The term "interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

The part about "system… that enables computer access" sweeps in all ISPs and telecommunication carriers, as well as operators of things like Tor nodes. And "access software provider" brings in all software tools and many non-software tools, including open source projects.

Under 230, those broad definitions are innocuous, because they’re only used to provide a safe harbor. An ISP or software provider is immunized if it doesn’t actually know about or facilitate specific content. No ISP and almost no software provider has any actual knowledge of what passes through or uses its service, let alone edits the content or facilitates its creation, so they get the full safe harbor at minimal or no actual cost to them. And anyway, nobody has been after them on 230-ish issues, so including them doesn’t hurt.

Under EARN-IT, those same definitions would be used to impose liability, so now those parties actually get burdens from being inside the definition. That’s worse than a repeal of 230. It doesn’t just remove a safe harbor; it opens an avenue for positive attack.

This commission could decide that it’s a "best practice" for ISPs to block all traffic they can’t decrypt. Or it could decide that it’s a "best practice" not to provide any non-back-doored encryption software to the public, period.

Or, since those might generate too much political backlash at the start, it could start boiling the frog on the slippery slope by, say, deciding that it’s a "best practice" not to facilitate meaningfully anonymous communication, effectively outlawing Tor, I2P, and many standard VPN practices.

Then it could start slowly expanding the scope of that, possibly even managing to creep into banning all non-back-doored encryption, without ever making any sudden jump that might cause a sharp public reaction.

Back on the platform side, over time the rules could easily slide from the expected (and unacceptable) "best practice" of not building any strong encryption into your own product, to the even worse "best practice" of trying to identify and refuse to carry anything that might be encrypted. Start by applying it to messaging, then audio/video conferencing, then file storage… and then you have precedents giving you another avenue to push it all the way to ISPs.

This comment has been deemed insightful by the community.
Sok Puppette says:

Re: Re:

… oh, and even if you weren’t a company with any significant infrastructure, they could also come after you for providing software for P2P or other decentralized solutions. "Protocols, not platforms" only works if somebody’s allowed to provide the software to speak the protocol…

Bergman (profile) says:

Re: Re:

It occurs to me that Congress itself has in-house email systems, and all you’d need for standing to sue Congress as a whole is for someone on their system to send or receive email containing content that might be illegal.

As for best practices, what would stop the commission from defining ‘extremism’ to be something that cannot be transmitted under best practices, with ‘extremism’ defined as holding political beliefs different than the currently-elected political majority party?

Ron Currier (profile) says:

CSAM

I’m not a troll, I just play one online sometimes…

Republicans and certain Christian groups notwithstanding, is there any actual evidence that CSAM online is a serious problem? I’ve never run across any, and I lurk in some pretty sketchy areas of the web. The current porn tubes’ fascination with "incest" storylines is as fake as the rest of porn storylines. Even back when Instagram was for porn, I didn’t see any obvious CSAM. Is this really a problem that needs new laws, or is it just government and churches looking to censor the Internet?

That One Guy (profile) says:

'If you ignore all the evidence my argument is great!'

Last month, a bipartisan group of U.S. senators unveiled the much-discussed EARN IT Act, which would require tech platforms to comply with recommended best practices designed to combat the spread of child sexual abuse material (CSAM) or lose their Section 230 protections. While the goal is commendable, the bill would cause significant problems.

And if you believe any of the above, I’ve got some bridges to sell you. As noted in previous articles on this trainwreck of a bill, the tools to deal with CSAM already exist; they simply aren’t used.

This, much like FOSTA, is both a PR stunt and a way to undercut 230/encryption (though FOSTA was only aimed at 230), since going at those directly has so far failed to work, so treating it as an honest attempt to combat CSAM already gives it far more legitimacy than it deserves.
