The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Section 230 Isn't Why Omegle Has Awful Content, And Getting Rid Of 230 Won't Change That

from the one-size-does-not-fit-all dept

Last year, I co-authored an article with my law school advisor, Prof. Eric Goldman, titled “Why Can’t Internet Companies Stop Awful Content?” In our article, we concluded that the Internet is just a mirror of our society. Unsurprisingly, anti-social behavior exists online just as it does offline. Perhaps though, the mirror analogy doesn’t go far enough. Rather, the Internet is more like a magnifying glass, constantly refocusing our attention on all the horrible aspects of the human condition.

Omegle, the talk-to-random-strangers precursor to Chatroulette, might be that magnifying glass, intensifying our urge to do something about awful content.

Unfortunately, in our quest for a solution, we often skip a step, jumping to Section 230—the law that shields websites from liability for third-party content—instead of thinking carefully about the scalable, improvable, and measurable strides to be made through effective content moderation efforts.

Smaller companies make for excellent content moderation case studies, especially edgier companies like Omegle. It’s no surprise that Omegle is making a massive comeback. After 100+ days of quarantine, anything that recreates even a semblance of interaction with humans outside one’s own household is absolutely enticing. And that’s just what Omegle offers. For those who are burnt out on monotonous Zoom “coffee chats,” Omegle grants just the right amount of spontaneity and nuanced human connection that we used to enjoy before “social distancing” became a household phrase.

Of course, it also offers a whole lot of dicks.

When I was a teen, Omegle was a sleepover staple. If you’re unfamiliar, Omegle offers two methods of randomly connecting with strangers on the Internet: text or video. Both are self-explanatory. Text mode pairs two anonymous strangers in a chat room whereas video mode pairs two anonymous strangers via their webcams.

Whether you’re on text or video, there’s really no telling what kinds of terrible content—and people—you’ll encounter. That’s an inevitable and routine consequence of online anonymity. While the site might satisfy some of our deepest social cravings, it might also expose us to some incredibly unpleasant surprises outside the watered-down, sheltered online experiences provided to us by big tech. Graphic pornography, violent extremism, hate speech, child predators, CSAM, sex trafficking, etc., are all fair game on Omegle: truly awful content that has always existed in the offline world, now magnified by the unforgiving, unfiltered, use-at-your-own-risk service.

Of course, as with any site that exposes us to the harsh realities of the offline world, critics are quick to blame Section 230. Efforts to curtail bad behavior online usually start with calls to amend Section 230.

At least to Section 230’s critics, the idea is simple: get rid of Section 230 and the awful content will follow. Their reason, as I understand it, is that websites will then “nerd harder” to eliminate all awful content so they won’t be held liable for it. Some have suggested the same approach for Omegle.

Obvious First Amendment constraints aside (because remember, the First Amendment protects a lot of the “lawful but awful content,” like pornography, that exists on Omegle’s service), what would happen to Omegle if Section 230 were repealed? Rather, what exactly is Omegle supposed to do?

For starters, Section 230 excludes protection for websites that violate federal criminal law. So Omegle would remain on the hook, with or without Section 230, if it actively facilitated the transmission of illegal content such as child pornography. No change there.

But per decisions like Herrick v. Grindr, Dyroff v. Ultimate Software, and Roommates.com, it is well understood that Section 230 crucially protects sites like Omegle that merely facilitate user-to-user communication without materially contributing to the unlawfulness of the third-party content. Hence, even though there is an unfortunate reality in which nine-year-olds might get randomly paired with sexual predators, Omegle doesn’t encourage or materially contribute to that awful reality. So Omegle is afforded Section 230 protection.

Without Section 230, Omegle doesn’t have a lot of options as a site dedicated to connecting strangers on the fly. For example, the site doesn’t even have a reporting mechanism like its big tech counterparts. This is probably for two reasons: (1) the content on Omegle is ephemeral, so by the time it’s reported, the victim and the perpetrator have likely moved on and the content has disappeared; and (2) it would be virtually impossible for Omegle to issue suspensions because Omegle users don’t have dedicated accounts. In fact, the only option Omegle has for repeat offenders is a permanent IP ban. Such an option is usually considered so extreme that it’s reserved for only the most heinous offenders.

There are a few things Omegle could do to reduce its liability in a 230-less world. It might consider requiring users to have dedicated handles. It’s unclear, though, whether account creation would truly curb the dissemination of awful content. Perhaps Omegle could act on the less heinous offenders, but banned, suspended, or muted users could always just generate new handles. Plus, where social media users risk losing their content, subscribers, and followers, Omegle users realistically have nothing to lose. So generating a new handle is trivial, leaving Omegle with only the nuclear IP ban.

Perhaps Omegle could implement some sort of traditional reporting mechanism. But reporting mechanisms are only effective if the service has the resources to properly respond to and track issues. That means hiring more human moderators to analyze the contextually tricky cases, and hiring more engineers to stand up robust internal tooling to manage reporting queues and track repeat offenders.

For Omegle, implementing a reporting mechanism might just be doing something for the sake of doing something. For traditional social media companies, a reporting mechanism ensures that violating content is removed and the content provider is appropriately reprimanded. Neither of those goals is particularly relevant to Omegle’s use case. The only thing a reporting mechanism might accomplish is helping Omegle track pernicious IP addresses. Omegle could set up an internal tracking system that applies strikes to each IP address before the address is sanctioned, as sketched below. But if the pernicious user can simply switch to a new IP address and continue propagating abuse, the entire purpose of the robust reporting mechanism is moot.
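To make the strike idea concrete, here is a minimal sketch of what such a tracker might look like. To be clear, this is purely illustrative: Omegle’s internal tooling, if any exists, is not public, and every name, threshold, and expiry window below is a hypothetical assumption on my part.

```python
from collections import defaultdict
import time

# Hypothetical sketch of a strike-and-sanction tracker; not Omegle's
# actual tooling. The limit and expiry window are invented values.
STRIKE_LIMIT = 3                # strikes before an IP is banned (assumed)
STRIKE_TTL_SECONDS = 7 * 86400  # strikes expire after a week (assumed)

_strikes = defaultdict(list)    # ip -> list of strike timestamps
_banned = set()                 # ips that have exceeded the limit

def record_strike(ip: str) -> bool:
    """Log a strike against an IP; return True if it is now banned."""
    now = time.time()
    # Drop expired strikes so one old offense doesn't linger forever.
    _strikes[ip] = [t for t in _strikes[ip] if now - t < STRIKE_TTL_SECONDS]
    _strikes[ip].append(now)
    if len(_strikes[ip]) >= STRIKE_LIMIT:
        _banned.add(ip)
    return ip in _banned

def is_banned(ip: str) -> bool:
    """Check an incoming connection against the ban list."""
    return ip in _banned
```

Even this toy version exposes the problem: the moment an abuser rotates to a fresh address via a VPN, proxy, or mobile carrier, the strike count resets to zero, while an innocent user who later inherits a banned address stays locked out.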

Further, reporting mechanisms are great for victimized users who might seek an immediate sense of catharsis after encountering abusive content. But when the victim’s interaction with the abusive content and user is fleeting, the incentive to report is dubious as well.

All of this is to drive home the point that there is no such thing as a one-size-fits-all approach to content moderation. Even something as simple as giving users an option to report might be completely out of scope depending on the company’s size, resources, bandwidth, and objectives.

Another suggestion is that Omegle simply stop allowing children to be paired with sexual predators. This would require Omegle to (1) perform age verification on all of its users, with privacy as the major trade-off (not to mention the obvious problem that it may not even work; nothing really stops a teen from stealing and uploading a parent’s credit card or license), and (2) require all users to prove they aren’t sexual predators (???), an impossible and invasive task for a tiny Internet company.

Theoretically, Omegle could pre-screen all content and users. Such an approach would require an immense team of human content moderators, which is incredibly expensive for a website with an estimated annual revenue of less than $1 million and fewer than 10 employees. Plus, it would destroy the service’s entire point. The reason Omegle hasn’t been swallowed up by tech incumbents is that it offers an interesting online experience completely unique from Google, Facebook, and Twitter. Pre-screening might dilute that experience.

Another extreme solution might be to just strip out anonymity entirely and require all users to register all of their identifying information with the service. The obvious trade-off: most users would probably never return.

Clearly, none of these options is productive or realistic for Omegle, and all of them are consequences of attacking the awful content problem via Section 230.

Without any amendments to Section 230, Omegle has actually taken a few significant steps to improve its service. For example, Omegle now has an 18+ adult “unmoderated section,” in which users are first warned about sexual content and required to acknowledge that they’re 18 or older before entering. Additionally, Omegle clarifies that the “regular” video section is monitored and moderated to the best of its abilities. Lastly, Omegle recently added a “College student chat,” which verifies students via their .edu email addresses. Of course, to use any of Omegle’s features, a user must be 18+, or 13+ with parental permission.
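The .edu gating, incidentally, is the most technically straightforward of these steps. The sketch below shows roughly what a first-pass check might look like; it is my own assumption, not Omegle’s actual code, and a real flow would also need to email the address a confirmation link, since merely typing a .edu address proves nothing by itself.

```python
import re

# Hypothetical illustration of ".edu" gating; not Omegle's actual code.
EDU_PATTERN = re.compile(r"^[^@\s]+@(?:[\w-]+\.)+edu$", re.IGNORECASE)

def looks_like_edu_address(email: str) -> bool:
    """First-pass filter: does the address sit under a .edu domain?"""
    return EDU_PATTERN.match(email) is not None

assert looks_like_edu_address("student@scu.edu")
assert not looks_like_edu_address("student@edu.example.com")
```

Even then, a verified .edu address only proves university affiliation at sign-up; it says nothing about how the holder will behave once inside.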

The “unmoderated section” is an ingenious example of a “do better” approach for a service that’s strapped for content moderation options. Omegle’s employees likely know that a primary use case of the service is sex. By partitioning the service, Omegle might drastically cut down on the amount of unsolicited sexual content encountered by both adult and minor users of the regular service, without much interruption to the service’s overall value-add. These experiments in mediating the user-to-user experience can only improve from here. Thanks to Section 230, websites like Omegle are increasingly free to pursue such experiments, to their users’ benefit.

But repealing Section 230 leaves sites like Omegle with one option: exit the market.

I’m not allergic to conversations about how the market can self-correct these types of services and whether they should be supported by the market at all. Maybe sites like Omegle—that rely on their users to not be awful to each other as a primary method of content moderation—are not suitable for our modern day online ecosystem.

There’s a valid conversation to be had within technology policy and Trust and Safety circles about websites like Omegle and whether the social good they provide outweighs the harms they might indirectly cater to. Perhaps sites like Omegle should exit the market. However, that’s a radically different conversation; one that inquires into whether current innovations in content moderation support sites like Omegle, and whether such sites truly have no redeemable qualities worth preserving in the first place. That’s an important conversation; one that shouldn’t involve speculating about Section 230’s adequacy.

Jess Miers is a third-year law student at Santa Clara University School of Law and a Legal Policy Specialist at Google. Her scholarship primarily focuses on Section 230 and content moderation. Opinions are her own and do not represent Google.

Companies: omegle


Comments on “Section 230 Isn't Why Omegle Has Awful Content, And Getting Rid Of 230 Won't Change That”

Anonymous Coward says:

Maybe sites like Omegle—that rely on their users to not be awful to each other as a primary method of content moderation—are not suitable for our modern day online ecosystem.

Maybe night clubs full of strangers looking for sex ought to have been banned, but they weren’t, because what happened inside was not reported on very often, and they weren’t owned by one company.

Anonymous Anonymous Coward (profile) says:

Re: Re: Re:

If a fight starts, and the management doesn’t control it well enough, and some bystander gets hurt, you can bet some insurance company will sure try. Then there are the dram shop laws, which can have a devastating impact not only on an establishment but on its employees as well. There are some things in meatspace where liability is different, but they aren’t limited to speech, like the Internet is.

Celyxise (profile) says:

Re: Re: Re: Re:

And in these examples, the nightclub would only be liable for its own actions: if the staff was aware of the fight and chose to do nothing, or if the staff failed to follow policy regarding the dram shop laws, like checking ID or overserving a customer. These are actions or inactions of the nightclub and its staff, and they show that the protections provided by Section 230 parallel those of a meatspace venue.

A nightclub is not liable for a fight breaking out, unless it facilitated the fight or it refused to respond after being notified.
A website is not liable for illegal content on its site, unless it facilitated its creation or it refused to take down the content after being notified.

Anonymous Coward says:

Why, with all this concern, haven’t cars been banned yet? They tick all the social hot-buttons:

  • Cause Climate Change(TM) (cooling or warming, depending on the date of the press release)
  • Use Roads (building of which is apparently racist, although I’m not quite sure how)
  • Cause Deaths (not so many as recreational toxins or Covid-19, but more than recreational-toxin-distributing-networks and the milder strains of influenza, and FAR more than police)
  • Enable disreputable social contacts of all SORTS of kinds (sex, violence, rude gestures or language, noise pollution, political bumper stickers)
  • And if there isn’t a rumor that car use causes autism in unvaccinated children, can’t someone just go start it?

LET’S MAKE THIS HAPPEN!

"1865: CDC says guns don’t cause lead poisoning. 2010: CDC says automobiles don’t attract and focus 5G cell phone rays to cause premature aging."

PaulT (profile) says:

Re: Re:

"Why, with all this concern, haven’t cars been banned yet?"

Because people who apply two seconds of intelligent thought to the issue, rather than attacking strawmen as you have done, understand that real life is a compromise. While other modes of transportation are preferable, cities in the US have been designed in such a way that movement without a car is next to impossible, and the need to work and eat outweighs the other factors.

MathFox says:

Private chat

As I understand it, Omegle offers a private chat service. Moderation goes against the concept of private chat, as there will be a moderator "listening in." (Maybe not all the time, but the option is there.)

If I compare it with email: my ISP promises to look at my email only when that’s required for troubleshooting. So Omegle could say: it’s a private chat, we will not read or watch, so we can’t moderate. Leave the chat if you don’t like the way the "conversation" is going.
I think that would also pass legal scrutiny without Section 230; a bar owner is not responsible for what his visitors say in a private conversation.

Anonymous Coward says:

This problem has already been solved by the commercial sector.

Internet content censoring services are available to the consumer for a fee. The beef here is that consumers don’t want to pay for it.

Nobody is forcing anybody to observe uncensored content. Uncensored content is simply cheaper, because censorship is labor intensive.

The economic model is inconvenient for some. But that is always the case. That doesn’t give the neighborhood PTA the right to restrict speech outside of the schoolroom.
