Texas Court Gets It Right: Dumps Texas's Social Media Moderation Law As Clearly Unconstitutional

from the nicely-done dept

Back in June we reported on how Florida’s social media moderation bill was tossed out as unconstitutional in a Florida court. The ruling itself was a little bit weird, but an easy call on 1st Amendment grounds. It was perhaps not surprising, but still stupid, to see Texas immediately step up and propose its own version of such a bill, which was signed in September. We again predicted that a court would quickly toss it out as unconstitutional.

And that’s exactly what has happened.

There were some whispers and concerns that Texas’ law was craftier than the Florida law, and that parts of it might survive, but, nope. And this ruling is actually more thorough and clearer than the slightly jumbled Florida ruling. It’s chock full of good quotes. The only thing that sucks about this ruling, honestly, is that Texas is definitely going to appeal it to the 5th Circuit Court of Appeals, and the 5th Circuit is the craziest of circuits and seems, by far, the most likely to ignore basic 1st Amendment concepts in favor of some weird Trumpist political grandstanding.

However, for this brief shining moment, let’s celebrate a good, clean ruling that vindicates all the points many of us have been making about just how batshit crazy the Texas law was, and how it was so blatantly an infringement on the 1st Amendment rights of websites. There are a bunch of pages wasted on proving that the trade groups who brought the lawsuit have standing, which aren’t worth rehashing here beyond saying that, yes, trade groups for internet companies have the standing to challenge this law.

From there, the ruling gets down to the heart of the matter, and it’s pretty straightforward: content moderation is the same thing as editorial discretion, and that’s clearly protected by the 1st Amendment.

Social Media Platforms Exercise Editorial Discretion Protected by the First Amendment

Judge Robert Pitman cites all the key cases here — Reno v. ACLU (which tossed out all of the CDA — minus Section 230 — as unconstitutional, but also clearly established that the 1st Amendment applies to the internet), Sorrell v. IMS Health (establishing that the dissemination of information is speech) and, perhaps most importantly, Manhattan Cmty. Access v. Halleck, the Justice Brett Kavanaugh-authored ruling we’ve highlighted many times for making it quite clear that private internet companies are free to moderate however they see fit. It also cites the key case that was instrumental to the ruling in Florida: Miami Herald v. Tornillo, which made clear the 1st Amendment protections for editorial discretion:

Social media platforms have a First Amendment right to moderate content disseminated on their platforms. See Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1932 (2019) (recognizing that “certain private entities[] have rights to exercise editorial control over speech and speakers on their properties or platforms”). Three Supreme Court cases provide guidance. First, in Tornillo, the Court struck down a Florida statute that required newspapers to print a candidate’s reply if a newspaper assailed her character or official record, a “right of reply” statute. 418 U.S. at 243. In 1974, when the opinion was released, the Court noted there had been a “communications revolution” including that “[n]ewspapers have become big business . . . [with] [c]hains of newspapers, national newspapers, national wire and news services, and one-newspaper towns [being] the dominant features of a press that has become noncompetitive and enormously powerful and influential in its capacity to manipulate popular opinion and change the course of events.” Id. at 248–49. Those concerns echo today with social media platforms and “Big Tech” all the while newspapers are further consolidating and, often, dying out. Back to 1974, when newspapers were viewed with monopolistic suspicion, the Supreme Court concluded that newspapers exercised “editorial control and judgment” by selecting the “material to go into a newspaper,” deciding the “limitations on the size and content of the paper,” and deciding how to treat “public issues and public officials—whether fair or unfair.” Id. at 258. “It has yet to be demonstrated how governmental regulation of this crucial process can be exercised consistent with First Amendment guarantees of a free press as they have evolved to this time.”

There’s also a fun bit for all the very silly people who keep insisting that social media websites are “common carriers” which could subject them to certain restrictions. The court says “nope,” highlights how very different they are from common carriers, and moves on.

This Court starts from the premise that social media platforms are not common carriers. “Equal access obligations . . . have long been imposed on telephone companies, railroads, and postal services, without raising any First Amendment issue.” United States Telecom Ass’n v. Fed. Commc’ns Comm’n, 825 F.3d 674, 740 (D.C. Cir. 2016). Little First Amendment concern exists because common carriers “merely facilitate the transmission of speech of others.” Id. at 741. In United States Telecom, the Court added broadband providers to its list of common carriers. Id. Unlike broadband providers and telephone companies, social media platforms “are not engaged in indiscriminate, neutral transmission of any and all users’ speech.” Id. at 742. User-generated content on social media platforms is screened and sometimes moderated or curated. The State balks that the screening is done by an algorithm, not a person, but whatever the method, social media platforms are not mere conduits. According to the State, our inquiry could end here, with Plaintiffs not needing to prove more to show they engage in protected editorial discretion. During the hearing, the Court asked the State, “[T]o what extent does a finding that these entities are common carriers, to what extent is that important from your perspective in the bill’s ability to survive a First Amendment challenge?” (See Minute Entry, Dkt. 47). Counsel for the State responded, “[T]he common carriage doctrine is essential to the First Amendment challenge. It’s why it’s the threshold issue that we’ve briefed . . . . It dictates the rest of this suit in terms of the First Amendment inquiry.” (Id.). As appealing as the State’s invitation is to stop the analysis here, the Court continues in order to make a determination about whether social media platforms exercise editorial discretion or occupy a purgatory between common carrier and editor.

There’s also a short footnote totally dismissing the fact that the Texas bill, HB 20, tries to just outright declare social media sites to be common carriers. That’s not how any of this works.

HB 20’s pronouncement that social media platforms are common carriers… does not impact this Court’s legal analysis.

The judge briefly notes that social media is obviously different in many ways from newspapers, and that AI-based moderation is certainly a technological differentiator, but then brings it back around to basic principles: it’s still all editorial discretion.

This Court is convinced that social media platforms, or at least those covered by HB 20, curate both users and content to convey a message about the type of community the platform seeks to foster and, as such, exercise editorial discretion over their platform’s content.

In fact, Texas legislators’ and the governor’s own hubris helped sink this bill: they admitted, in the bill itself and in quotes about the bill, that this is all about editorial discretion.

Indeed, the text of HB 20 itself points to social media platforms doing more than transmitting communication. In Section 2, HB 20 recognizes that social media platforms “(1) curate[] and target[] content to users, (2) place[] and promote[] content, services, and products, including its own content, services, and products, (3) moderate[] content, and (4) use[] search, ranking, or other algorithms or procedures that determine results on the platform.” Tex. Bus. & Com. Code § 120.051(a)(1)–(4). Finally, the State’s own basis for enacting HB 20 acknowledges that social media platforms exercise editorial discretion. “[T]here is a dangerous movement by social media companies to silence conservative viewpoints and ideas.” Governor Abbott Signs Law Protecting Texans from Wrongful Social Media Censorship, OFFICE OF THE TEX. GOVERNOR (Sept. 9, 2021), https://gov.texas.gov/news/post/governorabbott-signs-law-protecting-texans-from-wrongful-social-media-censorship. “Texans must be able to speak without being censored by West Coast oligarchs.” Bryan Hughes (@SenBryanHughes), TWITTER (Aug. 9, 2021, 4:34 PM), https://twitter.com/SenBryanHughes/status/1424846466183487492. Just like the Florida law, a “constant theme of [Texas] legislators, as well as the Governor . . . , was that the [platforms’] decisions on what to leave in or take out and how to present the surviving material are ideologically biased and need to be reined in.” NetChoice, 2021 WL 2690876, at *7. Without editorial discretion, social media platforms could not skew their platforms ideologically, as the State accuses them of doing. Taking it all together, case law, HB 20’s text, and the Governor and state legislators’ own statements all acknowledge that social media platforms exercise some form of editorial discretion, whether or not the State agrees with how that discretion is exercised.

And then, once it’s clear that moderating is the same as editorial discretion, it’s easy to see how the bill’s restrictions are a clear 1st Amendment problem. The ruling shows this, first, by highlighting the impossible choices the bill puts in front of social media companies, using the example of content about Nazis.

The State claims that social media platforms could prohibit content categories “such as ‘terrorist speech,’ ‘pornography,’ ‘spam,’ or ‘racism’” to prevent those content categories from flooding their platforms. (Resp. Prelim. Inj. Mot., Dkt. 39, at 21). During the hearing, the State explained that a social media platform “can’t discriminate against users who post Nazi speech . . . and [not] discriminate against users who post speech about the antiwhite or something like that.” (See Minute Entry, Dkt. 47). Plaintiffs point out the fallacy in the State’s assertion with an example: a video of Adolf Hitler making a speech, in one context the viewpoint is promoting Nazism, and a platform should be able to moderate that content, and in another context the viewpoint is pointing out the atrocities of the Holocaust, and a platform should be able to disseminate that content. (See id.). HB 20 seems to place social media platforms in the untenable position of choosing, for example, to promote Nazism against its wishes or ban Nazism as a content category. (Prelim. Inj. Mot., Dkt. 12, at 29). As YouTube put it, “YouTube will face an impossible choice between (1) risking liability by moderating content identified to violate its standards or (2) subjecting YouTube’s community to harm by allowing violative content to remain on the site.”

And thus:

HB 20’s prohibitions on “censorship” and constraints on how social media platforms disseminate content violate the First Amendment.

Why?

HB 20 compels social media platforms to significantly alter and distort their products. Moreover, “the targets of the statutes at issue are the editorial judgments themselves” and the “announced purpose of balancing the discussion—reining in the ideology of the large social-media providers—is precisely the kind of state action held unconstitutional in Tornillo, Hurley, and PG&E.” Id. HB 20 also impermissibly burdens social media platforms’ own speech. Id. at *9 (“[T]he statutes compel the platforms to change their own speech in other respects, including, for example, by dictating how the platforms may arrange speech on their sites.”). For example, if a platform appends its own speech to label a post as misinformation, the platform may be discriminating against that user’s viewpoint by adding its own disclaimer. HB 20 restricts social media platforms’ First Amendment right to engage in expression when they disagree with or object to content.

At this point, the court dismisses, in a footnote, the two cases that very silly people always bring up: Pruneyard and Rumsfeld. Pruneyard is the unusual shopping mall case, which has very limited reach, and Rumsfeld is about law schools allowing or not allowing military recruiters on campus. Supporters of efforts to force websites to host speech point to both cases as some sort of “proof” that it’s okay to compel speech, but both are very narrowly focused, and anyone relying on either is making a bad faith argument: “well, in these cases you could compel speech, so in this case obviously you can as well.” But the judge isn’t having any of it.

The Court notes that two other Supreme Court cases address this topic, but neither applies here. PruneYard Shopping Center v. Robins is distinguishable from the facts of this case. 447 U.S. 74 (1980). In PruneYard, the Supreme Court upheld a California law that required a shopping mall to host people collecting petition signatures, concluding there was no “intrusion into the function of editors” since the shopping mall’s operation of its business lacked an editorial function. Id. at 88. Critically, the shopping mall did not engage in expression and “the [mall] owner did not even allege that he objected to the content of the [speech]; nor was the access right content based.” PG&E, 475 U.S. at 12. Similarly, Rumsfeld v. Forum for Academic & Institutional Rights, Inc. has no bearing on this Court’s holding because it did not involve government restrictions on editorial functions. 547 U.S. 47 (2006). The challenged law required schools that allowed employment recruiters on campus to also allow military employment recruiters on campus—a restriction on “conduct, not speech.” Id. at 62, 65. As the Supreme Court explained, “accommodating the military’s message does not affect the law schools’ speech, because the schools are not speaking when they host interviews and recruiting receptions.”

Even more importantly, the court rejects the transparency requirements in HB 20. Again, this was a part that some people thought might slide through and be left in place. We’ve discussed, multiple times, how transparency on these issues is important, but also how mandated transparency actually creates serious problems. The court, thankfully, agrees.

To pass constitutional muster, disclosure requirements like these must require only “factual and noncontroversial information” and cannot be “unjustified or unduly burdensome.” NIFLA, 138 S. Ct. at 2372. Section 2’s disclosure and operational provisions are inordinately burdensome given the unfathomably large numbers of posts on these sites and apps. For example, in three months in 2021, Facebook removed 8.8 million pieces of “bullying and harassment content,” 9.8 million pieces of “organized hate content,” and 25.2 million pieces of “hate speech content.” (CCIA Decl., Dkt. 12-1, at 15). During the last three months of 2020, YouTube removed just over 2 million channels and over 9 million videos because they violated its policies. (Id. at 16). While some of those removals are subject to an existing appeals process, many removals are not. For example, in a three-month period in 2021, YouTube removed 1.16 billion comments. (YouTube Decl., Dkt. 12-3, at 23–24). Those 1.16 billion removals were not appealable, but, under HB 20, they would have to be. (Id.). Over the span of six months in 2018, Facebook, Google, and Twitter took action on over 5 billion accounts or user submissions—including 3 billion cases of spam, 57 million cases of pornography, 17 million cases of content regarding child safety, and 12 million cases of extremism, hate speech, and terrorist speech. (NetChoice Decl., Dkt. 12-2, at 8). During the State’s deposition of Neil Christopher Potts (“Potts”), who is Facebook’s Vice President of Trust and Safety Policy, Potts stated that it would be “impossible” for Facebook “to comply with anything by December 1, [2021]. . . [W]e would not be able to change systems in that nature. . . . I don’t see a way that we would actually be able to go forward with compliance in a meaningful way.” (Potts Depo., Dkt. 39-2, at 2, 46). Plaintiffs also express a concern that revealing “algorithms or procedures that determine results on the platform” may reveal trade secrets or confidential and competitively-sensitive information. (Id. at 34) (quoting Tex. Bus. & Com. Code § 120.051(a)(4)).

The Section 2 requirements burden First Amendment expression by “forc[ing] elements of civil society to speak when they otherwise would have refrained.” Washington Post v. McManus, 944 F.3d 506, 514 (4th Cir. 2019). “It is the presence of compulsion from the state itself that compromises the First Amendment.” Id. at 515. The provisions also impose unduly burdensome disclosure requirements on social media platforms “that will chill their protected speech.” NIFLA, 138 S. Ct. at 2378. The consequences of noncompliance also chill the social media platforms’ speech and application of their content moderation policies and user agreements. Noncompliance can subject social media platforms to serious consequences. The Texas Attorney General may seek injunctive relief and collect attorney’s fees and “reasonable investigative costs” if successful in obtaining injunctive relief. Id. § 120.151.

I’ll just note that we mentioned that very Washington Post v. McManus case earlier this week, in calling out the Washington Post’s hypocrisy in calling for mandatory disclosure rules for internet companies…

And Judge Pitman isn’t done yet with the constitutional problems of HB 20.

HB 20 additionally suffers from constitutional defects because it discriminates based on content and speaker. First, HB 20 excludes two types of content from its prohibition on content moderation and permits social media platforms to moderate content: (1) that “is the subject of a referral or request from an organization with the purpose of preventing the sexual exploitation of children and protecting survivors of sexual abuse from ongoing harassment,” and (2) that “directly incites criminal activity or consists of specific threats of violence targeted against a person or group because of their race, color, disability, religion, national origin or ancestry, age, sex, or status as a peace officer or judge.” Tex. Civ. Prac. & Rem. Code § 143A.006(a)(2)–(3). When considering a city ordinance that applied to ‘“fighting words’ that . . . provoke violence[] ‘on the basis of race, color, creed, religion[,] or gender,”’ the Supreme Court noted that those “who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or []sexuality—are not covered.” R.A.V. v. City of St. Paul, Minn., 505 U.S. 377, 391 (1992). As Plaintiffs argue, the State has “no legitimate reason to allow the platforms to enforce their policies over threats based only on . . . favored criteria but not” other criteria like sexual orientation, military service, or union membership. (Prelim. Inj. Mot., Dkt. 12, at 35–36); see id.

There’s also some good language in here for those who keep insisting that setting (often arbitrary) size barriers or carveouts in these laws is perfectly fine. Not so, says the court, if they lead to a discriminatory impact on venues for speech:

HB 20 applies only to social media platforms of a certain size: platforms with 50 million monthly active users in the United States. Tex. Bus. & Com. Code § 120.002(b). HB 20 excludes social media platforms such as Parler and sports and news websites. (See Prelim. Inj. Mot., Dkt. 12, at 17). During the regular legislative session, a state senator unsuccessfully proposed lowering the threshold to 25 million monthly users in an effort to include sites like “Parler and Gab, which are popular among conservatives.” Shawn Mulcahy, Texas Senate approves bill to stop social media companies from banning Texans for political views, TEX. TRIBUNE (Mar. 30, 2021), https://www.texastribune.org/2021/03/30/texas-social-media-censorship/. “[D]iscrimination between speakers is often a tell for content discrimination.” NetChoice, 2021 WL 2690876, at *10. The discrimination between speakers has special significance in the context of media because “[r]egulations that discriminate among media, or among different speakers within a single medium, often present serious First Amendment concerns.” Turner Broad. Sys., Inc. v. F.C.C., 512 U.S. 622, 659 (1994). The record in this case confirms that the Legislature intended to target large social media platforms perceived as being biased against conservative views and the State’s disagreement with the social media platforms’ editorial discretion over their platforms. The evidence thus suggests that the State discriminated between social media platforms (or speakers) for reasons that do not stand up to scrutiny.

And, of course, everyone’s favorite: HB 20 is unconstitutionally vague.

First, Plaintiffs take issue with HB 20’s definition for “censor:” “block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.” Tex. Civ. Prac. & Rem. Code § 143A.001(1). Plaintiffs argue that requiring social media platforms to require “equal access or visibility to” content is “hopelessly indeterminate.” (Prelim. Inj. Mot., Dkt. 12, at 37) (quoting id.). The Court agrees. A social media platform is not a static snapshot in time like a hard copy newspaper. It strikes the Court as nearly impossible for a social media platform—that has at least 50 million users—to determine whether any single piece of content has “equal access or visibility” versus another piece of content given the huge numbers of users and content. Moreover, this requirement could “prohibit[] a social media platform from” displaying content “in the proper feeds”

There are some other drafting oddities that the Judge calls out, including this one:

HB 20 empowers the Texas Attorney General to seek an injunction not just against violations of the statute but also “potential violations.” Tex. Civ. Prac. & Rem. Code § 143A.008. Unlike other statutes that specify that the potential violation must be imminent, HB 20 includes no such qualification. See, e.g., Tex. Occ. Code § 1101.752(a) (authorizing the attorney general to seek injunctive relief to abate a potential violation “if the commission determines that a person has violated or is about to violate this chapter”). Subjecting social media platforms to suit for potential violations, without a qualification, reaches almost all content moderation decisions platforms might make, further chilling their First Amendment rights.

As in the Florida case, the court here notes that even if there were some reason under which the law should be judged under intermediate, rather than strict, scrutiny, it would still fail.

HB 20 imposes content-based, viewpoint-based, and speaker-based restrictions that trigger strict scrutiny. Strict scrutiny is satisfied only if a state has adopted ‘“the least restrictive means of achieving a compelling state interest.”’ Americans for Prosperity Found. v. Bonta, 141 S. Ct. 2373, 2383, 210 L. Ed. 2d 716 (2021) (quoting McCullen v. Coakley, 573 U.S. 464, 478 (2014)). Even under the less rigorous intermediate scrutiny, the State must prove that HB 20 is ‘“narrowly tailored to serve a significant government interest.’” Packingham v. North Carolina, 137 S. Ct. 1730, 1736 (2017) (quoting McCullen, 573 U.S. at 477). The proclaimed government interests here fall short under both standards.

It’s not even a difficult call. It’s the kind of “duh” explanation that made it easy for us to say upfront that this law was so obviously unconstitutional:

The State’s first interest fails on several accounts. First, social media platforms are privately owned platforms, not public forums. Second, this Court has found that the covered social media platforms are not common carriers. Even if they were, the State provides no convincing support for recognizing a governmental interest in the free and unobstructed use of common carriers’ information conduits. Third, the Supreme Court rejected an identical government interest in Tornillo. In Tornillo, Florida argued that “government has an obligation to ensure that a wide variety of views reach the public.” Tornillo, 418 U.S. at 247–48. After detailing the “problems related to government-enforced access,” the Court held that the state could not commandeer private companies to facilitate that access, even in the name of reducing the “abuses of bias and manipulative reportage [that] are . . . said to be the result of the vast accumulations of unreviewable power in the modern media empires.” Id. at 250, 254. The State’s second interest—preventing “discrimination” by social media platforms—has been rejected by the Supreme Court. Even given a state’s general interest in anti-discrimination laws, “forbidding acts of discrimination” is “a decidedly fatal objective” for the First Amendment’s “free speech commands.”…

And, the court practically laughs out loud at the idea that HB 20 was “narrowly tailored.”

Even if the State’s purported interests were compelling and significant, HB 20 is not narrowly tailored. Sections 2 and 7 contain broad provisions with far-reaching, serious consequences. When reviewing the similar statute passed in Florida, the Northern District of Florida found that that statute was not narrowly tailored “like prior First Amendment restrictions.” NetChoice, 2021 WL 2690876, at *11 (citing Reno, 521 U.S. at 882; Sable Commc’n of Cal., Inc. v. FCC, 492 U.S. 115, 131 (1989)). Rather, the court colorfully described it as “an instance of burning the house to roast a pig.” Id. This Court could not do better in describing HB 20.

End result: injunction granted, the law does not go into effect today as originally planned. Texas will undoubtedly now appeal, and we can only hope the 5th Circuit doesn’t muck things up, as it’s been known to do. Depending on how this plays out, as well as how the 11th Circuit handles the Florida case, it’s possible this could hit the Supreme Court down the road. Hopefully, both the 11th and the 5th actually take heed of Justice Kavanaugh’s words in the Halleck case, and choose to uphold both district court rulings — and we can get past this silly Trump-inspired moral panic attack on the 1st Amendment rights of social media platforms — the very same rights that enable them to create spaces for us to speak and share our own ideas.

Companies: ccia, netchoice


Comments on “Texas Court Gets It Right: Dumps Texas's Social Media Moderation Law As Clearly Unconstitutional”

Koby says:

Officially They're A Publisher

That’s fine if a social media company wants to editorialize. You can always choose to be a publisher instead of a platform. The next step now is to hold corporations accountable for their editorials. And the company is responsible for their editorials, not the users. Section 230 reform will be the key.

Mike Masnick says:

Re: Officially They're A Publisher

You can always choose to be a publisher instead of a platform.

Koby, Koby, Koby. This is how it has always been. The law has always been that you are a publisher of the content you yourself create, and you are an interactive computer service for hosting content other people create.

The next step now is to hold corporations accountable for their editorials

You can’t. Because that’s what the 1st Amendment forbids. Editorials are protected speech, you silly, silly troll.

And the company is responsible for their editorials, not the users.

A company is already responsible for its editorials, but can’t be punished by the state because of the 1st Amendment. I mean this is fundamental stuff…

Section 230 reform will be the key.

Section 230 has fuck all to do with it. It’s the 1st Amendment you hate (and don’t understand).

Koby says:

Re: Re: Officially They're A Publisher

and you are an interactive computer service for hosting content other people create.

But now when you engage in moderation, it’s editorialization.

You can’t. Because that’s what the 1st Amendment forbids. Editorials are protected speech

The first amendment does NOT protect against defamation or contractual obligations amongst private citizens. Now that moderation is officially recognized as editorialization on behalf of the corporation, we can start to get somewhere.

Section 230 has fuck all to do with it. It’s the 1st Amendment you hate (and don’t understand).

Section 230 is the fig leaf behind which corporations claim that they are not editorializing. It’s free speech that you hate, and you understand that you will use corporations to accomplish the censorship for which government dreams it could perform directly.

Rocky says:

Re: Re: Re: Officially They're A Publisher

But now when you engage in moderation, it’s editorialization.

You are really hung up on that, aren’t you? That’s because you removed it from the context of the judgement. You just saw "editorialization" in what the judge said and ignored everything else, which means you are either too stupid to actually understand what the judge said or you know what he meant but voicing that would mean your argument is null and void.

Section 230 is the fig leaf behind which corporations claim that they are not editorializing. It’s free speech that you hate, and you understand that you will use corporations to accomplish the censorship for which government dreams it could perform directly.

Please tell us how you think it would work without section 230? Do you really think your stupid shit will be tolerated at all? There will be no moderation for stupid people like you, you will just be banned at the merest hint that someone doesn’t like what you say. Or do you really believe you can force yourself onto others against their will? There’s a word for that kind of people.

Btw Koby the dishonest, you have yet to tell us who have been "censored" and what they said. Coward.

Mike Masnick says:

Re: Re: Re: Officially They're A Publisher

But now when you engage in moderation, it’s editorialization.

Um. It has always been so, Koby.

The first amendment does NOT protect against defamation or contractual obligations amongst private citizens.

No one has argued otherwise. And 230 didn’t and doesn’t change any of that.

Now that moderation is officially recognized as editorialization on behalf of the corporation, we can start to get somewhere.

It has ALWAYS been recognized as such, and editorialization is, by definition, opinion, which has always been protected by the 1st Amendment. 230 has nothing to do with it. Opinion cannot be defamation.

Section 230 is the fig leaf behind which corporations claim that they are not editorializing

This is just flat out false, Koby. You are lying.

It’s free speech that you hate

Dude. You’re the one out here denying the 1st amendment, so fuck off.

That One Guy says:

Every accusation a confession

Having Koby of all people accusing someone of hating free speech is like an arsonist standing on the porch of a house they don’t own and just torched accusing the people who are trying to save the house by putting the fire out of not respecting the property rights of others.

Scary Devil Monastery says:

Re: Every accusation a confession

"Having Koby of all people accusing someone of hating free speech is like an arsonist…"

I’m with the AC in that reply comparing it to the rapist who thinks the girl saying "No!" should be outlawed.
Because Koby’s combined rhetoric, no matter the "rational" tone, is that of someone who thinks a person’s right to their own property or body infringes on the entitlements of those coveting said body or property.

Anonymous says:

Re: Re: Re: Officially They're A Publisher

It’s free speech that you hate

Koby, seriously, are you a rapist?

You seem to think that you have some god given right to use somebody else’s private property. Those properties have every right in the world to say "NO" to you using their platforms.

The fact that you seem to think that the gov’t should force <social media site> to allow your speech is akin to somebody thinking they have a right to have sex with anybody they want, regardless if that person says "NO" to you.

In other words, you hate the first amendment because it won’t allow you to force your way onto somebody else’s platform.

So, how many times have you raped somebody?

Anonymous says:

Re: Re: Re: Officially They're A Publisher

As the pet Singaporean here, I FUCKING KNOW when actual censorship happens.

You don’t hate censorship. What you really hate is the First Amendment, since it censors the GARBAGE you defend.

By the way, do you hate free speech and the freedom of association because it keeps debunking HOLOCAUST DENIAL, COVID DISINFORMATION/MISINFORMATION AND WHITE SUPREMACY? A yes or no will suffice, you fucking Nazi.

Scary Devil Monastery says:

Re: Re: Re: Officially They're A Publisher

"But now when you engage in moderation, it’s editorialization."

Every court to make a ruling on this has been very clear that no, it’s not, Koby.

The only mark you keep leaving on this forum is the by now very heavily demonstrated fact that those who are opposed to 230 can’t make a single argument against it without first lying through their teeth.

Anonymous says:

Re: Officially They're A Publisher

You’re never getting your account on that site back. No matter how hard you cry. Stomp those feet. Scream at the sky. Go to court. Whatever.

It’s that simple.

Everyone who has a policy of not acting like you is a publisher by your bad faith arguments. You’re not as smart as you think you are.

That’s all

Anonymous says:

Re: Officially You're Dogshit

Shit, you should have warned me you were about to say something stupid. I bought a new kick Koby in the balls booth, but it’s not set up yet. Also there’s a bouncy castle that needs inflating. And the bbq needs at least a few hours’ notice. And I was going to hire a mascot in a giant dog turd outfit.

generateusername says:

This wouldn’t be the Court’s place to decide, but two major issues for me would be

1) Should the policymaking process of social media platforms more closely resemble the policymaking process of governments?

2) Should principles of the criminal justice system (things like due process and the right to appeal) apply to social media moderation?

For me, I would lean towards "yes" to both, but as usual, the problem is scale. Facebook/Meta says "Of course…we can’t meaningfully engage with billions of people" but right now, there is no way for users to directly submit feedback regarding policy to social media platforms. If you have an issue with a law, you can write your Congressman. If you have an issue with a platform’s policy, you’re stuck whining about it or leaving the platform entirely. Platforms have started to highlight how they work with experts and activist groups to shape their policies, but I would like to see them bring regular users into the mix as well.

There seems to be widespread support for more transparency and a better appeals process (just not as a government mandate), but even as the judgment notes, it’s a matter of scale. And honestly I’m not sure how to overcome that.

Anonymous says:

Re:

Social media in general has an easy answer to perceived censorship: go to a platform where what you want to say is accepted. If you cannot draw an audience to that platform, it says something about the views being expressed. Note, there is no law restricting you, or anybody else, to using only one social media platform, and it is reasonable for a platform to restrict what you can say there.

Rocky says:

Re:

For me, I would lean towards "yes" to both, but as usual, the problem is scale. Facebook/Meta says "Of course…we can’t meaningfully engage with billions of people" but right now, there is no way for users to directly submit feedback regarding policy to social media platforms.

But that means government regulation of things protected under the First Amendment, which is a big no.

If you have an issue with a platform’s policy, you’re stuck whining about it or leaving the platform entirely.

If I have an issue with a company or a platform’s policy, I take my business elsewhere. It’s as simple as that, because either you adhere to your principles or you don’t. And if you don’t, you will always be subject to the whims of others while you constantly whine how unfair it is.

Platforms have started to highlight how they work with experts and activist groups to shape their policies, but I would like to see them bring regular users into the mix as well.

Which users specifically? How would they be selected? In my experience most regular users don’t care as long as something works as expected. When it comes to social media platforms, it’s the vocal, very very small minority we are hearing, plus the pundits who have found an easy target to pour their vitriol on.

There seems to be widespread support for more transparency and a better appeals process (just not as a government mandate), but even as the judgment notes, it’s a matter of scale. And honestly I’m not sure how to overcome that.

The whole problem with transparency comes from the fact that all social media platforms had to take steps to curb the behavior of the assholes, the trolls and the conspiracy nuts. Without them, there would be no real need for moderation and the transparency of it. As long as those kinds of people exist there can be no real transparency, because they will abuse it to continue their anti-social behavior.

Scary Devil Monastery says:

Re:

"Should the policymaking process of social media platforms more closely resemble the policymaking process of governments?"

That depends. If you want to live in China or old Soviet Russia then yes. Otherwise you might want to retain that fundamental principle of free speech, and leave what speech is allowed or not on private property up to the owner of said property.

"Should principles of the criminal justice system (things like due process and the right to appeal) apply to social media moderation?"

Again, that depends on the owner of the private platform in question. Some might apply such a system. Most won’t.

"For me, I would lean towards "yes" to both…"

And full stop while we go over what you just irrevocably committed yourself to; namely the abolition of the concept that individuals should retain rights of speech and association within the confines of their own private property.

"…but as usual, the problem is scale."

Well, no. China manages this process just fine by making sure that a given percentage of the population has gainful work in spying on the communications and statements of the rest of the population. Neither the Great Firewall nor the censorship apparatus comes free.

The only issue is that social media platforms with slender margins won’t have the ability to hire enough censors and auditors to conveniently carry out such mediation, let alone more high-priced paralegals to carry out appeals processes. Nor does the state, of course. Shutting a thousand people out of a million out of your property is just much easier than allowing those thousand people to mediate – which will cost you a thousand times more effort than simply clicking a few buttons and running an algorithm.

"If you have an issue with a platform’s policy, you’re stuck whining about it or leaving the platform entirely."

And that’s the way every private service works. Don’t like the rules of your local bar? Whine about it powerlessly or leave. Don’t like the rules of the local mall? Whine powerlessly or leave. Don’t like the way the car dealer, green grocer, butcher or delicatessen does business, as stated upfront in their rules? Whine powerlessly or leave.

And that’s the way it SHOULD BE in a free society.

The first half of your comment is just plain Koby. False equivalence, false premise and false assumption shat out in a reasoned tone of voice. Just consider this a notice of Strike One on the "alt-right troll" test. A few more of those and we’ll just have to start referring to you as Koby v2.0.

"Platforms have started to highlight how they work with experts and activist groups to shape their policies, but I would like to see them bring regular users into the mix as well."

And several platforms may or may not be doing that, depending on which rules and routines they want set up around the guest rights they extend for their property.
I am, however, very curious as to how you’d describe a "regular user" given that an "expert" and an "activist" won’t be able to cater to their needs and desires. Are your "regular" users unable to match the expectations of civil rights activists?

To me that sounds as if what you call a "regular" user would be better described as "minority user unable to match societal expectations of conduct".

"There seems to be widespread support for more transparency and a better appeals process…"

If that’s the case then eventually major platforms will try to cater to the desires of their product…the people using that platform on a daily basis.

I would certainly personally find it quite beneficial if Facebook and Twitter would start publishing the reasons as to why given posts or users were banned – if nothing else I’m curious to see what the "anti-conservative" bias the alt-right keeps screaming about actually refers to. But my opinion – or that of anyone else, really – doesn’t matter. The owner of the platform makes the rules applicable for that platform. The market then decides whether those rules are acceptable for the platform to become and remain popular.

To even start asking the question you keep posing means you aren’t quite on board with the fundamental principles of property ownership, freedom of speech, and freedom of choice.

I would advise reading up on those concepts – because you could replace the words "private platform" with bar, private home, or, more disturbingly, people’s bodies and see just how badly a violation of these core principles infringes on basic human rights.

That One Guy says:

That's why you don't say the quiet part out loud

Gotta love how the grandstanding backfired so nicely on them: they were so eager to boast about how their law would stick it to those ‘liberal’ platforms that they forgot that the government’s not allowed to get involved in those sorts of decisions, thanks to that pesky first amendment that they hate so much.

Lostinlodos says:

Oped ahead:

Let’s start by saying what I’m not going to cover:
Forced speech is just as bad as censorship. I don’t agree with it.

Company, vs community, flagging/tagging is a problem to be addressed elsewhere. Twitter walks a fine line between directed commentary and self publishing with their company comments. Facebook does the same.
That is a concern beyond 230 and more in line with right to be wrong debates.

So let’s look at how 230 is actually a good thing.

The poster is responsible for their own actions.
Be it Facebook or Google. Algorithms or AI.

The poster is responsible for the post. To serve, link, rank, host, transmit, or supply: the sole responsible party is the source.
We can (and should) argue for better John/Jane Doe subpoena issuance.
Illegal posting should be dealt with, be it libel, slander, scams, and cecp. Hosts should, and must, help in such cases by supplying requested information to law enforcement.
But a host cannot and should not be on the hook for users any more than any other manufacturer or provider should be responsible for the actions of clientele.

230 protects minority opinions. By allowing companies to host legal speech no matter how provocative or fringe, all voices are able to be offered by those willing to host it.

That’s a very important aspect lost in censorship debates.
“The greatest champions of free speech are those who die for speech they abhor” ANC 1974
“Support of free speech mandates you support what you hate” -Flint 1989
“We disagree, but I support that” MLK (? via X-the story 1995)
“I fail to agree with Comrade Hai Kato but that is the point! Our future requires we fight with words and not with rifle or sword!” -Walentyna 1990

“It is not what he says but our reaction. To condemn a person by word is to condemn ourselves” -R J Hurst 1939

“Think not FORE the word but on them. For it is thy self that must acknowledge we can not, we must not, seek similarity! It is in difference we reach a greater place” Milton 4th 1879

“He who pleads superior is to thyne rest inferior in the eyes of all”— false Franklin 1801 (likely Hancock)

Ultimately 230 places the burden of consequences of speech on the person who opens their mouth, moves their fingers, and not on the person who owns the sidewalk.

As Augustus said ‘speak what you do. And say what you think. Or by your own sword should you be lost. It is a fool (jester) who speaks without action. But the dirt that speaks not at all! And the dirt is to stand upon!’
