Platforms, Speech And Truth: Policy, Policing And Impossible Choices

from the deplatforming-and-denialism dept

Warning 1: I’m about to talk about an issue that has a lot of nuance in it and no clear “good” answers — and it’s also one that many people have already made up their minds on one way or the other, and both sides will probably not really like at least part of what I have to say. That’s cool. You get to live your own life. But, at the very least, I hope people can acknowledge that sometimes issues are more complex than they appear, and that a nuanced discussion of them can be helpful.

Warning 2: This is a long post, so I’m going to provide a TLDR at the top (right under this, in fact), but as noted above, a part of the reason it’s long is because it’s a complex issue and there’s a lot of nuance. So I strongly advise that if your initial response to my TLDR version is “fuck you, you’re so wrong because…” maybe try reading the whole post first, and then when you go down to the comments to write out “fuck you, you’re so wrong…” you can explain yourself clearly and thoroughly and address the actual points in the post. Thanks!

TLDR: Internet sites have every right in the world to kick people off their platforms, and there’s no legal or ethical problem with that. No one’s free speech is being censored. That said, we should be at least a bit concerned about the idea that giant internet platforms get to be some sort of arbiter of what speech is okay and what speech is not, and how that can impact society more generally. But there are possible solutions to this, even if none are perfect and some may be difficult to implement, and we should explore those more thoroughly, rather than getting into screaming fights over who should or shouldn’t be allowed to use various internet platforms.

So, this post was originally going to be about the choices that Facebook and other internet platforms make concerning who is allowed on their platforms, specifically in response to an interview that Mark Zuckerberg gave back in July, in which he noted that he didn’t think Facebook should remove Holocaust deniers from its platform, saying:

I’m Jewish, and there’s a set of people who deny that the Holocaust happened.

I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think… it’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.”

This created a huge furor of people talking about trolling, Holocaust denialism, Overton windows and a bunch of other things. But it’s a complex, nuanced topic, and I was trying to write a complex, nuanced post. And just as I was getting somewhere with it… this week, a bunch of platforms, including Apple, YouTube and Facebook, removed at least some of Alex Jones’ accounts or content. This created another furor in the other direction, with people talking about deplatforming, censorship, free speech, monopoly power, and policing truth. And then when Twitter chose not to follow the lead of those other platforms, we were right back to a big furor about keeping hateful wackjob conspiracy theory assholes on your platform, and whether or not you should want to do that.

Chances are that no matter what I say, it’s going to piss off pretty much everyone, but let’s do the stupid thing and try to address a complex and extremely nuanced topic on the internet, with unflagging optimism that maybe (just maybe) people on the internet will (for a moment at least) hold back their kneejerk reactions of “good” or “bad” and try to think through the issues.

Let’s start with a few basic principles: no matter what crazy legal analysis you may have heard before, internet sites have every right to remove users for nearly any reason (there may be a few limited exceptions, but none of them apply here). Whether you like it or not (and you should actually like it), corporations do get rights, and that includes their First Amendment rights to have their sites appear how they want, along with deciding who not to associate with. On top of that, again, despite what you may have heard online about Section 230 of the CDA, platforms not only have the right to moderate what’s on their platform without legal liability, they are actually encouraged to do so by that law.

Indeed, if anyone knows this, it’s Alex Jones, since Infowars’ own terms of service makes it clear that Infowars can boot anyone it wants:

There’s a long list of rules in those terms, and then it says:

If you violate these rules, your posts and/or user name will be deleted. Remember: you are a guest here. It is not censorship if you violate the rules and your post is deleted. All civilizations have rules and if you violate them you can expect to be ostracized from the tribe.

One of the rare cases where I can say that, hey, that Alex Jones guy is absolutely right about that (and we’ll leave aside the hypocrisy about him now flipping out about other sites applying those same rules on him).

A separate point that is also important, and gets regularly ignored, is that “banning” someone from these platforms often has the opposite impact of what was intended. Depending on the circumstances, it might not quite rise to a “Streisand Effect,” but it does create a martyr, which supporters will use to double down on their belief that they’re in the right and that people are trying to “suppress the truth” or whatever. Also, sometimes it’s useful to have “bad” speech out in the open, where people can track it, understand it… and maybe even counter it. Indeed, hiding that bad speech often not only lets it fester, but dulls our ability to counter it, respond to it and understand who is spreading such info (and how widely).

So, really, the question comes down to whether or not these platforms should be removing these kinds of accounts. But, before we should even answer that question, there’s a separate question, which is: What options are there for platforms to deal with content that they disfavor? Unfortunately, many people assume that it’s a binary choice. You either keep the content up, or you take it down. But that hardly gets at the long list of possible alternatives. You can encourage good behavior and discourage bad behavior (say, with prompts if the system senses you’re doing something bad, or with reminders, or by a community calling you out for bad behavior or lots of other options). Depending on the platform, you can minimize the accessibility or findability of certain content. You can minimize the reach of certain content. You can append additional information or put a “warning flag” on content. You can “shadow ban” content. You can promote “good” content to go with any content you deem to be bad. Or you can do nothing. Or you can set things up so that your users are able to help promote or minimize good or bad content. Or you can create tools that allow your users to set their own preferences and thresholds. Or you can allow third parties to build tools that do the same thing. The list goes on and on and on.
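
To make that range concrete, here is a minimal sketch (in Python, with purely hypothetical names, scores and thresholds, not any real platform’s system) of what a graduated set of responses might look like instead of a binary keep-or-delete:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    """A few of the graduated responses described above (not an exhaustive list)."""
    DO_NOTHING = auto()
    PROMOTE_COUNTER_CONTENT = auto()   # surface "good" content alongside the bad
    APPEND_WARNING_LABEL = auto()      # add context / a warning flag
    REDUCE_REACH = auto()              # downrank, limit findability
    SHADOW_BAN = auto()
    REMOVE = auto()                    # the last resort


@dataclass
class Post:
    author_id: str
    text: str
    abuse_score: float        # 0.0-1.0, from whatever signals the platform trusts
    prior_violations: int


def choose_action(post: Post) -> Action:
    """Pick a response proportional to the signal, rather than a binary keep/delete."""
    if post.abuse_score < 0.2:
        return Action.DO_NOTHING
    if post.abuse_score < 0.5:
        return Action.PROMOTE_COUNTER_CONTENT
    if post.abuse_score < 0.7:
        return Action.APPEND_WARNING_LABEL
    if post.abuse_score < 0.9:
        return Action.REDUCE_REACH
    # Only high-confidence, repeat offenders ever reach the harshest options.
    return Action.REMOVE if post.prior_violations > 3 else Action.SHADOW_BAN
```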

And, yet, so much of this debate seems to ignore much of this (other than shadowbanning, which some people pretend is somehow evil and unfair). And, indeed, what concerns me is that while various platforms have tried some combinations of all of these things, very few seem to have really committed to these ideas — and just get bounced back and forth between extreme pressure on two sides: “ban all the assholes” v. “how dare you fucking censor my favorite idiot.”

So with the question of Alex Jones or holocaust deniers, internet platforms (again) have every right to kick them off their platforms. They don’t want to be associated with assholes? Good for them. But, at the same time, it’s more than a bit uncomfortable to think that anyone should want these giant internet platforms deciding who can use their services — especially when having access to those platforms often feels close to necessary to take part in modern day life*. It’s especially concerning when it reaches the level that online mobs can basically “demand” that someone be removed. And this is especially worrisome when many of the decisions are being made based on the claim of “hate speech,” a term that not only has an amorphous and ever-changing definition, but one that has a long history of being abused against at-risk groups or those the government simply dislikes (if you advocate for rules against “hate speech,” think about what happens when the person you trust the least gets to write the definition).

* Quick aside to you if you’re that guy rushing down to the comments to say something like “No one needs to use Facebook. I don’t use Facebook.” Shut up. Most people do use Facebook. And for many people it is important to their lives. In some cases, there are necessary services that require Facebook. And you should support that rather than getting all preachy about your own life choices, good or bad.

On top of that, I think that most people literally cannot comprehend both the scale and the complexity of the decision making involved when platforms are tasked with these calls. Figuring out which pieces of content are “okay” and which are “bad” can work when you’re looking at a couple dozen pieces of content. But how about a million pieces of content every single day? Or more? Back in May, when we ran a live audience “game” in which we asked everyone at a Content Moderation Summit to judge just eight examples of content to moderate, what was striking was that, out of this group of professionals in this space, there was no agreement on how to handle any piece of content. Everyone had arguments for why each piece of content should stay up, be taken down, or have a flag appended to it. So, not only do you have millions of pieces of content to judge, you have a very subjective standard, and a bunch of individuals who have to make those judgment calls — often with little training and very little time to review or to get context.

Antonio Garcia Martinez, who worked at Facebook for a while, and has been a fairly outspoken critic of his former employer (writing an entire book about it), has reasonably warned that we should be quite careful what we wish for when asking Facebook to cut off speech, noting that the rest of the world has struggled in every attempt to define the limits of hate speech, and it’s an involved and troubling process — and yet, many people are fine with handing that over to a group of people at a company they all seem to hate. Which… seems odd. Even more on point is an article in Fortune by CDT’s Emma Llanso (who designed and co-ran much of that “game” we ran back at the content moderation summit), warning about the lack of transparency when platforms determine this kind of thing, rather than, say, the courts. As we’ve argued for years, the lack of transparency and the lack of due process is also a significant concern (though, when Mark Zuckerberg suggested an outside due process system, people completely freaked out, thinking he was arguing for a special Facebook court system).

In the end, I think banning people should be the “very last option” on the table. And you could say that since these platforms left Jones on for so long while they had their internal debates about him, that’s what happened. But I don’t think that’s accurate. Because there were alternative solutions that they could have tried. As Issie Lapowsky at Wired pointed out in noting that this is an unwinnable battle, the “do nothing, do nothing, do nothing… ban!” approach is unsatisfying to everyone:

When Facebook and YouTube decided to take more responsibility for what does and doesn’t belong on their platforms, they were never going to satisfy all sides. But their tortured deliberations over what to do with Jones left them with only two unenviable options: Leave him alone and tacitly defend his indefensible actions, or ban him from the world’s most powerful platforms and turn him into the odious martyr he now is.

Instead, we should be looking at stronger alternative ideas. Yair Rosenberg’s suggestion in the Atlantic is for counterprogramming, which certainly is an appealing idea:

Truly tackling the problem of hateful misinformation online requires rejecting the false choice between leaving it alone or censoring it outright. The real solution is one that has not been entertained by either Zuckerberg or his critics: counter-programming hateful or misleading speech with better speech.

How would this work in practice?

Take the Facebook page of the “Committee for Open Debate on the Holocaust,” a long-standing Holocaust-denial front. For years, the page has operated without any objection from Facebook, just as Zuckerberg acknowledged in his interview. Now, imagine if instead of taking it down, Facebook appended a prominent disclaimer atop the page: “This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead, alongside millions of political dissidents, LGBT people, and others the Nazis considered undesirable. To learn more about this history and not be misled by propaganda, visit these links to our partners at the United States Holocaust Museum and Israel’s Yad Vashem.”

Obviously, this intervention would not deter a hardened Holocaust denier, but it would prevent the vast majority of normal readers who might stumble across the page and its innocuous name from being taken in. A page meant to promote anti-Semitism and misinformation would be turned into an educational tool against both.
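
Mechanically, that kind of intervention is simple. Here’s a toy sketch, with the topic labels and banner text made up for illustration (only the museum URLs are real): the flagged page is served as-is, with the counter-speech prepended:

```python
from typing import Optional

# Hypothetical mapping from a flagged topic to the counter-speech shown atop the page.
CONTEXT_BANNERS = {
    "holocaust_denial": (
        "This page promotes denial of the Holocaust. For the historical record, see "
        "the United States Holocaust Memorial Museum (https://www.ushmm.org) and "
        "Yad Vashem (https://www.yadvashem.org)."
    ),
}


def render_page(page_html: str, flagged_topic: Optional[str]) -> str:
    """Serve the page unchanged, but attach counter-speech when it has been flagged."""
    banner_text = CONTEXT_BANNERS.get(flagged_topic) if flagged_topic else None
    if banner_text:
        return f'<div class="context-banner">{banner_text}</div>' + page_html
    return page_html
```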

Meanwhile, Tim Lee, over at Ars Technica, suggested another possible approach, recognizing that Facebook (in particular) serves multiple functions. It hosts content, but it also promotes certain content via its algorithm. The hosting could be more neutral, while the algorithm is already not neutral (it’s designed to promote the “best” content, which is inherently a subjective decision). So, let bad content stay on the platform, but decrease its “signal” power:

It’s helpful here to think of Facebook as being two separate products: a hosting product and a recommendation product (the Newsfeed). Facebook’s basic approach is to apply different strategies for these different products.

For hosting content, Facebook takes an inclusive approach, only taking down content that violates a set of clearly defined policies on issues like harassment and privacy.

With the Newsfeed, by contrast, Facebook takes a more hands-on approach, downranking content it regards as low quality.

This makes sense because the Newsfeed is fundamentally an editorial product. Facebook has an algorithm that decides which content people see first, using a wide variety of criteria. There’s no reason why journalistic quality, as judged by Facebook, shouldn’t be one of those criteria.

Under Facebook’s approach, publications with a long record of producing high-quality content can get bumped up toward the top of the news feed. Publications with a history of producing fake news can get bumped to the back of the line, where most Newsfeed users will never see it.
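
Here is a rough sketch of the two-product split Lee describes, with the scoring function, numbers and post names invented for illustration (this is not Facebook’s actual algorithm): the hosting layer keeps everything, while the recommendation layer multiplies raw engagement by a source-quality factor, so low-quality sources sink rather than disappear.

```python
def newsfeed_score(engagement_score: float, publisher_quality: float) -> float:
    """
    engagement_score: how strongly the post would normally be promoted (clicks, likes, shares).
    publisher_quality: 0.0 (habitual fake-news source) to 1.0 (strong track record).
    """
    return engagement_score * publisher_quality


posts = [
    {"id": "reliable-outlet-story", "engagement": 80.0, "quality": 0.9},
    {"id": "conspiracy-site-story", "engagement": 95.0, "quality": 0.1},
]

# The conspiracy story "wins" on raw engagement (95 vs. 80) but still sinks in the
# feed (score 9.5 vs. 72.0); it remains hosted and reachable, just not recommended.
ranked = sorted(posts, key=lambda p: newsfeed_score(p["engagement"], p["quality"]),
                reverse=True)
print([p["id"] for p in ranked])   # ['reliable-outlet-story', 'conspiracy-site-story']
```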

Others, such as long-time free speech defender David French, have suggested that platforms should ditch concepts like “hate speech” that are not in US law and simply stick to the legal definitions of what’s allowed:

The good news is that tech companies don’t have to rely on vague, malleable and hotly contested definitions of hate speech to deal with conspiracy theorists like Mr. Jones. The far better option would be to prohibit libel or slander on their platforms.

To be sure, this would tie their hands more: Unlike “hate speech,” libel and slander have legal meanings. There is a long history of using libel and slander laws to protect especially private figures from false claims. It’s properly more difficult to use those laws to punish allegations directed at public figures, but even then there are limits on intentionally false factual claims.

It’s a high bar. But it’s a bar that respects the marketplace of ideas, avoids the politically charged battle over ever-shifting norms in language and culture and provides protection for aggrieved parties. Nor do tech companies have to wait for sometimes yearslong legal processes to work themselves out. They can use their greater degree of freedom to conduct their own investigations. Those investigations would rightly be based on concrete legal standards, not wholly subjective measures of offensiveness.

That’s certainly one way to go about it, but I actually think it would create all sorts of other problems as well. In short, determining what is and is not defamation can often be a long, drawn-out process involving lots and lots of lawyers advocating for each side. The idea that platforms could successfully “investigate” that on their own seems like a stretch. It would be fine for platforms to have a policy saying that if a court has adjudicated something to be defamatory, then they’ll take it down (and, indeed, most platforms do have exactly that policy), but having them make their own determinations of what counts as defamation seems like a risky task, and one that would land us in much the same place we are today: a lot of people angry at “judgments from on high,” with little transparency or right of appeal.

As for me, I still go back to the solution I’ve been discussing for years: we need to move to a world of protocols instead of platforms, in which transparency rules and (importantly) control is passed down away from the centralized service to the end users. Facebook should open itself up so that end users can decide what content they can see for themselves, rather than making all the decisions in Menlo Park. Ideally, Facebook (and others) should open up so that third party tools can provide their own experiences — and then each person could choose the service or filtering setup that they want. People who want to suck in the firehose, including all the garbage, could do so. Others could choose other filters or other experiences. Move the power down to the ends of the network, which is what the internet was supposed to be good at in the first place. If the giant platforms won’t do that, then people should build more open competitors that will (hell, those should be built anyway).

But, if they were to do that, it would let them get rid of this impossible-to-solve question of who gets to use their platforms, and move the control and responsibility out to the end points. I expect that many users would quickly discover that the full firehose is unusable, and would seek alternatives that fit with how they wanted to use the platform. And, yes, that might mean some awful people create filter bubbles of nonsense and hatred, but average people could avoid those cesspools, while those tasked with monitoring those kinds of idiots and their behavior could still do so.

I should note that this is a different solution from the one that Twitter’s Jack Dorsey appeared to ham-fistedly float this week on his own platform, when he suggested that journalists need to do the work of debunking idiots on Twitter. He’s not wrong, but what an awful way to put it. Lots of people read it to mean “we set up the problem that makes this giant mess, and we’ll leave it to journalists to come along and sort things out for free.”

Instead, what I’m suggesting is that platforms have to get serious about moving real power out to the ends of their network so that anyone can set up systems for themselves — or look to other third parties (or even the original platforms themselves, for a “default” or a set of filter choices) for help. In the old days on Usenet there were killfiles. Email got swamped with spam, but there were a variety of anti-spam filters that you could plug in to filter most of it out. There are ways to manage these complex situations that don’t involve Jack Dorsey choosing who stays on the island and who gets removed this week.
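
As a sketch of what that killfile-style, user-end filtering could look like (hypothetical throughout: no major platform currently exposes its raw feed this way), the platform hands over the full firehose and each user, or a third party they trust, composes their own filters:

```python
from typing import Callable, Iterable, List

Post = dict                      # e.g. {"author": "...", "text": "..."}
Filter = Callable[[Post], bool]  # returns True if the post should be hidden


def killfile(blocked_authors: set) -> Filter:
    """Classic Usenet-style killfile: hide everything from these accounts."""
    return lambda post: post["author"] in blocked_authors


def keyword_mute(phrases: set) -> Filter:
    """Hide posts containing any of the user's muted phrases."""
    return lambda post: any(p in post["text"].lower() for p in phrases)


def apply_filters(feed: Iterable[Post], filters: List[Filter]) -> List[Post]:
    """Compose any number of user-chosen filters over the full firehose."""
    return [post for post in feed if not any(f(post) for f in filters)]


raw_feed = [
    {"author": "a_friend", "text": "Here is a nice photo of my dog."},
    {"author": "conspiracy_guy_42", "text": "THE TRUTH THEY HIDE FROM YOU"},
    {"author": "random_user", "text": "the lizard people control the weather"},
]

# Each user picks their own mix: a third-party blocklist, some muted phrases,
# or no filters at all if they want to drink straight from the firehose.
my_filters = [killfile({"conspiracy_guy_42"}), keyword_mute({"lizard people"})]
print(apply_filters(raw_feed, my_filters))  # only the dog-photo post remains
```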

Of course, this would require a fundamental shift in how these platforms operate — and especially in how much control they have. But, given how they keep getting slammed on all sides for the decisions they both do and don’t make, perhaps we’re finally at a point where they’ll consider this alternative. And, hey, if anyone at these big platforms wants some help thinking through these issues, feel free to contact us. These are the kinds of projects we enjoy working on, as crazy and impossible as they may feel.

Companies: facebook, twitter, youtube


Comments on “Platforms, Speech And Truth: Policy, Policing And Impossible Choices”

nobodynobody says:

Re: Re:

Yeah, i can’t believe he’s endorsing that strategy of dealing with these people. Holocaust deniers should not be given any sort of validation in any remotely mainstream publication or platform, internet or otherwise, nor should anti-vaxxers, flat-earthers, and various other nutjobs who are outright factually wrong.

These places encourage echo chambers, so why in the world would you want to allow anybody to participate in those echo chambers? They do not further discourse in any sort of good faith, period. People who subscribe to these nutjob theories aren’t looking for honest and true discourse.

i thought he was going to be nuanced, not do the ‘let’s listen to both sides in Holocaust denial’ thing, jeez.

i mean, let’s all talk about how Newton was completely wrong about gravity or how in some alternate parallel world i believe in, 2+2 = 5, GOTTA LISTEN TO BOTH SIDES. /s

Leigh Beadon (profile) says:

Re: Re: Re:

anti-vaxxers, flat-earthers, and various other nutjobs

Anti-vaxxers may be idiots and clearly incorrect, but I’m not so sure setting a precedent of "ban people who question the pharmaceutical industry" is a great idea either. If you’re banning flat-earthers, are you also banning fundamentalist Christians who believe the earth is 6,000 years old? I mean, that’s equally factually incorrect, but I suspect you might get some backlash on that one.

And, in general, precisely how many scientists in how many fields is Facebook supposed to employ in order to make these determinations as new claims emerge in the future?

let’s listen to both sides in Holocaust denial

Saying "attempting to ban all holocaust deniers may not be the best solution for various reasons" is not the same thing as saying they deserve your attention or "let’s listen to both sides".

let’s all talk about how Newton was completely wrong about gravity

I mean… he sorta was.

nobodynobody says:

Re: Re: Re: Re:

Okay, let’s go into the whole ‘GRAVITY IS JUST A THEORY GUYSSSSS’ because that SO NEEDS A PLATFORM!!!

BIG SCIENCE IS MAKING MONEY OFF THE THEORY OF GRAVITY. QUESTION EVERYTHING, EVEN PROVABLE FACTS, BECAUSE FEELINGS!

ALSO, MY OPINION IS 2+2 = 5 AND IT’S JUST AS IMPORTANT AS STEPHEN HAWKING’S. HE’S HUMAN, I’M HUMAN, SAME THING. HE TELLS YOU 2+2 = 4 BUT HE’S A PHARMA SHILL AND I’M NOT.

I mean, seriously? That’s all you have to say about that? Why don’t you go tell your kids tonight gravity is just a theory so they should try doing 5th-floor parkour after dark because it’s fun? Or better yet, how about you let your local cool conspiracy nut encourage it to your children? THE IMPORTANT PART IS HE HAS A PLATFORM TO SAY WHAT HE WANTS, RIGHT?

Hey, you know what? While we’re at it, Twitter should let the ‘infidels’ and cartels post violently graphic pictures of them dragging bodies up and down the street BECAUSE THEY SHOULD HAVE A PLATFORM; THEY HAVE A VOICE, TOO; MAYBE THEY ARE RIGHT ABOUT SOMETHING!

i also think NASA should open up all their rocket building to public comment and then whatever gets upvoted the most is how they execute putting people in space. GONNA WORK OUT GREAT, I BET.

/s i mean, that is such a total load of B.S.

Mike Masnick (profile) says:

Re: Re: Re:

Yeah, i can’t believe he’s endorsing that strategy of dealing with these people. Holocaust deniers should not be given any sort of validation in any remotely mainstream publication or platform, internet or otherwise, nor should anti-vaxxers, flat-earthers, and various other nutjobs who are outright factually wrong.

I’m in agreement that none of those people deserve any sort of validation. But… seeing as I know that I’m not the one putting together the list of what content will be allowed and what won’t be, I’m a bit concerned about who WILL be making that determination. There are lots of things that are acceptable today that weren’t acceptable just a few decades ago. Would you have been okay if a few decades ago an equivalent of Twitter banned any talk of same sex marriage? Or go back a few more decades and perhaps it would have banned any discussion of divorce. Or civil rights. I’m not saying these things are the same, but I am saying that a few decades ago many people DID view same sex marriage in the same way that you and I might view flat-earthers today. So… be careful what you wish for.

i thought he was going to be nuanced, not do the ‘let’s listen to both sides in Holocaust denial’ thing, jeez.

I most certainly did NOT say "listen to both sides." I would not, because I don’t believe that. Did you even read the post?

Hephaestus (profile) says:

Re: Re: Re: Re:

It has been a while since I posted here. The way this is headed concerns me very much, so I thought it might be fun to play Advocatus Diaboli (devils advocate) along many, many paths…

But… seeing as I know that I’m not the one putting together the list of what content will be allowed and what won’t be, I’m a bit concerned about who WILL be making that determination

And right you should, you speak out about copyright issues, limitations on copyright and corporate greed … how soon until you are on the receiving end of being shadow banned on search engines, social media, and have your accounts deleted online? We have already seen what government agencies are capable of when left unchecked.

There is this thing called the Streisand Effect maybe you have heard of it (nudge, nudge, wink, wink)?

Since Alex Jones was banned from Facebook, his app has become the most downloaded on Google’s Play store (see NYT). He has added followers on Twitter 100 times faster than before. And he has become bigger than CNN. What am I saying, I have more followers than CNN, so that is nothing, never mind.

Banning him will make him disappear

400 years ago the catholic church had this list called the "Index Librorum Prohibitorum" (Index Of Banned Books). I believe every book on that list can be found today. You ban something people wonder what am I missing? what secrets are they trying to hide? … well maybe I should take a look. This also goes to the Streisand Effect section.

We are a private company and the first amendment doesn’t apply

It does if you banned certain people because the government pressured you to do so. Remind me again, how many hearings have the social media companies testified in front of? What was the implied threat, oh yeah regulation.

We are a publicly traded company, we have lawyers, and nothing can happen to us, we have a EULA that allows us to ban anyone we want

Until it comes out, that as a publicly traded company, you have alienated half your subscriber base, and made them wonder if they are being shadow banned because of their views or thoughts. Leading to your delete account pages being so swamped they no longer work. It is as if people had changed their names to Elvis and decide to leave the building. Which in the end causes investor lawsuits, because your job was to make money and not do politics.

I could go on for hours on this. I will leave you with one final thought. Since Mason Wheeler already broke Godwin’s law.

First they came for (insert your least favorite group, racial, political, religious, sexual orientation, annoying neighbor, mother/father in-law, etc) I said nothing.

Then they came for (insert your second favorite here, then third, then forth…)

Then with no one left to speak for me …

they came for me.

This is how this crap starts. Learn from history people.

Stephen T. Stone (profile) says:

Re: Re: Re:2

It does if you banned certain people because the government pressured you to do so. Remind me again, how many hearings have the social media companies testified in front of? What was the implied threat, oh yeah regulation.

Yeah, and some of those hearings and implied threats came after suspensions, bannings, and other actions that were implied to have proven an anti-conservative bias present within those companies. The government did not want Alex Jones banned from social media; if anything, they likely wanted Twitter, Facebook, etc. to keep him around because banning him would be “silencing” another conservative voice.

Hephaestus (profile) says:

Re: Re: Re:5 Re:

But regulation always mean government has control. And the words we are with the government, we are here to help always end poorly.

Lets start chanting one word, one nation, one goal for our great nation, we want to be like … Venezuela, Venezuela, Venezuela, follow along with me.

Oh come chant it to me, it will be fun… sing it to the tune we are the champions of the world by Queen, with a broken record player …

Hephaestus (profile) says:

Re: Re: Re:5 Re:

“Regulation does not necessarily mean “banning people”, you know.”

Of course it does, to answer your original post. Any regulation is a ban. Because regulating any speech, no matter how hateful, is a ban on free speech. And ….

niger, spik, whitey, kike … etc, etc, etc,

which will get me banned here?

Which one will not get me banned here?

Double standard anyone?

Thad (profile) says:

Re: Re: Re:2 Re:

First they came for…

Yeah, I remember how that poem is about how in Nazi Germany, private platforms blocked conspiracy theorists from using them.

No, wait, sorry. That poem is about how the government was rounding people up and putting them in camps.

If you’re worried about governments rounding people up and putting them in camps, I suggest that perhaps you should spend less time being angry at Facebook and more time being angry at ICE.

Anonymous Coward says:

Re: Re:

I see it as a result of precedents and avoiding slippery slopes. While the previous is a logical fallacy because it doesn’t prove things (a barber will in fact stop at cutting your hair and won’t try to cleave your skull off) there is some merit to it. It is related to the fallacy fallacy as well – just because something contains a fallacy doesn’t mean they are automatically wrong just that it isn’t proven. I can say that flat earthers are a bunch of morons – an ad hominem fallacy but that doesn’t make the earth flat.

Ironically there is a Jewish religious tradition of it with ‘fences around the law’ where the devout would proscribe things that could possibly lead to breaking the holy law. I believe one interpretation of the no meat with dairy mixing practice is the ‘not to have a calf boiled in his mother’s milk’ as an admonishment against senseless cruelty. If one doesn’t know if the source of the calf and the source of the milk line up with one hundred percent certainty just don’t mix the two – even if you think you are using goat milk with veal there could be a mix up not noticed until later. Given that it involves both religion and law you had better believe there are a lot of arguments about it – don’t take anything I say as more than a paraphrase of remembered tidbits.

Like the slippery slope fallacy it can be silly or it can be principled like “Better one hundred guilty go free than one innocent be imprisoned, for doing otherwise will lead to only contempt for the law: if following the law is no guarantee of innocence why follow the law”?

Uriel-238 (profile) says:

Re: Re: A calf boiled in it's mothers' milk...

…is a specific part of a ritual wedding feast prepared in the name of the goddess Asherah.

Between the time when the oceans drank Atlantis and the rise of the sons of Aryas, [Yahweh] (before He was Yahweh) had a consort, Asherah, who was a bit more ambitious than He was, and when She became too popular, His temple priests gathered together a mob of loyalists who sacked the Asheran temple, slew its clerics and acolytes and burned it to the ground.

After that it was pronounced a capital sin to worship Nod and Asherah, and the proscription against a calf boiled in it’s mothers’ milk had to do with criminalizing Asheran practices and traditions.

Religions being what they are and the applicability of myths being eternal, this is not to say that more modern interpretations are wrong (any more than, say, the flight of Icarus being a lesson about hubris), but that’s how that bit got started.

Mark Murphy (profile) says:

A Small Matter of Control

Instead, what I’m suggesting is that platforms have to get serious about moving real power out to the ends of their network so that anyone can set up systems for themselves… Of course, this would require a fundamental shift in how these platforms operated — and especially in how much control they had.

That seems unlikely. Control is the name of the game for Internet properties such as the ones that you are citing. I think one could make a plausible argument that control is more important than near-term profits. It seems more likely that a firm with control can earn future profits than a firm with profits can earn future control.

Which is why I have to call a wee bit o’ shenanigans on:

Most people do use Facebook. And for many people it is important to their lives. In some cases, there are necessary services that require Facebook. And you should support that rather than getting all preachy about your own life choices, good or bad.

Tactically, I agree. Strategically, I expect that the only way to "move to a world of protocols instead of platforms" will be to move off of Facebook, et. al. to other things. Partly, those "other things" (hopefully) will be "protocols instead of platforms". Partly, without loss of market share, I do not see the existing Internet properties embracing a loss of control.

Stephen T. Stone (profile) says:

Re: A Small Matter of Control

Strategically, I expect that the only way to "move to a world of protocols instead of platforms" will be to move off of Facebook, et. al. to other things.

Facebook and their ilk are making it much easier by being cesspools of rage and bullshit. I know the Mastodon fediverse got another spike in Twitter refugees yesterday thanks to Jack’s explanation of why Twitter has not yet booted Alex Jones.

Thad (profile) says:

Re: A Small Matter of Control

FYI, a few of us have been having a pretty good conversation on this subject over in the Voting by Cell Phone thread — with apologies for the thread-jacking, but this post wasn’t up yet.

Generally, I’m with you: I think people should quit using Facebook or Twitter if they’re alarmed by their actions or their policies. But I acknowledge Mike’s point, too: this is much easier said than done. If you’re running a business and you need to advertise it, giving up Facebook or Twitter effectively means hobbling your chances of attracting an audience. It’s simply not a practical solution under those circumstances.

But if you’re not? If you’re just a private individual using Facebook and Twitter to keep in touch with people? You should think about whether there are alternatives that would work for you and your friends — alternatives that may be a little less convenient, but which also don’t include all the stuff you don’t like about Facebook and Twitter.

(And if you’re reading Techdirt, you’re probably tech-savvy; you may already be aware of Mastodon, or know how to set up a WordPress blog, or a phpBB messageboard, or some other kind of independent platform.)

My feeling is, if you’re worried about how big Facebook and Twitter are, the solution is to make them smaller.

Though, again, I can’t tell people who rely on those services for income to make that leap.

I mentioned in the conversation I linked up top that I think switching from the big, corporate platforms to smaller, independent ones is going to require that a whole lot of people re-evaluate how they think about the Internet, and that changing opinions like that is a very difficult thing indeed. But I mentioned that there’s been, at least, some success in "buy local" campaigns in meatspace; I made the analogy that Budweiser is still doing a brisk business but microbreweries are more popular than ever. It’s quite possible that there could be a rise of smaller, independent platforms that ate into Facebook’s marketshare but Facebook would continue to be the biggest dog on the block.

It’s also possible that the current backlash against Facebook results in subscribers quitting Facebook but just moving over to Instagram (which Facebook owns) or some other monolithic platform.

Ultimately, it’s quite clear that both Facebook and Twitter are very vulnerable in this moment (which is where the Budweiser analogy breaks down; Anheuser-Busch InBev never lost 20% of its stock price in one day). There’s a lot of anger and dissatisfaction. What will the results of that anger and dissatisfaction ultimately be? We’ll see.

Anonymous Coward says:

Re: Re: A Small Matter of Control

I acknowledge Mike’s point

But where does this optimism come from? Are there good examples of closed platforms becoming open protocols to the benefit of their users? If anything it’s the opposite. Services tend to become more closed and anti-user over time.

Anonymous Coward says:

Re: Re: A Small Matter of Control

My feeling is, if you’re worried about how big Facebook and Twitter are, the solution is to make them smaller.

This is really the key to the whole issue. The problem isn’t that a private company chose to censor someone. The problem is that a private company that acts as a virtual monopoly acted to censor someone. These giant tech monopolies need to be broken up.

Christenson says:

Re: Re: Re:2 A Small Matter of Control

For the proposition that Facebook is a public commons:
https://www.theguardian.com/commentisfree/2018/aug/10/infowars-social-media-companies-conspiracy

and a well-articulated viewpoint as to why banning Alex jones is a mistake.

That source states that Internet has 4 Gig users, and 2 Gig (half) of them are active on Facebook. Might not be a monopoly, but close enough to be a huge concern.

Leigh Beadon (profile) says:

Re: Re: Re:3 A Small Matter of Control

You don’t become a monopoly by being big, or by serving a majority of the market. You become a monopoly by being the only provider in the market.

Facebook is not the only provider – not even close. People have LOTS of options for everything Facebook does – a few very big ones, dozens of medium-sized ones, and countless small ones. Hell, it seems that today’s young people don’t even care about Facebook anymore – we’re all just a bunch of old folks moaning about a social network that’s already out of fashion and stands a good chance of falling from grace within a generation.

Remember when MySpace was the undisputed king of social media? Remember when Digg seemed to rule the internet? Where are they now?

Leigh Beadon (profile) says:

Re: Re: Re:3 A Small Matter of Control

Or look at it this way:

How many of the 2-billion Facebook users and the 1.8-billion YouTube users stated in that article overlap? And how many also use one or more of Twitter, Reddit, Tumblr, LinkedIn, Snapchat, or Pinterest on a regular basis? I would estimate that only a very small minority are exclusively Facebook users. So how is that a monopoly?

M says:

Re: Re: A Small Matter of Control

You may be right in a way. However, denying services and unpersoning people is still a chilling action, however much publicity it gives someone.

Also, it makes some sense for YT and Facebook to take it out. They’re taking down ‘hateful content’. The problem is when you have LinkedIn doing the same for the person, when LinkedIn doesn’t even host any of the content, or Mastercard cancelling accounts for certain people. That becomes the same thing as a store having a ‘no gays’ sign. Sure, go to another store. But tomorrow, that store will also have a similar sign. Then the others have a similar sign, as they have the right to do it as well, and the only way to get your groceries is to take the money you keep stashed under the floor planks in your house and get it from some back alley dealer. This is a point of employing anti-discrimination protocols, rather than ‘being regulated because it’s so big’.

Wendy Cockcroft (user link) says:

Re: Re: Re: A Small Matter of Control

M, here’s your problem: when people behave badly on a social media or other platform the users affected approach the admins to demand that Something Must Be Done. When that fails they go to their Congressional representatives and ask them to sort it out. This is the result.

So far, solutions to all this complaining has resulted in

1) Inaction. Result: more complaining till they couldn’t ignore it any more.
2) Some action to remove the most egregious items. Result: The increase in users resulted in more egregious items and automation via keywords, etc., to remove them. Result: some innocent items got removed.
3) More action to remove contentious items. Result: even more complaining, this time about censorship.

Okay, so what do you propose? As business owners they’re obliged to serve their customers or risk losing them. My Twitter feeds into my Linked In so it does appear there and if someone doesn’t like what they see they’ll complain. Chances are, if I repeatedly post items Linked In users don’t like, I’ll be banned.

None of this is “unpersoning” anyone. It may, however, prompt more socially acceptable behaviour on the part of users wishing to posting content that other users find objectionable. This, in and of itself, is entirely objective and will shift with trends in public opinion so if you’re a rainbow flag-waving member of the LGBT community advising the location of a Pride parade in your town it’s unlikely that you’ll have your post removed even if right-wingers complain because there’s a quorum of people who would complain about the complaints. However, if you advocate violence or cruelty to members of the LGBT community your post will most likely be yanked and only right-wingers will complain – and most likely be ignored. Does that make sense?

So basically what we’re experiencing now is the result of years of build-up of resentment on the part of people who have experienced bad behaviour online and have elected to push back rather than be driven off by trolls. As I predicted years ago when I first started blogging, either the platform admins would do something about bad behaviour online or users would compel them to when the number of affected people reached a certain level. They’ve reached the level and the chickens are coming home to roost. Actions have consequences.

Anonymous Coward says:

Re: A Small Matter of Control

I’m not even sure it’s so much about control as protecting your niche. The competitive space for social media and content platforms is defined by the ability to carve out and protect a niche or brand identity. There needs to be a reason why you post some things to Twitter, other things to Facebook, and still others to Youtube. Pushing various filters and thresholds to end-users will result in a better experience for users, but if you have to do that by allowing the creation of 3rd party tools, you run the risk of having that niche diluted, thereby undermining your competitive edge.

The idea of protocols instead of platforms is great from a tech and user perspective, but can you imagine trying to pitch that to some VC guys? E-mail’s a protocol and it’s everywhere, but who’s making money off of it? How do you recoup your investment? I’m not saying you can’t, but it’s not immediately clear

Leigh Beadon (profile) says:

Re: Re: A Small Matter of Control

E-mail’s a protocol and it’s everywhere, but who’s making money off of it?

  • webmail providers (Gmail, Outlook)
  • hosted email providers (Namecheap, GoDaddy, Google G Suite, MS Office 365 Business Premium)
  • email marketing management and CMS tools (MailChimp, Hubspot)
  • email productivity tools (Boomerang, ToDoist)
  • premium mobile email apps (Newton Mail, AirMail)
  • premium email plugins/integrations for other platforms like WordPress (Bloom, Thrive Leads)
  • email server software providers (MailEnable, Microsoft Exchange Server)

Mike Masnick (profile) says:

Re: Re: A Small Matter of Control

The idea of protocols instead of platforms is great from a tech and user perspective, but can you imagine trying to pitch that to some VC guys? E-mail’s a protocol and it’s everywhere, but who’s making money off of it? How do you recoup your investment? I’m not saying you can’t, but it’s not immediately clear

This is beyond the scope of this topic, but have you seen how much VC money is flowing into token-based companies these days? Many are building protocols (just look at IPFS as an example). There are lots of people who think that using tokens/cryptocurrency is a way to make protocols much more sustainable and profitable.

Thad (profile) says:

Re: Re: Re: A Small Matter of Control

To wit: SoundCloud on the blockchain? Audius raises $5.5M to decentralize music

Audius wants to cut the middlemen out of music streaming so artists get paid their fair share. Coming out of stealth today led by serial entrepreneur and DJ Ranidu Lankage, Audius is building a blockchain-based alternative to Spotify or SoundCloud.

Users will pay for Audius tokens or earn them by listening to ads. Their wallet will then pay out a fraction of a cent per song to stream from decentralized storage across the network, with artists receiving roughly 85 percent — compared to roughly 70 percent on the leading streaming apps. The rest goes to compensating whomever is hosting that song, as well as developers of listening software clients, one of which will be built by Audius.

Might be an interesting one to watch.

Thad (profile) says:

Re: Re: Re:3 Re:

Yeah, I know, I know.

But despite its overexposure as a buzzphrase, the blockchain is quite an amazing technology, and this seems like a good application for it.

I’ve seen people (like Cory Doctorow) arguing for years that the proper solution for the question of how to compensate artists is to include some fixed amount as part of everybody’s internet subscription fee and then track what songs get listened to and how much, and compensate artists accordingly.

This isn’t quite that, as it’s supported by ads or a separate subscription rather than being built into a monthly cable bill, but it sounds like a promising way of tracking what gets listened to (hopefully) without tracking who, specifically, is listening to it, and dividing up the proceeds in a fair way.

Plus, the goal seems to be to cut the labels out of the process, which I’m all for.

I’m sure there are a lot of bugs to work out, not just technically but in terms of the business strategy. But it’s an interesting idea and I think it could turn out to be a model that works and is preferable to the system we’ve got right now, both from customers’ and artists’ point of view.

Thad (profile) says:

Re: Re: Re:5 Re:

If your question is about Audius, then clearly the hope is that enough people sign up that it is profitable — but you’re right, there’s no guarantee of that.

If your question is about Doctorow’s hypothetical of paying your ISP a fee every month to compensate rightsholders, then presumably that fee wouldn’t just be divided up according to the music you listen to, but all the other media you access. Games, videos, news articles, everything.

Implementation of such a system is, of course, an open question, but I feel like — much as I understand Stephen’s impulse to roll his eyes at the mere mention of the word — a blockchain solution is a good fit for this problem.

Dan (profile) says:

CDA Section 230...

Yes, I’m commenting without reading the whole thing (yet). But there’s something you repeat about CDA § 230 that I don’t think is right. You say section 230 "encourages" platforms to moderate. No, it doesn’t–nothing in section 230 encourages, motivates, or in any way leads platforms to moderate. The most it does is to remove a disincentive to moderate, that being the position of some courts that moderation made a platform liable for whatever appeared there. Removing a disincentive does not provide affirmative encouragement.

Now back to read the rest of the post…

Leigh Beadon (profile) says:

Re: CDA Section 230...

The thing is that in writing that portion of the law, Congress made it pretty clear that their goal was to "encourage" and "incentivize" moderation, which is why you see that word used so often to describe s.230

So you are correct that technically the statute itself does not say anything about "encouraging" – but that was and is very much its stated purpose.

That Anonymous Coward (profile) says:

BUT I DON’T USE FACEBOOK!!!!!!!!!!!

So I know very little about how it works.
Sticking an asterisk onto something seems like a fair way to deal with it.

The largest problem is, EVERYBODY & their gluten free dog, is demanding to NEVER!!!!!!!!! be offended.

A mantra I’ve used at least once or twice here is ‘Personal Responsibility is Dead’.
Twitter has the obvious solutions blocking accounts, you can have your very own list of forbidden words, & if you just want to be lead by the nose there are blocklists you can subscribe to so you don’t even have to think about it they block the ‘bad’ people for you!
Instead they slam the report button, cry how wrong it is, tell their friends to be upset to, and Twitter puts the account in time out & gives the DUMBEST explanations as to why. (See Also: https://www.techdirt.com/articles/20180419/17513039676/how-twitter-suspended-account-one-our-commenters-offending-himself.shtml?threaded=true)

I was ‘promoting hatred towards others’.
They couldn’t/wouldn’t explain that to me.
I pointed out the insanity & someone else on the team said oh oops… giggle sorry, we goofed.
No you responded like Pavlov’s Dog. The button was hit enough times and you jumped into action & that action was lock it down & demand he delete it.
They have a ‘team’… who apparently don’t look at timestamps.
A flurry of reports on a year old tweet, but from none of the people in the thread, and all of the accounts reporting have submitted a large number of complaints that end up reversed (when the target bothers to try and fight back against the stupidity that is the team). Its almost like people are abusing the system to silence people! *le gasp*

I dislike Mr. Jones on SOOOO many levels, but he deserves to be able to tweet out his bullshit, just like I tweet out my bullshit. I don’t have to follow him which magically seems to keep a vast portion of his bullshit from my sight. I think he is a conman fleecing the rubes & any action you take drives the rubes deeper into his hands. Leaving him alone, the best course of action, isn’t possible when the ‘team’ kneejerk reacts to the report button being smacked ignoring anything around it. If he was tweeting out the addresses of the Sandy Hook parents, we all know that is a clear violation of the rules & should be reported. If he is claiming Hillary is a Lizard Alien with dementia… I have to ask why you bother giving him your attention. But people love to report because there is no fscking downside to doing it.
If I crank call 911 (without call spoofing to make it look like it came from the WH switchboard) I get a visit from Johnny Law & if I am flippant to Johnny Law he plays Uber… with cuffs with my ass. If I keep doing it, I end up in a cell to teach me that 911 isn’t a toy.
Imagine if the braintrust that mass reported me had gotten a minor time out for abusing the system, think they would pursue that tactic again right away? Imagine if Twitters system showed Trust & Safety they have reported this account 20 times, none upheld. Think they would zap the reported account instantly… or perhaps look and see if it violated any actual rules and if not kick off a vacation for the reporters.

The mass reporter people think its funny silencing people on Twitter, if their main tool for doing so had the power to silence them if they abuse it… well once they got done screaming how unfair it was into the void they might learn to avoid doing it again lest they end up off the platform for abusing 911.

(sees letter from Del Harvey to staff about their new and improved punishments for dehumanizing behavior)

Nevermind.

That Anonymous Coward (profile) says:

Re: Re: Re:

I agree, but doing what they did made him a martyr.
It would have taken about 3 hours before he tweeted something that violated the actual rules.

There are millions of Twitter users, I’m pretty sure I’ve never seen tweets from 99.9% of them. If I was seeing tweets I disliked I should block the account, & not focus on if I don’t stand up and declare war on person I disagree with the fate of the universe falls to evil!!!!!!!!

Everybody can be somebodies asshole, so if you start booting people off for being an asshole where & when can you stop?

Stephen T. Stone (profile) says:

Re: Re: Re:

I agree, but doing what they did made him a martyr.

So what? Those companies had the right to boot him. They exercised that right. That his getting banned made him a martyr for misguided free speech absolutists and crackpot conspiracy theorists does not change those facts.

Everybody can be somebodies asshole, so if you start booting people off for being an asshole where & when can you stop?

Booting someone for being a disruptive asshole is a subjective judgment call. Booting someone for making defamatory statements or expressing hatred toward a historically oppressed minority of the population, however, is far less subjective.

Thad (profile) says:

Re: Re: Re:2 Re:

Booting someone for making defamatory statements or expressing hatred toward a historically oppressed minority of the population, however, is far less subjective.

It’s still pretty subjective.

Determining for certain whether or not a statement is defamatory is up to the courts. There are certain reasonable guidelines that a layperson can follow to make a best guess — is it a factual statement, or a statement of opinion based on disclosed facts? But even people who have an above-average knowledge of the law can trip up on that question.

(Try saying "Shiva Ayyadurai’s claim to be the inventor of e-mail is an opinion, not a factual statement" and see how many people in the comments disagree with that, despite it being the core of Techdirt’s defense and the judge’s decision to dismiss the case. Indeed, here’s a thread where a bunch of people jump down my throat for saying that — including you, though at least you were a little nicer about it than the guy who called me a "pathetic loser" — despite, again, it being a legally correct statement.)

Accurately recognizing hate speech is a difficult problem too; sometimes it’s very, very obvious, but sometimes it requires context; something that appears to be hateful rhetoric on its surface may not be. There are plenty of stories on Techdirt of posts or tweets reporting abuse being flagged as abusive themselves, and of course the recent story of That Anonymous Coward being suspended from Twitter for using a homophobic slur in reference to himself. And I believe you dropped a C-bomb the other day; there are people who would see the mere use of that word as "expressing hatred toward a historically oppressed [segment] of the population" ("minority" from your original quote replaced here because women are, of course, not a minority), even though the context in which you used it was not actually demeaning toward women.

Defamation and hate speech can be easy to spot. Some examples are flagrant. But some are not. And understanding context and nuance does not work at scale.
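
(To make that concrete, here is a minimal, purely hypothetical sketch of the kind of context-blind keyword filter that automated flagging tends to reduce to. The word list, function name, and example posts are placeholders I made up for illustration; this is not any platform’s actual code.)

    # Hypothetical sketch: a context-blind keyword filter. The word list and
    # examples are placeholder tokens, not any platform's real lexicon or code.
    SLUR_LIST = {"slur_a", "slur_b"}

    def naive_flag(post_text: str) -> bool:
        """Flag a post if it contains any listed word, regardless of context."""
        words = {w.strip(".,!?\"'").lower() for w in post_text.split()}
        return bool(words & SLUR_LIST)

    # A reclaimed, self-referential use and a targeted insult look identical
    # to this filter, even though only the second is abusive.
    reclaimed = "Yeah, I'm a slur_a and proud of it."
    abusive = "Get lost, you slur_a."
    print(naive_flag(reclaimed), naive_flag(abusive))  # True True

Anything smarter has to know who is speaking, to whom, and in what thread, which is exactly the context that doesn’t survive at scale.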

That’s another reason why I think going back to smaller communities is the best fix here: because in a small enough community, the moderators know all the members and all the inside jokes. No community is ever going to agree 100% about any moderation decision, but a moderator who’s part of the community can make an informed decision about whether or not a comment is abusive, in a way that somebody reading a post completely out of context, without knowing any of the people in the conversation, can’t.

Stephen T. Stone (profile) says:

Re: Re: Re:3 Re:

Also, I appreciate two things about this comment:

  1. That change from “minority” to “segment” when quoting me; I will try to use that particular wording from now on.
  2. The fact that you sincerely deconstructed my bullshit while giving me new ideas to think about so I can form better opinions.

Good show, sir!

Thad (profile) says:

Re: Re: Re:4 Re:

Thanks. And I’d like to add that I enjoy our conversations and you give me a lot to think about too, even if we don’t always agree.

And in this case it’s not so much that I’m disagreeing with you as trying to tease out some exceptions to what you’re saying. You’re making good points, as usual; I just want to point out that they don’t apply in all cases and the devil is often in the details.

Stephen T. Stone (profile) says:

Re: Re: Re:5

Oh by all means, feel free to point out when I make absolutist statements that should have more nuance injected into them. I cannot learn if no one tells me what I got wrong.

Besides, being able to admit to being wrong is a good thing. Can you imagine how much of a shithead someone would be if people told him that a certain claim was wrong and he kept insisting he was right?

That Anonymous Coward (profile) says:

Re: Re: Re:3 Re:

claps

I like it.

Part of the problem is that it’s very rare for people to look at anything other than the single tweet presented to them by someone outraged.

I had some alt-right nutjob trying to verbally rip my throat out b/c he saw a single tweet and decided his best plan would be to attack me for mocking his hero…
The problem is, his hero & I follow each other; we have a long history of inappropriate comments back and forth. We disagree on some things, but we set that shit aside and look for where we agree. I can ask questions & not get the standard “you leftie commie” etc., b/c he knows I am honest in my desire to understand their position.
So I have this unhinged idiot screaming at me, and I’m mocking him b/c, well, that’s what one does when confronted with idiots. He gets so very worked up, & then I point out: you follow your hero, he doesn’t follow you… he does follow me. He got real quiet real fast.

I get a couple of these a week: someone seeing 1 tweet out of a tweet storm or long thread & going bonkers. It’s easy to see 280 characters and get riled up, but more people need to start opening the whole thread to get the context.

Ninja (profile) says:

Re: Re: Re:3 Re:

“”minority” from your original quote replaced here because women are, of course, not a minority”

Just a bit of a side note: when you talk about a minority in terms of discrimination and equality, you aren’t talking about the overall number of individuals in that position but rather the fact that the segment, however big it is, does not receive equal treatment. Women are a minority segment in our current patriarchal society. In Brazil, black people and people of mixed black and white ancestry (is “mixed-race” the right word for that in English?) are more than half of the population, but they suffer from multiple injustices, so they are a minority group. It’s about representation and equality.

Just my 2 cents. I’m not criticizing your comment, just focusing on this specific issue of definitions.

James Burkhardt (profile) says:

Re: Re: Re:4 Re:

While that may be correct from a definitional point of view, oppressors love to co-opt language, and they can use a term like “minority” in a discussion of oppression to position themselves, through semantic games, as the oppressed minority. And, of course, since they are the “minority”, not the majority, they can’t possibly be oppressing the “majority”.

While they can still co-opt more accurate terms like “oppressed segment” or just “the oppressed”, such terms keep the discussion focused on the oppression itself and their justifications for it, making it harder to disguise their threadbare arguments with a veneer of intellectualism and philosophy.

Padpaw (profile) says:

Re: Re: Re:2 Re:

The problem that I think many, including myself, have is the selective enforcement.

They let other groups stay up for the exact same type of hate speech as Jones, and they see nothing wrong with it because it’s directed at people those in charge don’t like.

If they are picking and choosing whom they ban based on whether they dislike the targets, that seems like censorship based on personal views instead of the rules.

Stephen T. Stone (profile) says:

Re: Re: Re:3

The problem that I think many, including myself, have is the selective enforcement.

Which is entirely fine.

If they are picking and choosing whom they ban based on whether they dislike the targets, that seems like censorship based on personal views instead of the rules.

It is not “censorship” when someone gets booted from a platform they do not own, even for arbitrary and capricious reasons. Even the InfoWars terms of service document says that. A Twitter ban does not prevent someone from speaking their mind elsewhere—it only denies a platform and an audience to that person.

And by the same token, Twitter cannot be forced to host speech which its owners/operators do not want to host. The same goes for any other website. If someone were to argue that a White supremacist forum has the right to boot users for expressing disdain toward White people, they would have to accept that Twitter has the right to boot users for expressing White supremacist views.

Feel free to argue about the morality and ethics of banning people based on political viewpoints. That is a discussion worth having. But legally, the government can no more force Twitter to host Alex Jones’s speech than we can force Techdirt to host ours.

Anonymous Coward says:

Re: Re: Re:4 Re:

It is not “censorship” when someone gets booted from a platform they do not own, even for arbitrary and capricious reasons.

If you get kicked off the telegraph, or get kicked off plain old telephone service (POTS) for arbitrary and capricious reasons, then whether you want to use the label “censorship” or not, it’s still unlawful.

Further, along similar lines, in Turner Broadcasting v. FCC (1997), the local television stations didn’t own the cable operators’ systems, but the "must-carry" provisions of the Cable Television Consumer Protection and Competition Act of 1992 were still upheld.

Anonymous Coward says:

Re: Re: Re:5 Re:

If you get kicked off the telegraph, or get kicked off plain old telephone service (POTS) for arbitrary

That would be equivalent to being kicked off the Internet. Being kicked off of Facebook or Twitter is like being banned from a pub or cafe: you have to take your custom elsewhere. The people you want to talk to do not have to follow you to your new watering hole, but they are free to do so if they want to listen to you.

Anonymous Coward says:

Re: Re: Re:6 Re:

… kicked off the Internet. Being kicked off of Facebook or Twitter…

Do you understand very clearly that existing statute does not support the distinction you want to make here?

47 USC § 230(f) Definitions

(2) Interactive computer service

The term "interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

So, with that clear understanding, you’re invited to suggest new statutory language that would convey the distinction that lots of people would like to make.

Normatively, I may be inclined to agree that you’re making a worthwhile distinction that ought to be expressed in the Communications Act of 1934, as amended by the Telecommunications Act of 1996, and as further amended.

Anonymous Coward says:

Re: Re: Re:8 Re:

The distinction or its lack is irrelevant…

At this point, I’m not going to accuse you personally of arguing in bad faith, or acting as a Verizon shill.

But everyone ought to understand the bait-and-switch that’s being offered.

Stone, for instance, argues that “platforms” have every right to terminate service for arbitrary and capricious reasons. But a “platform” isn’t a term currently understood by courts or the law.

So, what he’s arguing, in essence, is that telegraph companies and telephone carriers have every right to kick subscribers. And that’s bullshit. Almost no one agrees with that. The people who do agree with that are generally paid by the telecoms.

Anonymous Coward says:

Re: Re: Re:12 Re:

Why should they have fewer rights to decide what speech will or will not be allowed on those platforms than a newspaper or a magazine publisher?

If that’s the tack you want to take, then, why should they have fewer duties than a newspaper or magazine publisher?

Perhaps the answer to that is they’re offering communications service to the public at large — at scale.

Stephen T. Stone (profile) says:

Re: Re: Re:13

why should they have fewer duties than a newspaper or magazine publisher?

Like you said: Those services are neither publisher nor speaker. The companies may have a moral/ethical duty to moderate their services in a similar fashion as newspapers and magazines, but to make such moderation a legal requirement would destroy the usability of those services.

Anonymous Coward says:

Re: Re: Re:14 Re:

… to make such moderation a legal requirement would destroy the usability of those services.

O’Brien v Western Union (1st Cir. 1940)

The immunity of the telegraph company from liability to a defamed person when it transmits a libellous message must be broad enough to enable the company to render its public service efficiently and with dispatch. Speed is the essence of the service. . . .

Manifestly the telegraph company’s privilege cannot be restricted to cases in which the sender in fact was privileged, as often he may be. It must be broader than that, and the cases so hold. Otherwise the company for its own self-protection would have to be permitted to delay sending the message pending some kind of check-up of the circumstances to which the message relates.

(Citations omitted.)

So why should Facebook and Twitter have fewer duties than the telegraph company? The telegraph company can’t arbitrarily and capriciously deny service — no more than a railroad can:

We have repeatedly said that it is apparent from the legislative history of the [Interstate Commerce Act of 1887] that not only was the evil of discrimination the principal thing aimed at, but that there is no basis for the contention that Congress intended to exempt any discriminatory action or practice of interstate carriers affecting interstate commerce which it had authority to reach.

The anti-discrimination language of the Communications Act of 1934 was sourced from earlier language in the 1887 Act. It’s a deep principle.

Stephen T. Stone (profile) says:

Re: Re: Re:15

So why should Facebook and Twitter have fewer duties than the telegraph company?

Your quoted court case says it all:

the company for its own self-protection would have to be permitted to delay sending the message pending some kind of check-up of the circumstances to which the message relates

You know how busy Facebook and Twitter are these days? Yeah, imagine if all that traffic ground to a halt because the admins and moderators of those services had to make sure every post was within the boundaries of the law before they even checked whether it violated the terms of service. You might say that is a good thing. But if Facebook and Twitter were slapped with that restriction, every other service and website that allows third-party content would have to do it, too. Tumblr, YouTube, Instagram, Blogger, every conceivable forum and imageboard…all of them would have to hold back posts to double-check whether they could be posted. And given how the larger services are already understaffed vis-à-vis the scale of moderation they have to do, forcing them to moderate every post would destroy the usability of those services. The Web would effectively grind to a halt.

If Twitter, Facebook, etc. have a hand in creating or posting illegal content, they should be held accountable for it. (That is why Backpage got dinged by the law.) They should not be punished or unduly burdened just for operating a service that some people will use to personally demonstrate the Greater Internet Fuckwad Theory.

Anonymous Coward says:

Re: Re: Re:16 Re:

Your quoted court case says it all

No. That case addresses the previous question — why Twitter and Facebook, just as a telegraph company back in 1940, might have fewer duties than a publisher.

But the question I followed up with was why Twitter and Facebook, who are engaged in the business of providing communications service to the public at large, should have fewer duties than other communications service providers, such as telegraph companies. Why should Facebook and Twitter be permitted to act arbitrarily, capriciously, unreasonably, and unjustly?

I’m not saying there aren’t plausible reasons for that. But you yourself haven’t given any here.

And lately, it’s the modern phone companies’ well-known and public position that they shouldn’t have any greater duties or obligations than Facebook or Twitter.

Anonymous Coward says:

Re: Re: Re:17 Re:

The big reason that Facebook et al. should be able to moderate, while phone companies should not, is that the phone is essentially person-to-person communication, while websites are person-to-the-whole-world communication. That is, one is basically private communication, and the other is a self-publishing platform.

Anonymous Coward says:

Re: Re: Re:18 Re:

… while websites are person-to-the-whole-world communication.

So Comcast sits astride the gateway between a person at their residence or business and the whole World Wide Web.

            Person <-(Comcast)-> WWW

Applying your stated distinction, Comcast should be able to moderate your residential connection.

You know, most people don’t agree with that concept. Personally, I think your stated distinction needs some restatement.

Anonymous Coward says:

Re: Re: Re:19 Re:

Comcast, like the phone company, sits between you and whoever you want to exchange messages with, and so, like the phone company, is a common carrier.

Failure to distinguish between the communications level and the service level of the Internet is like failing to distinguish between the phone company and the businesses that you use the phone to communicate with.

The reason that Comcast and friends do not want to be classified as common carriers is that they sit between you and all those services that are eating away at their cable subscriptions.

Stephen T. Stone (profile) says:

Re: Re: Re:17

Why should Facebook and Twitter be permitted to act arbitrarily, capriciously, unreasonably, and unjustly.

Because at the end of the day, Facebook and Twitter are not the following two things:

  • A public utility
  • Owned by the government

They are still, for all their size and cultural influence and monetary value, platform services controlled by privately-owned corporations. They should have no fewer or no more legally-bound duties than any other similar platform. I have asked this of multiple people, and no one has yet given me a straight answer: Why should Twitter, Facebook, etc. be forced into hosting speech which the owners and operators of those services do not want to host?

Twitter and Facebook, who are engaged in the business of providing communications service to the public at large

What they provide is a platform for speech; the Internet is the actual communications service.

Anonymous Coward says:

Re: Re: Re:18 Re:

They should have no fewer or no more legally-bound duties than any other similar platform.

In law, right now, descriptively, platforms similar to Facebook and Twitter are 47 USC § 230(f)(2) “interactive computer services”.

(2) Interactive computer service

The term "interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet

Yet, according to polling, five out of six people in the nation today oppose the idea that Verizon, Comcast and others ought to be permitted to restrict access to or availability of voice and/or data services.

I’ve tossed you a softball question twice now, hanging slow, fat, and square right over the plate. Instead of taking the opportunity to smash it out of the park, you continue to insist on a principle under which telegraph and telephone carriers have every right to kick subscribers off.

Why should Twitter, Facebook, etc. be forced into hosting speech which the owners and operators of those services do not want to host?

They’re commercial businesses. If those for-profit business corporations don’t want to obey the law Congress enacts, they can flee to Europe or China. Or they can simply get out of the business of providing communications service to the public, take their servers down, and devote them to bitcoin mining or something.

The law on this goes back to circa 1623. Unfortunately, that means the case is reported in that bastard Anglo-Norman language known as “Law French”.

mes si jeo take down mon signe againe, jeo moy mesme discharge de cest burthen

Essentially, the judges there say if you don’t like the business you’re in, you can take your marbles and go home. Until then, you’ve got to obey the law for that class of business.

One way or another, the American people have a core right to organize and regulate commerce among the several states, and with foreign nations, so as—

… to make available, so far as possible, to all the people of the United States, without discrimination on the basis of race, color, religion, national origin, or sex, a rapid, efficient, Nation-wide, and world-wide wire and radio communication service with adequate facilities at reasonable charges…

Stephen T. Stone (profile) says:

Re: Re: Re:19

They’re commercial businesses.

…fucking what? How does that justify forcing a privately-owned service to host speech that its owners and operators do not want to host?

Or they can simply get out of the business of providing communications service to the public

A privately-owned business that runs a communication platform protected by both the First Amendment and Section 230 should not be forced out of business for refusing to do something that literally no other kind of platform would ever be asked to do by the government. The government cannot force a newspaper or a magazine to run an article about a certain subject; why should it ever be able to tell Twitter that it absolutely must host the insane ramblings of Alex Jones or else?

Anonymous Coward says:

Re: Re: Re:20 Re:

… protected by both the First Amendment

The First Amendment does not prohibit enforcement of §§ 201 and 202 of the Communications Act of 1934 against communications providers.

… and Section 230

Section 230 can be amended. That’s what we’re talking about.

 

Ultimately, all the rights guaranteed in the Bill of Rights belong to the people — not the corporations. Now, the people do have a fundamental right to organize — and the genealogy of that right of organization goes back to the Mayflower Compact — but no one should doubt that there are limits on the right to organize in corporate form, even for political purposes.

And when you’re talking about the right to organize in corporate form not for political but for commercial purposes, remember that the First Amendment was adopted roughly contemporaneously with the Article I, Section 8, Clause 3 commerce power. That commerce power encompasses the provisions of the Communications Act of 1934.

Stephen T. Stone (profile) says:

Re: Re: Re:21

Look, all that talk about the law and limits on organizing and corporations is going over my head. (I am a regular jackoff, not a policy wonk or a third-year law student living off Ramen noodles and spite for their parents.) So I will ask you this one more time, and I would love an answer that does not descend into wonk-ness: What makes services like Twitter and Facebook, and nothing else but those services, deserving of being told by the government that they must absolutely host specific types and examples of speech and expression “or else”?

Stephen T. Stone (profile) says:

Re: Re: Re:22 Re:

Wait, before you answer, I have another question, and this one might be easier to answer in a non-wonk way: If Twitter and Facebook were to announce tomorrow that they would be going out of business and shutting down all services on Monday, what—if anything—should the government be legally allowed to do to keep those privately-owned businesses and their associated services from going dark?

Christenson says:

Re: Re: Re:23 Re:

Can I answer, pretty please??? Please??? (lol)
1) If Twitter and Facebook were to announce their disappearance, the Gubmn’t should do nothing (well, possibly let people get their data out). But it’s an impossible hypothetical, and if it came to pass, it would do something.
2) What makes Twitter and Facebook special enough to deserve gubmn’t attention? Outsize influence, mobs doing harm (and not just right-wing ones; The Atlantic discusses some incidents in India), and oh, yeah, Russian hacking! Oh, and don’t forget the Moral Panic (TM), caused by entirely too much screen time on something mysterious and new-fangled to the current geriatric generation of elected politicians, especially congress critters!

Anonymous Coward says:

Re: Re: Re:22 Re:

What makes services like Twitter and Facebook, and nothing else but those services…

It’s very touching that you’d like to limit the discussion in a way that’s simply unsupported by existing law. That’s a cute way to dodge the definition in § 230(f)(2).

Did you graduate high-school? Are you literate enough to read a ballot? Do you need some kind of special-needs assistance to vote? Are you at least partially capable of participating in the American experiment in self-government?

 

… deserving of being told by the government that they must absolutely host specific types and examples of speech and expression “or else”?

If you’re in the commercial business of offering and providing communications service to the American public at large, then the American public has every right to insist that you actually deliver on that offer — and that your corporation must refrain from acting arbitrarily, capriciously, unreasonably and unjustly in providing the communications service.

If your corporation wants to act arbitrarily, capriciously, unreasonably and unjustly, then the American public may demand that you pursue some other line of commerce — one that does not involve significant control over the nation’s political discourse.

‘Cause we have a compelling interest in maintaining effective self-government, free from the capricious whims of any small handful of corporate oligarchs.

Although, maybe you’d rather live in a dystopia where a half-dozen corporations arbitrarily decide who gets to speak and what they can say — what subjects cannot be talked about. Some people would like that, I guess. It’d be like living in a sci-fi novel.

Christenson says:

Re: Re: Re:11 Re:

In my contrarian view, while Twitter and Facebook may not be speakers of the user-generated content itself…

they are very much speakers through the choices that go into that transformative re-mix! That is, their recommendation algorithms are very much the speech of the platform, and they share at least moral culpability for some of the harm.

Now, if we can just get some good definitions to support our intuition before we get hamfisted bad laws involved…..

That Anonymous Coward (profile) says:

Re: Re: Re:4 Re:

Having “rules” is one thing; twisting them to justify “we booted them because we don’t like the speaker” is where it falls apart.

Leslie Jones famously got a bunch of Twitter hate, & people who used the N-word towards her were banned. Ms. Jones frequently used the N-word towards others; not even a blip. Wrong or right, it added to the narrative that if you had a blue check you got special rules.

While Twitter has a right to do things how they want on their platform, it is this obvious double standard that pissed off the natives. There are hundreds of examples of this happening, including my own time out.

Twitter should stop trying to create justifications for their actions; it gives people who love the conspiracy theories more ammunition, and the signal-to-noise ratio starts making the platform more problematic.

If you say a word is verboten, it should be verboten for all; creating a grey area where sometimes it is okay and sometimes it is bad, depending on how many people complain about it, opens the door for abuse. The “manpower” required to review the complaints keeps growing b/c outsiders now have to attempt to apply context or just slam the timeout button over and over, then deal with the number of appeals, all while adding to the growing incorrect narrative that all animals are equal but some are more equal than others.

How can you detect a campaign to silence someone vs. someone who genuinely offended many people?
When the decisions boil down to the eye of the beholder, you ban the gay guy who said faggot ironically for “targeting hate at others”… b/c someone allegedly got offended. If the minefield is “will this word offend anyone,” people will stop talking. We’ll be reduced to saying inane things, and some asshole is still going to keep reporting people.

It is impossible for a computer to know how a word is being used; even a human still needs more input to put it into context. Twitter has the ability for users to never have to see the words that offend them, yet people keep slamming the report button b/c it allows them to win points in their head against someone they dislike. Mass reporting is a tactic that takes advantage of Twitter’s response of “if it was reported, we have to act now, and the only tool we have is silencing someone”; then they wait & see if the person appeals before investing time in looking at context, and even then, depending on the reviewer, you can get rejected.

Twitter created the problem by trying to appease people faster & faster rather than pointing at the tools already available. Twitter also screwed up in letting people know when they’ve been blocked; it encourages bad actors to make more accounts or recruit their social circle to make their point, rather than letting them tweet into the void and assume the person is ignoring them.

It is really hard to manage these things automagically when you have a system and refuse to consider that it isn’t actually working. Take the community moderation here: a psycho starts in with his Google shill rant, & when we hit enough votes, poof. But if you want to see it, you still can.
On Twitter, the first time I flag something it would just mute that thread for me, but if I’ve clicked 20 times, that account never shows up in my feed anymore; I can still go to their page if I want to & see what they are up to, or proactively put them back into things I can see. That takes away the ability for a group of people to game the system & silence someone for sport b/c it offended them & they feel they are the best arbiter of what should be allowed for everyone.

The bad actor isn’t aware of which people aren’t seeing their spew, can’t climb up on a cross as a martyr being punished by Twitter & the Leftist Conspiracy, and users who have no interest don’t have the spew getting onto them.

This removes the game of winning points by how many of the other side you can take down, & makes users responsible for their own experience rather than abdicating it to Big Twitter to protect them.
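
(For the curious, the viewer-side approach described above can be sketched in a few lines. This is a hypothetical illustration of the idea, per-viewer thread muting on the first flag and hiding an account from that viewer’s feed after a threshold, not anything Twitter actually does; the class, method names, and the 20-flag threshold are made up here, with the threshold just echoing the number above.)

    # Hypothetical sketch of viewer-side muting: flags change only what the
    # flagging user sees; the flagged account is never silenced platform-wide.
    from collections import defaultdict

    HIDE_ACCOUNT_THRESHOLD = 20  # assumed threshold, echoing the comment above

    class ViewerPrefs:
        def __init__(self):
            self.muted_threads = set()
            self.flag_counts = defaultdict(int)  # flags this viewer filed, per author
            self.hidden_accounts = set()

        def flag(self, author, thread_id):
            """First flag mutes the thread for this viewer; enough flags hide
            the author from this viewer's feed (and only this viewer's)."""
            self.muted_threads.add(thread_id)
            self.flag_counts[author] += 1
            if self.flag_counts[author] >= HIDE_ACCOUNT_THRESHOLD:
                self.hidden_accounts.add(author)

        def unhide(self, author):
            """The viewer can proactively put an account back into their feed."""
            self.hidden_accounts.discard(author)
            self.flag_counts[author] = 0

        def shows_in_feed(self, author, thread_id):
            return author not in self.hidden_accounts and thread_id not in self.muted_threads

        def shows_on_profile_visit(self, author):
            # Hidden accounts stay reachable if the viewer goes looking.
            return True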

Stephen T. Stone (profile) says:

Re: Re: Re:5 Re:

Leslie Jones famously got a bunch of twitter hate & people who used the N word towards her were banned. Ms. Jones frequently used the N word towards others, not even a blip.

Putting aside her status as a “verified user” and a celebrity, the important thing to remember about that sort of situation is this: Leslie Jones is a Black woman. If anyone has any right to use that particular word in just about any given context, she sure as hell does.

While Twitter has a right to do things how they want on their platform, it is this obvious double standard that pissed off the natives. There are hundreds of examples of this happening, including my own time out.

And you are absolutely free to feel pissed off about that, to complain about that, and to send Twitter management a list of your grievances with how they run the service. Whether Jack listens to you, however, is his prerogative.

If you say a word is verboten, it should be verboten for all

I identify as queer. That word has a…certain history with the LGBT community. While I recognize that history, I believe queer, as a label, is both a more inclusive shorthand for the broader LGBT community in general (asexuals, aromantics, etc.) and an easier-to-use descriptor for my personal sexual orientation than, say, a Kinsey scale number or a detailed explanation. If another LGBT person called me “queer”, I would have little issue with it. If a straight person called me “queer” and meant it entirely as an insult, however, I would take issue with that usage.

If Twitter wanted to ban the word queer because of its history as an anti-LGBT slur, would you agree that, despite the context explained above, I should be banned for self-identifying as a queer man?

Twitter has the ability for users to never have to see the words that offend them, yet people keep slamming the report button b/c it allows them to win points in their head against someone they dislike.

The gamification of social interaction networks has become a major issue, yes. A discussion about how to either prevent it or mitigate the damage it can do is a discussion worth having.

Twitter created the problem by trying to appease people faster & faster rather than pointing at the tools already available. Twitter also screwed up in letting people know when they’ve been blocked; it encourages bad actors to make more accounts or recruit their social circle to make their point, rather than letting them tweet into the void and assume the person is ignoring them.

Agreed on both counts.

Christenson says:

Re: Re: Re:6 Faster and Faster!

Looking at even the trolling in these comments, “Faster and Faster” seems to me to be a serious element of the problem.

Pull out that instant gratification, and you take away that addictive dopamine reward from many trolls. That goes for bans, too; moderation should not be done instantly.
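
(A minimal sketch of that “not instantly” idea, assuming a simple cooldown before any report is even surfaced for review; the class name, method names, and the six-hour delay are all made up for illustration, not a real system.)

    # Hypothetical sketch: reports sit in a cooldown queue and are surfaced
    # for human review only after a delay, removing the instant payoff that
    # rewards mass reporting. The 6-hour delay is an arbitrary assumption.
    import heapq
    import itertools
    import time

    REVIEW_DELAY_SECONDS = 6 * 60 * 60

    class DelayedReportQueue:
        def __init__(self):
            self._heap = []
            self._tiebreak = itertools.count()  # keeps heap comparisons stable

        def submit(self, report, now=None):
            now = time.time() if now is None else now
            heapq.heappush(self._heap, (now + REVIEW_DELAY_SECONDS, next(self._tiebreak), report))

        def due_for_review(self, now=None):
            """Return only reports whose cooldown has elapsed."""
            now = time.time() if now is None else now
            due = []
            while self._heap and self._heap[0][0] <= now:
                due.append(heapq.heappop(self._heap)[2])
            return due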

On reflection, the real harms from Facebook and Twitter have been the mobs that form. And, with CDA230 immunity for moderating, the platforms pick and choose for us on an individual level what we see, so if I like Alex Jones, I won’t see when Pizzagate is called BS, or anyone calling him loony tunes. I don’t think CDA230 was intended for the case where there’s so much moderation that the moderation itself becomes the message.
****

Now, the problem with words like queer and nigger is that the reaction they cause is heavily context dependent, and computers are famously bad at context. The same is actually true for human moderation; go see the trolls on this post!

Michael Whitetail says:

First off, I will admit that I didn’t read the whole post, but I don’t feel that this is a knee-jerk reaction post. I am just curious where people fall on what *really* is censorship versus what people are calling censorship in these types of situations.

If Mike covered this in his post, I missed it, and I apologize. The post is quite long and dry, and it was hard not to skip large sections.

Now on to my questions. I struggle with the definition of censorship, which a quick search gives the following:

noun
1. the suppression or prohibition of any parts of books, films, news, etc. that are considered obscene, politically unacceptable, or a threat to security.

Classically, this would occur when the government stopped publication and/or distribution of written/audio/video materials.

So I have often thought, is a publishing house censoring content when it refuses to publish based on a moral or ethical stance? Does the ability to self publish and distribute your own content mean that you actually cannot be censored by anyone other than the government?

In the case of deplatforming, does the ability to self publish on your own internet site mean that you aren’t being censored?

I see logical arguments for both sides of the debate, but like Mike, I know the answer isn’t binary, black or white. And I actually cannot seem to make up my own mind on the topic. So the big question here is, where do you fall on this?

Stephen T. Stone (profile) says:

Re: Re:

is a publishing house censoring content when it refuses to publish based on a moral or ethical stance?

No.

Does the ability to self publish and distribute your own content mean that you actually cannot be censored by anyone other than the government?

Yes.

In the case of deplatforming, does the ability to self publish on your own internet site mean that you aren’t being censored?

Yes.

You are guaranteed a right to speak your mind. You are not guaranteed an audience.

Mason Wheeler (profile) says:

So Zuckerberg said:

I’m Jewish, and there’s a set of people who deny that the Holocaust happened.

I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think… it’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.”

So far, so good. Then he went back on this and took the stuff down anyway.

It’s been said that those who do not learn from history are doomed to repeat it. Here’s a bit of history that hasn’t been all that widely studied, that we’re currently in the early stages of repeating: hate speech laws from a century ago.

It might be surprising to learn that Weimar Germany had very strong, very modern laws against hate speech, and among the strongest beneficiaries of those laws were the Jews. They used them to fight back against very real discrimination in their day, and the courts did "the right thing" the vast majority of the time.

One frequent target of such laws was a hate-filled guy by the name of Adolf Hitler. He ended up getting smacked down for his serial offenses so often that injunctions were eventually issued against him, preventing him from holding further rallies. ("Deplatformed," to use the modern parlance.) Well, that went over perfectly and we never heard from that troublemaker again… right?

Oh, wait, no. That’s not what happened at all. It made a martyr out of him. The Nazis were able to point to the way he was being censored and use it as a rallying cry, which ended up being massively successful and we all know where that led.

So yes, there is a massive ethical problem with allowing people to be censored from the modern-day public square, and the fundamental problems involved do not change one whit if those doing so are private rather than state actors. With great power comes great responsibility, and when you become powerful enough to do things that historically only governments were capable of doing, the restraints that we have historically placed upon governments must be applied as well.

Stephen T. Stone (profile) says:

Re: Re: Re: Small point of contention here, but...

I saw the quoted part in your comment and felt a compulsion to reply to it. I realize that is not part of your comment, though. Still, as I said: Execution overrides intent. I did not mean to imply anything about you in my comment, and I will do better about being reactionary to comments here in the future. You have my sincere apologies for any discomfort or offense that I caused.

Leigh Beadon (profile) says:

Re: Re:

the modern-day public square

People keep using this term to refer to social media as though it’s now indisputably true. But it’s not. In fact, it’s pretty silly. Maybe you could argue that the internet as a whole is the modern-day public square, but the idea that each and every major web platform or social media service is "the public square" makes no sense.

when you become powerful enough to do things that historically only governments were capable of doing, the restraints that we have historically placed upon governments must be applied as well

The restraint placed on the government is that it is not allowed to make laws that prohibit speech. Facebook is not capable of making laws.

Anonymous Coward says:

Re: Re: Re:

People keep using this term to refer to social media as though it’s now indisputably true. But it’s not. In fact, it’s pretty silly. Maybe you could argue that the internet as a whole is the modern-day public square, but the idea that each and every major web platform or social media service is "the public square" makes no sense.

You seem to imply that there can only be one public square. This was never the case. Each town had a square, and if you didn’t like it you could move. But then you wouldn’t be addressing your community, you’d be addressing some other community, just as if you move from Facebook to Twitter.

"All the party invitations in Cambridge come through Facebook. If you don’t use Facebook you don’t get to any parties, so you’ll never meet any girls, you won’t have any kids and your genes will die out."

Each of the well-known platforms holds more people than even the largest "public squares" ever could.

Leigh Beadon (profile) says:

Re: Re: Re: Re:

So should Starbucks be regulated as the public square? In my estimation, far more people have conversations with both friends and strangers in one of America’s totally ubiquitous Starbucks locations than they do in its public parks and town squares. To use Mason’s vague standard, I suspect that Starbucks Corporation in 2018 has, in many ways, a level of power that “historically” (which I take to mean in the late 1700s when the bill of rights was written) was only held by governments.

Leigh Beadon (profile) says:

Re: Re: Re:3 Re:

If the World Wide Web is the thing that’s transformative and unprecedented, why are you deeming multiple individual services like Facebook and Twitter to be the public square?

Large though those platforms may be, they still represent only a small fraction of the public’s ability to publish content and engage in speech on the web.

Anonymous Coward says:

Re: Re: Re:2 Re:

So should Starbucks be regulated as the public square?

Whether and how Starbucks should be regulated as a public square, and whether it is a de-facto public square, are different questions. For now I express no opinion on the former, though I’ll note in passing that California’s Pruneyard decision might already be regulating Starbucks in that way.

Stephen T. Stone (profile) says:

Re: Re: Re:3

Starbucks and, say, Twitter are privately-owned corporations that both offer a service to the public and, under a certain interpretation, provide a “public square” to that same public. That one operates a chain of brick-and-mortar coffeehouses and the other operates an online social interaction network, and that Twitter’s service is literally being that “public square”, makes little difference. What would make Twitter more deserving of “public square” restrictions than Starbucks?

Anonymous Coward says:

Re: Re: Re:4 Re:

What would make Twitter more deserving of “public square” restrictions than Starbucks?

What do you mean by "public square restrictions"? Prohibitions on kicking people out? And "more deserving…than Starbucks"? Starbucks may already have these restrictions in California, and no court has said Twitter does (even in California they’ve been chipping away at the Pruneyard ruling).

Stephen T. Stone (profile) says:

Re: Re: Re:5

What do you mean by "public square restrictions"? Prohibitions on kicking people out?

Yes.

And "more deserving…than Starbucks"?

If Starbucks were to have more latitude in deciding who could be kicked out of its stores than Facebook would have in deciding who could be kicked off that service, Starbucks would seem as though it was more deserving of that latitude than Facebook. I would then have to ask why that is the case.

James Burkhardt (profile) says:

Re: Re: Re:5 Re:

The Pruneyard ruling was interpreted a bunch of different ways, but in 2012 the California Supreme Court took up a case and cut it back to a narrower stance: it only applies to areas designed for the public to gather and linger, like food courts, plazas, and atriums, not areas designed to facilitate the movement of individuals, like sidewalks and rights-of-way. It certainly doesn’t apply to the stores themselves anymore.

Uriel-238 (profile) says:

Re: Re: Re:2 Starbucks

The rallies of the German National Socialist Party took place in beer halls. It’s a valid question.

Whether or not social gathering at public places should be regulated may depend on whether or not the state / public approves of the positions of that group.

Or not.

I think in this case political speech is like armament (guns) in that we expect individuals of the public to be responsible in its care and use. That we have notions like hate speech or incitement implies that we don’t trust members of the public to talk among themselves responsibly.

But then for the same reasons we might not trust people with talking or guns, we might not trust them with voting either. Look what happened in 2016.

It’s ultimately a paradox of the people. They can’t be trusted to be adults, but then we have no adults to supervise them. Techdirt provides a continuous flow of counterexamples to the notion that our persons of authority (law enforcement, judges, legislators, et al.) are responsible and make just decisions.

Find the (Madisonian) angels that can articulate which groups are not allowed to gather in Starbucks / Beer Halls, and we’ll have a notion of what groups should be allowed to speak on Facebook and Twitter.

Given that Facebook has a penchant for censoring Danish mermaids and breastfeeding mothers, I would argue we haven’t found those angels yet.

Anonymous Coward says:

Re: Re: Re:

The restraint placed on the government is that it is not allowed to make laws that prohibit speech. Facebook is not capable of making laws.

What’s the practical difference here? If I don’t like my country’s speech laws, I can choose a different country (maybe), or try to get the laws changed (maybe). If I don’t like Facebook’s rules, I can choose a different website. They can’t throw me in prison, but prison isn’t the only thing the First Amendment was meant to guard against.

Anonymous Coward says:

Re: Re: Re:2 Re:

Are you honestly asking me what the practical difference between Facebook and the US government is? How much time do you have?

No, I’m wondering why you felt that was the appropriate point on which the actions of Facebook et al. should be evaluated, and whether we should focus so strongly on it. You made the point that "Facebook is not capable of making laws", as if that proves something in and of itself; you didn’t really explain how that helps anyone. "Not capable of making laws" gives us little solace if the actions they take can cause much of the same harm as actual laws. (Though like Mike, I’m going to hedge with "no good answers". I’m not proposing regulation, but won’t dismiss it out of hand.)

Facebook is code, the First Amendment is law, and Lawrence Lessig wrote an entire book—two decades ago—from the premise that "code is law" in the modern world. The distinction between governments and private actors is, in my mind, not quite as binary as you make it seem. Not to say that being banned from Facebook is anything like being thrown in a gulag.

Anonymous Coward says:

Re: Re: Re:3 Re:

The distinction between governments and private actors is, in my mind, not quite as binary as you make it seem.

The big difference is that the government can order you not to publish in any media, electronic or paper-based, and throw you in jail if you disobey that order. Private actors, on the other hand, can throw you off their platforms, but cannot stop you from publishing elsewhere.

Therefore, while Facebook can throw you off their platform, they cannot throw you off the Internet.

Leigh Beadon (profile) says:

Re: Re: Re:2 Re:

Another good question:

If Facebook is the “public square”, should people have any expectation or protection of privacy for the data they generate there? Should Facebook have any obligation or ability to enable and respect private/public settings on posts, groups, events, etc? Is all the data Facebook stores on people’s conduct in the public square equivalent to government records, and subject to FOIA requests?

Anonymous Coward says:

Re: Re: Re:3 Re:

If Facebook is the "public square", should people have any expectation or protection of privacy for the data they generate there?

That is a good question. Okay, it’s technically wrong to declare that Facebook is simply "the public square". It’s shorthand. Nobody means that everything passing through the platform, including private messages, is part of the public square.

The public parts of Facebook, like the Infowars pages, are like a public square. Anyone can see that stuff. Group/family spaces might be akin to a living room or rented library conference room that such a group might have otherwise met in; only "members" are allowed. Direct messages are perhaps more analogous to telephone or parcel-delivery systems, which are privately-owned though we’ve seen fit to regulate them in various ways as "common carriers"—they have to provide privacy and cannot arbitrarily refuse service.

None of this means we need to reach the same conclusions w.r.t. Facebook.

Leigh Beadon (profile) says:

Re: Re: Re:4 Re:

The public parts of Facebook, like the Infowars pages, are like a public square.

Wouldn’t that also mean that all public page operators themselves are barred from moderating content or blocking users from their pages? You can organize a rally in the public square and invite who you want, but you can’t prevent other members of the public from attending.

Leigh Beadon (profile) says:

Re: Re: Re:5 Re:

(p.s. I do acknowledge that you are not actually arguing for automatically matching the restrictions on Facebook to those on the gov’t – I’m just attempting to highlight how messy things become when people start confidently applying this “public square” language to Facebook. I just don’t think the two are so easily comparable, for a whole host of reasons.)

Anonymous Coward says:

Re: Re: Re:6 Re:

I’m just attempting to highlight how messy things become when people start confidently applying this "public square" language to Facebook

Yeah, it is indeed a giant mess that’s going to get worse before it’s resolved. We’re talking about things that never really existed before, so, naturally, we’ll quickly hit the limits of every analogy. Which doesn’t mean that courts won’t make the same screwups… (Aereo, the third-party doctrine, …).

We can nevertheless look at various types of regulation that cover different aspects of the service(s), and evaluate how well they’ve worked in their original problem-domains and whether that’s something we might want to apply here. And we’ll have to watch out for stupid proposed regulations, because we know damn well that that’s going to happen… really, we should probably keep the legislators away from this area for at least a few more years.

WRT your earlier question about whether they’d have to let everyone speak, I don’t know. Probably another sign of the analogy’s limits. Is it better to say Infowars is like a newspaper, and can decide which comments to print, while Facebook is like the postal service (USPS cannot ban certain newspapers)? Or maybe public square is accurate enough. It’s not like every public square was always hosting a single conversation; there were always subgroups, who might try to speak quietly for privacy or try to eject troublemakers, even when the government running the square could not. Well, Facebook fits the definition in terms of a meeting place at least, and "public square" is more succinct than "newspaper and postal service and telephone operator and directory listing and…", so I’m inclined to stick with this shorthand despite its numerous problems.

Stephen T. Stone (profile) says:

Re: Re: Re:7

Is it better to say Infowars is like a newspaper, and can decide which comments to print, while Facebook is like the postal service (USPS cannot ban certain newspapers)?

No, because that would give Infowars more rights in moderating the open-to-the-public section of its site than Facebook. Facebook does not deliver mail or operate schools or fix public roads. It does not operate as an arm of the government. It should not be regulated as if it were.

As someone else pointed out in this comments section, the primary issue with Facebook being “too big” is not one of arbitrary size distinctions or technical functionality, but of cultural influence. All of these arguments about censorship and regulation, then, should be focused on two connected ideas: who has the ultimate control of a force that can influence culture and society, and what they are doing with that control (and influence).

Anonymous Coward says:

Re: Re: Re:8 Re:

Facebook does not deliver mail or operate schools or fix public roads. It does not operate as an arm of the government. It should not be regulated as if it were.

Facebook does deliver mail. Not physically, but electronically much like telegraph operators did, and they were regulated as common carriers despite being privately owned. Similarly for railroad and pipeline operators, and telcos.

Anonymous Coward says:

Re: Re: Re:8 Re:

As someone else pointed out in this comments section, the primary issue with Facebook being “too big” is not one of arbitrary size distinctions or technical functionality, but of cultural influence.

One could say the same thing about political parties, and they would hate not being able to control the direction of political discussion.

(I know they would ensure laws about speech do not apply to themselves).

Derek Kerton (profile) says:

Re: Re: Re:

A better metaphor is that the web is the “public square”, and Facebook, etc. are big merchants with storefronts on the public square.

They are NOT the square itself, but ARE contiguous with that prime real estate.

As shops on a square lose business, they go away and get replaced by new ones, but the square remains. Similarly, MySpace and Friendster had prime addresses, but those addresses now belong to Twitter or Facebook.

Still, today, anybody can take a soapbox and go speak/rant in the public square, like Times Square in NYC. But one cannot go to Times Square with a soapbox, enter the ESPN Zone restaurant, and deliver a filibuster about how media is controlled by Jews and Disney is a tool of globalists.

Just so, Alex Jones can set up his own site on the virtual town square at http://www.infowars.com, but he may not be welcome inside Facebook.

Anonymous Coward says:

Re: Re: Re: Re:

… Alex Jones can set up his own site on the virtual town square at http://www.infowars.com…

Are the .com gTLD servers an essential public facility?

If you recall, the .com registry is operated by Verisign, a publicly-traded for-profit corporation, under agreement with ICANN, “formally organized as a nonprofit corporation ‘for charitable and public purposes’ under the California Nonprofit Public Benefit Corporation Law.”

Are the .com gTLD servers an essential public facility?

Can Verisign kick http://www.infowars.com out of the .com gTLD?

Anonymous Coward says:

Re: Re:

Given the control over the political process granted to corporations via Citizens United (and we know that this is the case due to numerous studies; check Noam Chomsky talking about this and you’ll get all the citations you need), corporate censorship becomes akin to government censorship.

Yes, corporations should be able to ban/edit/fiddle however they want, but I suggest that you prevent them from influencing government (campaign “donations”) too.

Killercool (profile) says:

Re: Internet sites have every right in the world to kick people off

There is a very narrow set of protected classes, and you probably know that there is, seeing as you jumped on the first one.

For the sake of argument, here they are:
Race.
Color.
Religion or creed.
National origin or ancestry.
Sex.
Age.
Physical or mental disability.
Veteran status.
Genetic information.
Citizenship.

This list doesn’t apply in every context, and you have to prove that the reason you were refused service was your membership in one of the protected classes.

So, if someone can prove the internet site banned them for being black, and not just for being an asshole, then NO, they can’t be banned. Just like a restaurant can’t refuse you service for being white, but it can kick you out for being disruptive.

Anonymous Coward says:

Re: Re: Internet sites have every right in the world to kick people off

There is a very narrow set of protected classes…

During the second half of the 20th century, and into the beginning of this one, an interesting change in legal thought —or perhaps in the popular conception of “legal” thought— has taken place. Back at the beginning of the 20th century there was a firmly established standard barring unreasonable discrimination in providing public accommodations and essential facilities.

“Unreasonable”, of course, was measured against the prevailing social attitudes of the day. And back in those days, Jim Crow had force, as shown by, among other things, the election of President Wilson in 1912. (Woodrow Wilson was a racist.)

Later on, tending towards the middle of the 20th century, it became established, through a number of cases at the federal level, that discrimination on the basis of race was “unreasonable”.

Further, the Supreme Court eventually came up with a concept of “inherently suspect classifications” that, if utilized, strongly tended to indicate unreasonable discrimination.

Towards the end of the century, we see the notion of “protected classes” grow into the popular mind.

At this point, today, popular thinking seems to have gone from a standard-based bar against unreasonable discrimination — to a rule-based bar against discrimination on the basis of “protected class”. People seem to have entirely forgotten about any standard-based bar against unreasonable discrimination in the provision of public accommodations and essential facilities.

In fact, some people seem to delight in the notion that large corporations may act unreasonably — and that the legislatures, courts, and the people themselves are quite powerless to stop these large corporations’ unreasonable behaviour.

Stephen T. Stone (profile) says:

Re: Re: Re:

At this point, today, popular thinking seems to have gone from a standard-based bar against unreasonable discrimination — to a rule-based bar against discrimination on the basis of “protected class”. People seem to have entirely forgotten about any standard-based bar against unreasonable discrimination in the provision of public accommodations and essential facilities.

We have a good reason for that: Unreasonable discrimination these days typically takes the form of discrimination based on who someone is rather than what they do. Refusing service to someone who wants a swastika-shaped cake is reasonable; refusing service to someone only because they are Jewish is not. Booting someone from a platform for saying dumb bullshit is reasonable; booting someone from a platform only because they are Black or gay or an atheist—or, for that matter, White or straight or Christian—is exceptionally unreasonable.

Anonymous Coward says:

Re: Re: Re:2 Re:

We have a good reason for that…

“Reasonableness”, at bottom, is a question for the jury.

Just consider, though, that any old judge can mechanically apply some rule without sending a complicated case to a jury. And, note, observationally, there’s been a drastic decline in jury trials over the past few decades.

Anonymous Coward says:

Re: Re: Re:2 Re:

Unreasonable discrimination these days typically takes the form of discrimination based on who someone is rather than what they do. Refusing service to someone who wants a swastika-shaped cake is reasonable; refusing service to someone only because they are Jewish is not.

That one’s actually tricky. Refusing service to someone of Jewish ancestry would be discriminating based on who they are; refusing because they express support for Jewish religious teachings would be discriminating based on what they do. If we say that’s not OK, does that make it wrong to discriminate against the Westboro Baptist "Church"? Should that depend on how sincere we think their beliefs are, how offensive we find them, or how widespread the "religion" is?

Similarly, being gay is not the same thing as choosing to have gay sex. The law protects both, but the latter is technically a choice.

Stephen T. Stone (profile) says:

Re: Re: Re:3

Refusing service to someone of Jewish ancestry would be discriminating based on who they are; refusing because they express support for Jewish religious teachings would be discriminating based on what they do.

The latter would still be considered religious discrimination, because the discrimination would be based on the expression of a religious belief. You can argue about whether such protections deserve to be law, but for now, US law says we all get them.

If we say that’s not OK, does that make it wrong to discriminate against the Westboro Baptist "Church"?

Yes, it does. I despise those homophobes and their message, yet I believe their odious beliefs should not disqualify them from the protections of the law and the exercise of their rights.

Should that depend on how sincere we think their beliefs are, how offensive we find them, or how widespread the "religion" is?

No metric exists that can accurately measure the sincerity of a religious belief, so we cannot use that. “Offensive” is a subjective standard based on personal standards, so we cannot use that. And using the size of a religious sect to determine who we can discriminate against is just asking for trouble.

Similarly, being gay is not the same thing as choosing to have gay sex. The law protects both, but the latter is technically a choice.

Two things.

  1. Being gay is still a reason people can lose their jobs, get evicted, and be refused service in a majority of US states.
  2. Technically, yes, having consensual sex with someone of the same sex behind closed doors is a choice; that said, the law protects people who engage in it from being arrested for it mostly because the law would otherwise unfairly target and punish queer people for doing things that straight couples can do without penalty.
Anonymous Coward says:

Re: Re: Re:4 Re:

No metric exists that can accurately measure the sincerity of a religious belief, so we cannot use that. “Offensive” is a subjective standard based on personal standards, so we cannot use that. And using the size of a religious sect to determine who we can discriminate against is just asking for trouble.

So what are we left with? If Jones said his videos are expressions of his religious beliefs, would he have a legitimate religious discrimination case against Facebook? What differentiates "religious" beliefs from personal opinions, if not popularity?

Being gay is still a reason people can lose their jobs

Not legally, according to the EEOC. I’d hope the same protection would exist for service-refusal and eviction.

Stephen T. Stone (profile) says:

Re: Re: Re:5 Re:

If Jones said his videos are expressions of his religious beliefs, would he have a legitimate religious discrimination case against Facebook?

Well, for starters, Jones’s lawyers are trying to argue in court that he is simply an “entertainer” whose outlandish statements are part of an act and thus should not be taken seriously, so there is that. But more to the point…

So what are we left with? […] What differentiates "religious" beliefs from personal opinions, if not popularity?

Hell if I know for sure. We do not and cannot have an objective standard by which we can judge the sincerity of an expressed belief or opinion, religious or otherwise. A lot of this sort of thing is a judgment call. That said: When those opinions belittle an entire segment of the population based on who they are (e.g., anti-gay religious beliefs) or defame people (e.g., Alex Jones’s “Sandy Hook was a false flag operation” claptrap), booting from a platform the people who express those opinions becomes much less morally questionable.

Not legally, according to the EEOC.

That only applies to federal employment. On the state level, around 30 states—last time I checked, anyway—have no such protections enacted for LGBT people.

Anonymous Coward says:

Re: Re: Re:6 Re:

            Not legally, according to the EEOC.

That only applies to federal employment.

The link that the other poster provided didn’t really support his assertion. Here are some better links from the EEOC—

Overview

EEOC interprets and enforces Title VII’s prohibition of sex discrimination as forbidding any employment discrimination based on gender identity or sexual orientation. . . .

Note that in the EEOC context, “Title VII” generally refers to Title VII of the Civil Rights Act of 1964.

Leigh Beadon (profile) says:

Re: Internet sites have every right in the world to kick people off

Anti-discrimination laws are a separate, specific thing that blocks the unequal provision of services on the basis of certain protected qualities such as race. They act explicitly as an exception to the more basic idea that a private entity can deny service to anyone it chooses – and generally yes, they apply equally to online services.

While there may be a separate (though related) discussion to have about such laws and their impact on various rights, they don’t change this broader analysis and they certainly don’t apply to the question of blocking based on political positions, viewpoints, etc.

Anonymous Coward says:

Re: Internet sites have every right in the world to kick people off

A restaurant has the right to ban someone who is loudly pushing a point of view and whose behavior is likely to drive other customers away. That person cannot claim that, because they have to eat, the restaurant must let them in; they have alternative ways of obtaining food.

Anonymous Coward says:

I’m finding it hard to square two things here. The first: “Internet sites have every right in the world to kick people off their platforms, and there’s no legal or ethical problem with that.”

The second: “Most people do use Facebook. And for many people it is important to their lives. In some cases, there are necessary services that require Facebook.”

Once necessities start going on a platform, then the ethical calculus changes. They don’t, and probably shouldn’t, have a legal requirement to host everything, but if Facebook is genuinely *essential*, then there *are* ethical issues with them banning people.

Christenson says:

Re: Re:

That the largest internet platforms (particularly twitter and facebook) are becoming like monopolistic public utilities is largely not disputed. One way of looking at it is that these two companies have perhaps 80+% of the general-purpose social media market.

Many of the complaints in our more trollish comments arise from the disconnect between the expectations that these are non-discriminatory public utilities and the legal reality that these are private entities.

Once something becomes a public utility, or a public accommodation, then non-discrimination becomes a legal requirement. That is what Net Neutrality is really all about. It’s been argued where to legally draw the line between a small site like Techdirt and a huge platform like Facebook.

I’m very much in favor of Techdirt’s approach: what I post is my speech, and Facebook’s algorithms promoting it without comment are Facebook’s speech. So Infowars on Facebook is not a problem, but Facebook promoting Infowars is a problem; it has certainly caused some real harms.

I’d be really interested to see what would happen if all the hotels in town decided not to serve people wearing sneakers…

Thad (profile) says:

Re: Re: Re:

That the largest internet platforms (particularly twitter and facebook) are becoming like monopolistic public utilities is largely not disputed.

Bullpucky. I just disputed it 90 minutes ago.

I agree that Facebook and Twitter have an alarming, outsized influence on our discourse. I do not agree at all that they are monopolies, or that they bear any resemblance to public utilities.

Many of the complaints in our more trollish comments arise from the disconnect between the expectations that these are non-discriminatory public utilities and the legal reality that these are private entities.

But that’s just it: the expectation that they are public utilities is disconnected from reality.

Anonymous Coward says:

Re: Scott Yates or beernutz, 59 comments total, 6 average per year,

because TEN AND A HALF years, 21 Dec 2007. 2 year gap after very first, which is typical, yet ODD: excited about a new site, then just drop it? HMM.

Also, it’s pretty clear (to me, who has list of nearly all accounts for the last 5 years) these RARE accounts only come out when FEW comments. — Exactly as if the site is trying to give the appearance of more interest.

Narcissus (profile) says:

Re: Re: Scott Yates or beernutz, 59 comments total, 6 average per year,

(to me, who has list of nearly all accounts for the last 5 years)

It’s really hard to say this without sounding insulting but you really need help. This kind of obsession is really not healthy. Please, please, please talk to somebody. I’m serious.

Alternatively, try to wean yourself off your compulsive behaviour of visiting and commenting on this site. Just go watch porn or something whenever you feel the need to come here.

Anonymous Coward says:

>Most people do use Facebook. And for many people it is important to their lives. In some cases, there are necessary services that require Facebook. And you should support that rather than getting all preachy about your own life choices, good or bad.

You could make the same argument for religion, slavery, cigarette smoking, etc. Just because an institution is widespread in society doesn’t make it efficient or just.

crade (profile) says:

* Quick aside to you if you’re that guy rushing down to the comments to say something like “No one needs to use Facebook […]”

Not quite… but certainly in my mind, the bar for when the government should need to step in and force companies to provide a particular service in a particular way should be higher than you imply. Facebook is not an essential service; if we all lose Facebook tomorrow, life goes on relatively unscathed.

Put otherwise… you create a tool, and lots of people like it and use it. At what point exactly do you cross the line from “you made an awesome tool and lots of people use it” to “your tool is now essential/important enough to society that we can’t trust you [who, unlike us, are able and insightful enough to build these tools that are so essential in the first place] to administer it responsibly, so in the interest of the public good we are taking control [to some degree]”?

To me, this thinking is backwards. For one, it’s kind of an unfair abuse of power and removal of important freedoms, but perhaps more importantly, the only people who have demonstrated that they are at all equipped to make informed decisions about what will make these tools most useful for society are mostly the people who were able to build such useful tools in the first place.

I thought a decent solution for this might be something akin to the ESRB ratings system: something of a standard created by and for the industry (even if only to preempt regulatory intervention) that would assist with transparency by laying out expectations for your users for how content will be monitored on your platform in a concise, clearly communicated and easily understood (if not necessarily always precise) manner.

Platforms could certainly allow their users to choose from a selection of available standards, but wouldn’t necessarily be required to do so, and more popular standards would spread.
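
To make that concrete, here is a rough sketch in Python (with entirely invented field names and categories, since no such industry standard actually exists) of what a machine-readable moderation-standard descriptor along those lines might look like:

    # Hypothetical sketch of an ESRB-style moderation descriptor a platform
    # could publish. Every name and category here is invented for illustration.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ModerationStandard:
        name: str                                          # e.g. "strict" or "anything-legal"
        removes: List[str] = field(default_factory=list)   # content the platform deletes
        labels: List[str] = field(default_factory=list)    # content it labels but leaves up
        appeals: bool = False                              # whether removals can be appealed

    STRICT = ModerationStandard(
        name="strict",
        removes=["illegal content", "targeted harassment", "spam"],
        labels=["disputed factual claims"],
        appeals=True,
    )

    def summarize(std: ModerationStandard) -> str:
        # A concise, user-facing summary of what the platform commits to doing.
        return (f"{std.name}: removes {', '.join(std.removes)}; "
                f"labels {', '.join(std.labels)}; "
                f"appeals {'available' if std.appeals else 'not offered'}")

    print(summarize(STRICT))

A platform could publish one or more of these, and users could compare platforms (or pick among offered standards) at a glance.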

Thad (profile) says:

Re: Re:

but certainly in my mind, the bar for when the government should need to step in and force companies to provide a particular service in a particular way should be higher than you imply.

Where in the article did you read any implication that the government should intervene?

I see an article that makes suggestions about how Facebook and Twitter should voluntarily provide technical solutions to their users to help mitigate this dilemma. I don’t see a government mandate mentioned anywhere.

Platforms could certainly allow their users to choose from a selection of available standards, but wouldn’t necessarily be required to do so, and more popular standards would spread.

That…sounds exactly like Mike is suggesting.

crade (profile) says:

Re: Re: Re:

“Where in the article did you read any implication that the government should intervene?”

This part:
But, at the same time, it’s more than a bit uncomfortable to think that anyone should want these giant internet platforms deciding who can use their platforms

The way I see it, there is only one way that this decision is actually removed from the “giant internet platforms”, and that is through regulation. If they implement what Mike suggests, that is just the result of their decision (at the moment); it doesn’t mean they aren’t making the decision.

“That…sounds exactly like Mike is suggesting.”
Yeah, that was the idea. I was trying to say you could still accomplish something very similar to what Mike was suggesting using this hypothetical ratings system.

My understanding of Mike’s suggestion is more for every platform to have all the possible options and let the users select what they want every time. In my mind this is expecting too much from your users and I think it needs to be simplified. You could choose to only implement one or even no standards or you could allow the users to select one, the key is that what you are doing is transparent to the users so they can decide accordingly.

Stephen T. Stone (profile) says:

Re: Re: Re:

My understanding of Mike’s suggestion is more for every platform to have all the possible options and let the users select what they want every time.

I hate to keep bringing up Mastodon when talking about this kind of subject, but the Masto protocol has per-post privacy settings: A post can be publicly visible (with the option of showing it or hiding it on the public timelines), visible only to followers, or a direct message to a mentioned user. It also has a “content warning” system that allows for putting sensitive topics (or the punchline to a bad joke) behind a warning box that a user can choose to click. That functionality extends to images as well, and an account setting can automatically hide all images behind a “sensitive image” warning. As far as block/mute functionality, a user can mute or block an individual user or, if necessary, an entire instance (useful for victims of a coördinated harassment campaign from that soon-to-be-blocked instance). It might not be exactly what Mike was suggesting, but hey, it’s a start.
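
For what it’s worth, those per-post controls are also exposed through Mastodon’s REST API, so third-party clients can use them. A minimal sketch of posting a followers-only status behind a content warning; the endpoint and parameter names reflect my reading of the Mastodon docs, so double-check them before relying on this:

    # Sketch: post a followers-only Mastodon status behind a content warning.
    # The instance URL and token are placeholders; endpoint/params per my
    # understanding of the Mastodon API, so verify against the current docs.
    import requests

    INSTANCE = "https://mastodon.example"   # hypothetical instance
    TOKEN = "user-access-token"             # an OAuth token obtained beforehand

    resp = requests.post(
        f"{INSTANCE}/api/v1/statuses",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "status": "Long post about content moderation...",
            "spoiler_text": "politics",     # the "content warning" readers must click through
            "visibility": "private",        # followers-only; other values: public, unlisted, direct
            "sensitive": "true",            # hide any attached media behind a warning
        },
    )
    resp.raise_for_status()
    print(resp.json().get("url"))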

crade (profile) says:

Re: Re: Re:4 Re:

You bet, but you are uncomfortable with the decision in your case, not with the fact that they can make the decision.

You don’t say you are uncomfortable with them deciding whether or not to spew their hate speech outside funerals, because (in my mind at least) that is the same thing as saying you are uncomfortable with them having the legal right to do so (which certainly isn’t the same as saying you want to remove that right).

Ryunosuke (profile) says:

A few things..

A platform (Facebook, Twitter, Periscope, etc.) has the right to refuse business to individuals or groups. The reasoning is two-fold. First, that does not mean that these groups can go to Fox News or Sinclair to peddle their poison.

And second (and probably more importantly), there ARE federal laws, felonies at that, against inciting violence and riots. There are NOT, however, any federal laws against incitement of ethnic hatred (which I guess is what they are counting on).

Leigh Beadon (profile) says:

Re: Choosing what content winds up in your Newsfeed

Good idea and there’s already a fully established and open platform they could base this on: RSS. Pretty much every website out there already has an RSS feed ready and waiting.

I’ve been thinking about that a lot lately. Imagine a web where much of modern social media grew out of RSS rather than proprietary platforms. I can envision a world where the hosting/publishing functions of Twitter and Facebook are totally distributed and syndicated, and they are instead focused on being end-user applications for viewing/aggregating content in your chosen way. There would be capability tradeoffs and different engineering challenges for sure, but man does it sound better than what we’ve got…

It’s so easy to envision how we could have gone down that path instead of the one we did, but now we face the much more challenging question of how we get from here to there.
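
You can already approximate that "end-user aggregator" role today with nothing but standard feeds. A minimal sketch, assuming the third-party feedparser package and placeholder feed URLs, that merges a few RSS feeds into one reverse-chronological timeline:

    # Sketch: build a personal "timeline" by merging a few RSS feeds, newest
    # first. Feed URLs are placeholders; requires the feedparser package.
    import feedparser
    from time import mktime

    FEEDS = [
        "https://www.techdirt.com/techdirt_rss.xml",
        "https://example.com/friend-blog/rss",   # hypothetical feed
    ]

    entries = []
    for url in FEEDS:
        parsed = feedparser.parse(url)
        source = parsed.feed.get("title", url)
        for e in parsed.entries:
            published = e.get("published_parsed")
            timestamp = mktime(published) if published else 0.0
            entries.append((timestamp, source, e.get("title", ""), e.get("link", "")))

    # The "view it however you like" part; here, just newest-first.
    for _, source, title, link in sorted(entries, reverse=True)[:20]:
        print(f"[{source}] {title}\n    {link}")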

Anonymous Coward says:

Re: Re: Re: Choosing what content winds up in your Newsfeed

Ready Player One involved a centrally-controlled virtual world. Parzival’s group wanted to win because otherwise IOI would have absolute power over something that ruled over everyone’s lives (much more powerfully than Facebook). And they chose to run it democratically, but the story doesn’t go so far as to put any guarantees against future reversals; I don’t recall them making any "constitution" for the virtual world.

RSS is neat… not fully decentralized, because it usually relies on DNS and registrars can arbitrarily revoke domains for now. In theory you can run it on a .onion address. It has the obvious problem that it’s a "pull" system: millions of clients checking every few minutes whether something new has happened takes some heroic efforts (CDNs) to scale.

Anonymous Coward says:

Re: Choosing what content winds up in your Newsfeed

The problem with RSS is that it does not scale well with the number of subscribers, unless the site can afford a CDN. The alternative is a centralized reader, so that it can make one subscription and feed it on to however many people using its service subscribe to that site.

In the social media arena, this is one of the areas where Facebook and Twitter have an advantage over a distributed protocol: they can handle the user/subscriber issues better than small instances of a distributed protocol can, and they can deal with a large number of followers or subscribers to a single person’s feed.
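
A quick back-of-envelope sketch (all numbers invented purely for illustration) of why naive polling hurts the publisher and a centralized reader helps:

    # Back-of-envelope: request load on a feed's origin server under naive
    # polling versus a single centralized reader. Numbers are invented.
    subscribers = 1_000_000        # readers following one popular feed
    poll_interval_s = 300          # each reader polls every 5 minutes

    naive_rps = subscribers / poll_interval_s
    centralized_rps = 1 / poll_interval_s

    print(f"Naive polling:      ~{naive_rps:,.0f} requests/second at the origin")
    print(f"Centralized reader: ~{centralized_rps:.4f} requests/second at the origin")
    # Roughly 3,333 req/s versus one request every five minutes, which is why
    # a CDN (or one reader polling on everyone's behalf) starts to look necessary.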

Anonymous Coward says:

re: disclaimer issue

If you put prominent disclaimers on pages, or append them to posts by speakers who are known to be kooks promoting outright falsehoods, it’s nice to think it’d be educational, but does that open the site that puts the disclaimer up to free speech complaints? The speech is not censored per se, but how viable would a complaint be that the disclaimer itself is a form of censorship and free speech interference?

I think it’s a good idea in theory for extreme views that are clearly bullshit and unfounded, but I can also see it going badly for views that are founded in reality but are politically or publicly unsupported. True, it would be better than outright deleting or no-platforming unpopular views, but even the disclaimer idea of having the website actively give its implicit approval or disapproval to particular viewpoints is somewhat troubling. Do we want large website moderation teams acting as arbiters of truth? And rubber-stamping posts instead of simply deleting inconvenient ones?

I do think it’s a better idea than simply saying websites should hunt down and delete bad content, though, because there’s gonna be some good content lost to overzealous moderation (due to the nature of it being subjective) and also because it won’t push bad content into the dark where it can’t be publicly refuted. But it’s not a perfect solution either, and it’s possible those disclaimers could have a similar no-platforming martyrdom effect for groups such as the Nazis, the KKK, etc., who could use the disclaimers as evidence of a conspiracy.

On small websites, troublesome users can be dealt with individually without the martyrdom effect, but it’s still a real problem: how do you deal with (and get rid of) someone who is a crap member of the community without giving them the means to cry foul?

Anonymous Coward says:

Re: Re:

If you put prominent disclaimers on pages, or append them to posts by speakers who are known to be kooks promoting outright falsehoods, it’s nice to think it’d be educational, but does that open the site that puts the disclaimer up to free speech complaints?

People can complain about whatever they like, so, yes. It’s probably even illegal, if done by a government, but the chance of a court finding against a private company on this basis is negligible. The idea of anyone having a right to speak on a specific platform is itself shaky.

Zgaidin (profile) says:

Mike, you’re not wrong, but I think you may be indulging in too much idealism. I responded above to Mark Murphy (as AC since I forgot to log in first) about protecting your niche as a social media platform, and I think it touches on the actually important give and take of the debate, the business vs. the tech. The further you move from platform toward protocol, the harder it is to square the business end of things, the harder it is to say how you will profit today and tomorrow from the tech you have created and the service you provide. It’s not impossible, but anyone who has ever worked tech support can tell countless stories about how clueless upper management can be about the tech at their disposal. Anyone who has ever worked the management side of a business can tell you that, usually, tech guys don’t have a very good grasp on exactly how much negative impact a service outage or network slowdown can have. There’s a reason VC incubators for tech companies exist.

For really the first time we have something that looks sort of like competition in this marketplace, instead of everyone abandoning Geocities for LiveJournal, then abandoning LiveJournal for Myspace, etc, but it’s not exclusive competition. If I use Twitter, that doesn’t mean I don’t also use FB, LinkedIn, YouTube, and Snapchat. What they’re competing for is time and attention to keep eyes on ads – but that makes each of those sites, large and powerful as they seem to us, uniquely vulnerable to public outcry and so they cave to vocal minorities. We all know this. The question is, is there a solution that’s satisfactory to both end users and the bean counters? Is there another revenue stream they could focus on that didn’t make them so vulnerable to passing public whimsy? Is there an option they could explore that would leverage their incredible public presence to lessen the absurd tribal animosity currently prevalent in social and political discussions to undermine brigading types of behavior?

Zgaidin (profile) says:

Re: Re:

@Mike & Leigh – I was typing this while you were responding to my post above. Yes, people make money off e-mail in all sorts of ways, but that wasn’t always the case, only now that it’s an established protocol in wide use. Yes, VC money is flowing into tokens/cryptocurrency because it’s largely uncharted territory with the potential for a huge payoff. You’re talking about combining those two business issues in the worst possible way. Marketing an unknown protocol (even a great one) in a profitable way in an already crowded and dominated space. That’s a tall order.

Leigh Beadon (profile) says:

Re: Re: Re:

Yes, people make money off e-mail in all sorts of ways, but that wasn’t always the case, only now that it’s an established protocol in wide use

I’m not sure that’s true. In my experience, in the early days of widespread public adoption of the internet, email service was very much a part of what you were paying your ISP for.

Most people had ISP-hosted email, and it was more clearly understood by the early adopters that by buying internet service you were getting the necessary infrastructure to make use of multiple different protocols – not just the ability to make http requests and access the world wide web, but also things like an account on a news server (owned and operated and maintained by your ISP) to access Usenet, and an account on a mail server (same) in order to send and receive emails.

While the email protocol itself was open and free, it was only later that ways to make use of it for free became ubiquitous – and indeed in the early days of free webmail, people questioned whether it was even a sustainable service to offer, much less a profitable one.

crade (profile) says:

Re: Re: Re:

I don’t see how a distributed protocol / app combo would be any more a difficult sell than another hosted platform. You don’t market the protocol, you market the end user experience.. I think you would face no more adoption hurdles than if you were trying to create a new hosted platform (which, don’t get me wrong.. are massive).

Thad (profile) says:

Re: Re:

For really the first time we have something that looks sort of like competition in this marketplace, instead of everyone abandoning Geocities for LiveJournal, then abandoning LiveJournal for Myspace, etc, but it’s not exclusive competition. If I use Twitter, that doesn’t mean I don’t also use FB, LinkedIn, YouTube, and Snapchat.

I disagree; GeoCities, LiveJournal, and MySpace were never anywhere near as dominant as Facebook or Twitter.

John Smith says:

Re: Re: Re:

USENET was a blockchain or set up like what we call blockchains today.

USENET also has the free speech everyone claims to want.

AOL was more dominant from 1993-1996, and abused their censorship power to the point of destroying the company. They were the only service at the time that had reliable, instantaneous e-mail (same server), they had keywords before websites, a payment system that could have been Paypal, and yet they threw it all away by chasing everyone onto the web, which slowly caught up technologically.

There will always be a hungry young internet company that wants to gain market share by offering free speech, just as surely as once it gains that market share, it will want to clamp down on dissent or anything it can’t control.

Anonymous Coward says:

Re: Re: Re:

GeoCities in particular always had a poor reputation. It was for the masses who couldn’t even get a “~” page from their ISP, let alone run a “real” site. Angelfire, Tripod, etc. were in the same group, and were real competition. People were already using directory services and search engines by then and didn’t much care which host you used, except to make fun of it; a GeoCities site had no major advantage over the others.

The directory listings, though, were still important in the early days. Yahoo’s listing service might have been the closest thing to a monopoly at the time.

Anonymous Coward says:

Internet services have a right to ban people. This is fact. But just because you have a right to do something, doesn’t mean that doing so is right.

And if a service is, as you say, “close to necessary to take part in modern day life,” perhaps it shouldn’t have the right to ban people after all.

Or on the other hand, perhaps the service should instead be made less necessary.

Either way, there is definitely something very not right in a situation in which a person could be excluded from “necessary” functions of society in retaliation for expressing his/her opinions.

Anonymous Coward says:

I've said it before...

and am saying it again. I have no use for social media myself. But if you’re going to get on the internet, put on your big boy/girl pants and be aware you might see something, or rather you probably WILL see something, that triggers or offends you. Get over it and yourself.

You are, GASP, not special. The internet does not exist just for you alone.

Social media’s problem is that they try to moderate content to protect some groups. Short of someone posting threats, kiddie porn, incitement to riot or rebellion, or other unlawful acts, QUIT MODERATING CONTENT!!!

If it’s not unlawful, it’s a self correcting problem. People who don’t like or are offended by someone’s post won’t be back…problem solved.

FFS everyone needs to grow the fsck up and quit crying when they see something they don’t like.

Christenson says:

Re: I've said it before...

Just curious, what part of people getting killed (Heather Heyer) and jailed for years (the Pizzagate “investigator”) and hounded out of town (Sandy Hook victim’s parents) because of Alex Jones should I just “get over as a self-correcting problem”???

I’m generally in favor of free speech, but when the algorithms on a very influential platform such as facebook lead to mobs, we’ve got a problem.

Leigh Beadon (profile) says:

Re: I've said it before...

If you are not a social media user, then you lack the experience of how certain kinds of content can proliferate and turn a platform into a place nobody wants to be.

And yes, it’s true that “people who don’t like or are offended by someone’s post won’t be back” and… here’s the thing: social media companies want users, they don’t want people to leave and never come back. And they also want to be a place frequented by celebrities, experts, politicians, interesting people – because that’s good for their business and their brand.

So when persistent toxic behavior by some subgroup of users is proliferating and driving away other users, especially high-quality users, it becomes a business issue for the platform to figure out how to foster a better community.

Plus, at the end of the day, some people don’t want to be the owners and operators of a forum overrun with holocaust denial or misogynistic harassment or racist insults or what-have-you. Many talented people don’t want to work at a place like that. Companies and advertisers don’t want to partner or be associated with a place like that.

As this post points out, this doesn’t mean the solution is “just try to ban all the bad stuff”. But, sorry, your “grow the fuck up” attitude isn’t going to fix anything either, or change this situation at all. And if you want to see what a totally unmoderated social media platform looks like, try signing up for gab.ai.

Christenson says:

You've put your finger on something...

So when persistent toxic behavior by some subgroup of users is proliferating and driving away other users, especially high-quality users, it becomes a business issue for the platform to figure out how to foster a better community.

OK, by putting everyone in their filter bubble, Facebook and Twitter have been able to avoid the business issue….or is there more to it??? Or am I just not looking on a long enough timescale?

Zgaidin (profile) says:

Re: You've put your finger on something...

Yes, to some extent by putting people in filter-bubbles/echo chambers, you avoid the problem, for a time at least. That said, I wouldn’t be surprised if that exact behavior contributed to the extreme polarization we see in public discourse these days. If people rarely ever hear/see an opinion they disagree with, they’re more likely to react badly or over-react in the rare times they do, and filter bubbles often propagate an us v. them attitude. Then, all it takes is someone getting aggressive/trollish and posting something where they know the other readers won’t like it, and poof – instant dumpster fire.

And that’s the situation these platforms seem to find themselves in so often. Someone from r/TheDonald cross-posts something to r/Politics and instead of everyone saying “Wow, that guy’s an idiot, moving on…” they engage, yell at each other, and the place becomes toxic. Now the admins feel they have to act before the whole place becomes a cesspool, but they’re a) human, b) understaffed to deal with the volume of content, and c) never going to please everyone with any attempt at censorship – even if it’s warranted and totally within their rights.

Leigh Beadon (profile) says:

Re: You've put your finger on something...

OK, by putting everyone in their filter bubble, Facebook and Twitter have been able to avoid the business issue….or is there more to it??? Or am I just not looking on a long enough timescale?

I think the "filter bubble" aspect is somewhat overstated. The analysis does diverge a bit for Twitter and Facebook though.

For Twitter, a huge part of the appeal and core function of the site is the ability to connect with people outside your immediate circle – inasmuch as it has filter bubbles, they are extremely porous. This is reinforced by nearly every aspect of the design of the site: retweets, subtweets, notifications of what people in your network like or follow, hashtags that link to an open feed of other people from all across twitter, prominent trending topic links that do the same, etc.

Moreover, core to Twitter’s appeal is public figures or just interesting people maintaining a public presence. Twitter and its userbase do not benefit if more people make use of its "filter bubble"-ish capabilities like having a private account or muting all replies.

So to take a specific and widespread example, there is a huge problem with constant and frankly insane harassment of women who have a large twitter presence, especially in certain industries like game design. They face an unduly disproportionate amount of aggression – including coordinated harassment campaigns employing tactics to get around the muting/blocking features that exist. This has driven many women off the platform. Twitter doesn’t want this. They could also switch to a private account, and put themselves in a stronger filter bubble, but Twitter doesn’t want this either – nor do they.

Facebook is somewhat different because it has many different usage patterns and a larger "private, immediate circle" aspect in some respects. However, interconnection is still a big deal: public pages and events are very important to Facebook, and important to its advertising business model. Facebook provides lots of routes out of your "bubble", showing you popular pages or those your friends interact with, etc. It is also home to several large and more public general-interest forums, such as the Facebook pages of major news organizations, popular TV shows, etc. – and it does not want these places ruined by toxicity either, because then these all-important large organizations might pack up and leave.

And that’s just the briefest look at these platforms and how people use them. I won’t even get into YouTube, iTunes, Steam, WordPress, Wikipedia, app stores – all platforms facing these same challenges and all with unique needs.

The whole idea of "filter bubbles" is real and it is very much a factor and a force in how we communicate and consume information in 2018, but it is not an absolute or even necessarily the most dominant trend – and it’s not an automatic solution for the challenges of Twitter or Facebook.

Communications Decency NOT Corporate Deviltry says:

"No, I'll surrender more of my Rights to corporations than you!

What looks like discussion above is a few fanboys eager for fascism. Yes, there are a couple who stick up for actual free speech, which is anything within common law and NOT to be controlled by corporations, but most of the comments are eagerness to be ruled by corporate royalty.

Communications Decency NOT Corporate Deviltry says:

Is Section 230 CDA to benefit The Public or corporations?

Obviously The Public by providing speech outlets. You could not find a single politician who’d say it’s to benefit corporations. — Though it’s entirely possible that was and is the intent: a stealthy form of censorship.

Corporations have PR departments with large budgets to get their message out. It’s only individual "natural" persons who need outlets for their views.

Masnick highlights that CDA 230 provides immunity that allows corporations to HOST content without the liability of PUBLISHING it:

"No provider or user of an interactive computer service shall be held liable on account of-

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected…"

Note first that states causes valid in common law. It’s a requirement for simple decency. Not controversial so far…

Communications Decency NOT Corporate Deviltry says:

Corporatists try to twist CDA into control of OUR PUBLISHING!

How did "Communications Decency Act" get twisted into authorizing complete corporate control of The Public’s new forums?

Because the ultimate purpose of Masnick / EFF blather (both funded by Google) is to sweep on to:

They claim that corporations can use that "restrict access or availability of" clause to, whenever they wish — even explicitly over YOUR First Amendment Rights — step in and EDIT comments, to the point of becoming THE publisher, even to PREVENT we "natural" persons from publishing on "their" platforms at all!

But those are OUR platforms, The Public’s, NOT theirs.

Corporations are allowed (by The Public) merely to operate the machinery which is to convey The Public’s views. Corporations are NOT to control who gets on, nor WHAT The Public publishes, except under OUR clear common law terms.

But Masnick is for corporations CONTROLLING the speech and outlets of "natural" persons. That’s repeated often here, can’t be mistaken:

"And, I think it’s fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."

https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml

Masnick is not hedging "lawyers say and I don’t entirely agree", or "that isn’t what I call serving The Public", but STATES FLATLY that corporations have arbitrary control of MAJOR outlets for The Public! He claims that YOUR Constitutional First Amendment Right in Public Forums are over-arched by what MERE STATUTE lays out!

Such control (by ANY entity) to remove First Amendment Rights from The Public CANNOT be purpose of ANY statute. It’d be null and void because directly UN-Constitutional.

It’s NOT law, only the assertion of corporatists: no court has yet supported what Masnick claims. Corporatists are reaching for the moon without even a step-ladder. It’s simply a trick Fascists are trying to pull.

The provisions over-riding First Amendment Rights ONLY apply if done in "good faith" for The Public’s purposes. A corporation intent on stopping YOUR publishing in favor of its own views CANNOT be "good faith" for The Public, but only de facto tyranny and censorship.

Communications Decency NOT Corporate Deviltry says:

Posting on a "private web-site" is NOT a "privilege",

any more than is reading it. THAT’S THE PURPOSE.

1) What does "private" even mean when published and invites entire world?

2) WHO owns a "web-site", anyway? Like physical business, if allow The Public in, then have CEDED some right to "private property". The Public gains, NOT loses. That’s the deal.

3) Where is this "corporation"? Show it to me. And UNDER WHAT PRIVILEGE AND RULES is it even allowed to exist? — By The Public giving it permission, and NOT for the gain of a few, but for PUBLIC USE.

4) Again, mere statute doesn’t over-ride The Public’s Constitutional Right. And no, corporations are NOT persons, do not have rights, they are FICTIONS.

5) The Public’s use is the PURPOSE of any and every web-site. If allows comments, then it’s governed only by common law terms: no arbitrary exclusion. Two-way communications is the purpose of teh internets.

Christenson says:

Re: Posting on a "private web-site" is NOT a "privilege",

1) Private means that some entity may arbitrarily decide what’s to be presented on the site. That anyone may see is the public benefit.

2) Techdirt has an owner. Exactly who, I don’t care, they keep up their end of the bargain nicely and the public gains from what’s posted and the discussion thereof. CDA 230 gives those owners a shield from legal liability if you, a member of the public, abuse that discussion facility in the owner’s opinion.

3) Corporations are a legal fiction of the common law, recognised in the laws of all 50 states. The purpose is to allow large undertakings (classically, steel mills and factories) that are too large for any one individual to finance. Techdirt is small enough it might be a sole proprietorship.

4) Most of the discussion here is about the rights of the public in respect of two websites/corporations with outsized influence. We generally agree that the 1st amendment prohibits the government from deciding what sites should do; but when we get to a private site like Techdirt it should be obvious that the owner should be in ultimate control. We have been arguing back and forth about whether the outsizedness of the two websites should make them more “public”, or more “private”. See laws about public accommodations.

5) Not every website has to serve the PUBLIC directly. CDA230 specifically tells techdirt it’s not legally liable for whatever it may or may not decide to do with comments from the public, or for whatever the public may say in those comments. This prevents all manner of harm and allows techdirt maximum freedom to make a beneficial website. Part of that is not driving away users/readers by a comment section full of trolls.

Stephen T. Stone (profile) says:

Re:

What does "private" even mean when published and invites entire world?

As I have explained before (and you have refused to acknowledge due to either willful ignorance or brain damage), private does not mean the same thing as privately owned. A brick-and-mortar store such as WalMart is privately owned yet open to the public; the same can be said of a service like Facebook.

Anonymous Coward says:

Re: Re: Whichever of mine, I can't see because censored...

As I have explained before (and you have refused to acknowledge due to either willful ignorance or brain damage), private does not mean the same thing as privately owned.

Yeah, but you ‘splaining ain’t all there is to it! You’ve got a word trick there, is all.

Why don’t you ‘splain why you’re FOR corporations controlling The Public’s speech in the forums that were explicitly created FOR The Public? — It’s even against your own interests! No matter how you despise Alex Jones, nothing he says is outside common law. YOU are setting up a real censorship regime, NOT ME.

I’m just typing in some text on a web-site that has HTML for the purpose. Why do you even want to hide it? Need to ‘splain that too! — As the real AC below asks.

Anonymous Coward says:

Re: Re: Re:2 Whichever of mine, I can't see because censored...

Ah yes, that awful trick where words have meanings.

Not an answer, either.

Since you started on new point, I will too.

My repeat has been up 44 minutes now WITHOUT being censored. Do you (or any person) have ANY control over the "hiding", or is it entirely automatic, from some number of clicks? — I’ve asked this dozens of times, but here you are responding, so I can hope for an answer at last? …

Christenson says:

Re: Re: Re:3 Whichever of mine, I can't see because censored...

Troll, Troll, Troll your boat…
Gently down the stream….

Funny, I can click on the underlined gray link to show me the flagged posts and see them, no problem!

I’m not sure exactly how flagging works to auto-hide posts, but I think it takes more than one flag, and Techdirt Insiders might have more of a vote than me. So it’s a collective action. And I’m a volunteer, and I’m going on strike right now, inconveniencing no one!
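
Purely as an illustration of the kind of mechanism being guessed at here (the weights and threshold are invented, and this is not a description of Techdirt’s actual system), a weighted flag threshold might look like:

    # Hypothetical sketch of a weighted flag-threshold auto-hide. The weights
    # and threshold are made up; this is not Techdirt's real implementation.
    HIDE_THRESHOLD = 3.0

    def flag_weight(flagger: dict) -> float:
        # Guess: long-standing "Insider" accounts might count for more.
        return 2.0 if flagger.get("insider") else 1.0

    def should_hide(flaggers: list) -> bool:
        """flaggers: the users who flagged a given comment."""
        return sum(flag_weight(f) for f in flaggers) >= HIDE_THRESHOLD

    # Example: one regular reader plus one Insider flag a comment.
    flaggers = [{"name": "reader1", "insider": False},
                {"name": "reader2", "insider": True}]
    print(should_hide(flaggers))   # True: 1.0 + 2.0 meets the 3.0 threshold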

Stephen T. Stone (profile) says:

Re: Re: Re:

Yeah, but you ‘splaining ain’t all there is to it! You’ve got a word trick there, is all.

No, I do not. But since you seem unable to parse English well on your own, Ricky, let me explain.

Going by the Oxford English Dictionary, the first definition of private is “belonging to or for the use of one particular person or group of people only”. Several sub-definitions refer to things “not to be revealed to others” and a place that is “quiet and free from people who may interrupt”. A private space, for example, would be one reserved for a handful of people by someone in that group for the purpose of a private meeting.

The third definition, however, refers to services and industry. That definition reads as such: “Provided or owned by an individual or an independent, commercial company rather than the state.” By turning the adjective private into the adverb privately, we can logically conclude that a privately owned space is one owned by an individual or an independent commercial company instead of the state.

A privately owned service such as Twitter can be private if the service owner limits usage of the service to a small number of people and prevents anyone else from seeing anything that happens on the service. It can also be open to the public like Twitter is while retaining its privately-owned status. Because that service is a privately owned platform for speech, the platform’s owners and operators have every right to decide what is and is not acceptable speech as well as who will or will not be allowed on that platform.

Anonymous Coward says:

Re: Re: Re:2 Then Twitter is NOT private!

A privately owned service such as Twitter can be private if the service owner limits usage of the service to a small number of people and prevents anyone else from seeing anything that happens on the service.

Uh, I may have neglected to mention that distinction is irrelevant in my opinion, besides that it’s not the focus I wish.

But, since you just PROVED that Twitter is NOT a private site, then my point stands: that The Public has been CEDED rights just as Wal-Mart does by inviting person physically.

Oh, and READ the Sandvig decision. It expressly notes that "internet" forums have become PUBLIC SPACES. — Don’t get hung up on the word "spaces" now, doesn’t mean physical…

Stephen T. Stone (profile) says:

Re: Re: Re:3

I may have neglected to mention that distinction is irrelevant in my opinion

You are free to ignore the meaning of words. By the same token, we are free to mock your open admission of willful ignorance by insulting you.

It expressly notes that "internet" forums have become PUBLIC SPACES.

Even if forums and social interaction networks are open to the public, unless they are owned by the public/the government, they remain privately-owned services. You literally cannot force Twitter, in any way, to provide a platform for your speech or guarantee you an audience for that speech.

Gwiz (profile) says:

Re: Re: Re:3 Then Twitter is NOT private!

READ the Sandvig decision. It expressly notes that "internet" forums have become PUBLIC SPACES…

So what? Common areas in malls are also "public spaces" that are privately owned. If you tried to set up a soapbox and a PA system in a mall so you could spout your nonsense, you would be asked to leave. If you refused to leave, you could be legally trespassed from the property. This would not be a violation of your First Amendment rights because it’s not the government that is restricting your speech, it’s a private entity.

The same concept applies on the internet. If you want a platform to express yourself on without restrictions, either buy your own servers or build your own mall.

Communications Decency NOT Corporate Deviltry says:

Yet fanboys promote mega-corporations to rule over themselves!

THAT is the BIG puzzle at Techdirt. What’s in it for fanboys? — They’re happy to throw away their own First Amendment Rights! And another part of the US Constitution, too, which is interesting to compare: they say content producers have NO right to income from or control over copies of their creations, but these pirates then stand up for even larger corporations to control their own SPEECH!

On that point, I’m going to assume they’re just aping Masnick.

And what is the (mere) statute or court decisions that grant corporations the RIGHT to over-rule First Amendment to absolutely and arbitrarily control "platforms" as Masnick says? — There is NONE! It’s just his assertion. — AND EVEN IF WERE, THAT CAN CHANGE.

So WHY assert that corporations operating "platforms" are empowered to control the speech of "natural" persons? That’s EXACT OPPOSITE INTENT OF ALL LAW.

Only Mitt Romney and Mike Masnick will say that corporations are "persons" — Romney got roundly hooted for it, and Masnick only does it here in this little walled garden where he’s cultivated vegetables who don’t question him.

Masnick is a total corporatist. Only the mistaken presumption that he acts in "good faith" and shares YOUR views gives him any credibility. — Take away that presumption for a week, and READ what he writes: he’s very open about particularly that "platforms" have an alleged First Amendment Right to arbitrarily control access of we "natural" persons. Masnick believes not only that The Public can be denied access, but since Google controls search, that it can effectively "hide" speech even on alternative smaller outlets you’re forced to use. — Masnick uses "hiding" right here to disadvantage dissenters until they give up and quit commenting. He can thereby claim doesn’t censor.

Corporatists are going for TOTAL control over "natural" persons, period.

Christenson says:

Re: Yet fanboys promote mega-corporations to rule over themselves!

Flagged…for 5 posts in a row, all long.

And for not being able to realize that good thoughts require time and patience to develop, minds require time and patience to change.

We’ve all come to realize that a few operators (or is that nations of the internet?) have outsized influence that has led to real, meatspace harm when a site like InfoWars was promoted, and we are trying to work out a rational response. Ham-fisted laws aren’t gonna do it. Education might do it, but will take years. Competition doesn’t look too good either. It’s complicated. Partisanship doesn’t help.

John Sprell says:

No one is forcing anyone to go to Jones social media page

Sorry, these monopolies need to become public utilities. A lot of people were calling my Senator demanding so. Twitter, Facebook and YouTube are a monopoly and if they get away with this, they will do it to other political points whether it be on the left alternative media or right. No though, NO ONE IS FORCING ANYONE TO GO TO JONES PAGE. THIS IS BECAUSE TRUMP WON AND THE INTERNET HELPED THAT HAPPEN UNLIKE THE CORPORATE MEDIA PRESS THAT SAID ZERO ABOUT TRANS PACIFIC PARTNERSHIP. KEEP THINKING IT’S RIGHT TO CENSOR AND YOUR DAY WILL COME TECHDIRT. NO PUBLIC UTILITY? THE RIGHT JUST IN 2018 will push these platforms whether they like it or not.

Christenson says:

Re: Re: No one is forcing anyone to go to Jones social media page

Leigh:
There’s a pretty direct connection between Alex Jones, promoting Alex Jones on Twitter and Facebook, and some real meatspace harm.

At scale, editorial decisions and creating newsfeeds are speech. Twitter and Facebook have lent their credibility to the nonsense from Mr Jones, such as Pizzagate and his interpretation of the Sandy Hook tragedy. This may not be purposeful, but it is the effect. Disclaimers may not un-do it, either.

For Techdirt, note that the scale means this effect of lending credibility is different in kind from what happens when I post my nutty comments here. “I have a bridge for sale”, lol.

Leigh Beadon (profile) says:

Re: Re: Re: No one is forcing anyone to go to Jones social media page

There’s a pretty direct connection between Alex Jones, promoting Alex Jones on Twitter and Facebook, and some real meatspace harm.

I didn’t say there wasn’t, did I?

Not sure if this is the comment you meant to reply to, but my snarky mention of the Beatles as a trio was simply pointing out the inherent absurdity of listing three companies and saying they are a "monopoly" since that, y’know, is by definition not what that word means.

Christenson says:

Re: Re: Re:2 No one is forcing anyone to go to Jones social media page

Monopoly means nothing without a market definition behind it. And, as stated earlier and by others, it’s the outsize influence, market concentration, and lack of consequences for boneheadedness allowing real harm that’s the real issue.

Facebook is close, if the market is defined as posts between friends with a newsfeed from outside the immediate circle.

Twitter is close, if the market is short announcements. Twitterverse is very close to being in dictionaries.

Youtube, if you are sharing the video form, same.

Stephen T. Stone (profile) says:

Re: Re: Re:3

Facebook is close, if the market is defined as posts between friends with a newsfeed from outside the immediate circle.

Not really. While some social interaction networks might not have newsfeeds, they share similar base functions with Facebook. Hell, I could call Facebook an evolved version of MySpace or LiveJournal and still be somewhat right.

Twitter is close, if the market is short announcements.

Not really. Anyone can make short announcements on Tumblr, Facebook, YouTube, Instagram, or literally any other SIN.

Youtube, if you are sharing the video form

You could call that one close, yes, since the direct alternatives to YouTube have not taken off in the same way YouTube did.

Leigh Beadon (profile) says:

Re: Re: Re:4 Re:

You could call that one close, yes, since the direct alternatives to YouTube have not taken off in the same way YouTube did

And even that’s not entirely true, as both Facebook and Twitter also host videos. Also, Facebook can easily be used as a public microblogging service like Twitter, via public posts and its "subscribers" feature. And Twitter can be used with a private account that only follows friends, with optional access to additional newsfeeds via lists & trending topics, thus making it work much like Facebook.

All these services are in direct competition with each other, even though many people use all three.

Leigh Beadon (profile) says:

Re: Re: Re:3 No one is forcing anyone to go to Jones social media page

Facebook is close, if the market is defined as posts between friends with a newsfeed from outside the immediate circle.

Well yeah, if you narrowly define a market as a very specific set of features then everything is a monopoly.

Subway definitely has a monopoly on restaurants that serve submarine sandwiches alongside wraps and baked cookies. The Cartoon Network has a monopoly on cable television networks that exclusively air animated shows. Dave & Buster’s has a monopoly on chain restaurants with multiple card-operated arcade games. Nintendo has a monopoly on family-focused home game consoles with motion controls.

And yet all these companies in fact face huge amounts of fierce competition.

Censored in 15 minutes here on "Free Speech" Techd says:

All in one convenient post. I see "Leigh Beadon" is watching,

so perhaps he’s the Censor, not the lie of "the community" without any Administrator okaying.


Is Section 230 CDA to benefit The Public or corporations?

Obviously The Public by providing speech outlets. You could not find a single politician who’d say it’s to benefit corporations. — Though it’s entirely possible that was and is the intent: a stealthy form of censorship.

Corporations have PR departments with large budgets to get their message out. It’s only individual "natural" persons who need outlets for their views.

Masnick highlights that CDA 230 provides immunity that allows corporations to HOST content without the liability of PUBLISHING it:

"No provider or user of an interactive computer service shall be held liable on account of-

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected…"

Note first that it states causes valid at common law. It’s a requirement for simple decency. Not controversial so far…

Corporatists try to twist CDA into control of OUR PUBLISHING!

How did "Communications Decency Act" get twisted into authorizing complete corporate control of The Public’s new forums?

Because the ultimate purpose of Masnick / EFF blather (both funded by Google) is to sweep on to:

They claim that corporations can use that "restrict access to or availability of" clause to, whenever they wish — even explicitly over YOUR First Amendment Rights — step in and EDIT comments, to the point of becoming THE publisher, even to PREVENT we "natural" persons from publishing on "their" platforms at all!

But those are OUR platforms, The Public’s, NOT theirs.

Corporations are allowed (by The Public) merely to operate the machinery which is to convey The Public’s views. Corporations are NOT to control who gets on, nor WHAT The Public publishes, except under OUR clear common law terms.

But Masnick is for corporations CONTROLLING the speech and outlets of "natural" persons. That’s repeated often here, can’t be mistaken:

"And, I think it’s fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."

https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml

Masnick is not hedging with "lawyers say and I don’t entirely agree", or "that isn’t what I call serving The Public", but STATES FLATLY that corporations have arbitrary control of MAJOR outlets for The Public! He claims that YOUR Constitutional First Amendment Rights in Public Forums are over-arched by what MERE STATUTE lays out!

Such control (by ANY entity) to remove First Amendment Rights from The Public CANNOT be purpose of ANY statute. It’d be null and void because directly UN-Constitutional.

It’s NOT law, only the assertion of corporatists: no court has yet supported what Masnick claims. Corporatists are reaching for the moon without even a step-ladder. It’s simply a trick Fascists are trying to pull.

The provisions over-riding First Amendment Rights ONLY apply if done in "good faith" for The Public’s purposes. A corporation intent on stopping YOUR publishing in favor of its own views CANNOT be "good faith" for The Public, but only de facto tyranny and censorship.

Posting on a "private web-site" is NOT a "privilege",

any more than is reading it. THAT’S THE PURPOSE.

1) What does "private" even mean when it’s published to and invites the entire world?

2) WHO owns a "web-site", anyway? Like a physical business, if you allow The Public in, then you have CEDED some right to "private property". The Public gains, NOT loses. That’s the deal.

3) Where is this "corporation"? Show it to me. And UNDER WHAT PRIVILEGE AND RULES is it even allowed to exist? — By The Public giving it permission, and NOT for the gain of a few, but for PUBLIC USE.

4) Again, mere statute doesn’t over-ride The Public’s Constitutional Right. And no, corporations are NOT persons, do not have rights, they are FICTIONS.

5) The Public’s use is the PURPOSE of any and every web-site. If it allows comments, then it’s governed only by common law terms: no arbitrary exclusion. Two-way communication is the purpose of teh internets.

Yet fanboys promote mega-corporations to rule over themselves!

THAT is the BIG puzzle at Techdirt. What’s in it for fanboys? — They’re happy to throw away their own First Amendment Rights! And another part of the US Constitution, too, which is interesting to compare: they say content producers have NO right to income from or control over copies of their creations, but these pirates then stand up for even larger corporations to control their own SPEECH!

On that point, I’m going to assume they’re just aping Masnick.

And what is the (mere) statute or court decisions that grant corporations the RIGHT to over-rule First Amendment to absolutely and arbitrarily control "platforms" as Masnick says? — There is NONE! It’s just his assertion. — AND EVEN IF WERE, THAT CAN CHANGE.

So WHY assert that corporations operating "platforms" are empowered to control the speech of "natural" persons? That’s EXACT OPPOSITE INTENT OF ALL LAW.

Only Mitt Romney and Mike Masnick will say that corporations are "persons" — Romney got roundly hooted for it, and Masnick only does it here in this little walled garden where he’s cultivated vegetables who don’t question him.

Masnick is a total corporatist. Only the mistaken presumption that he acts in "good faith" and shares YOUR views gives him any credibility. — Take away that presumption for a week, and READ what he writes: he’s very open, particularly, about the claim that "platforms" have an alleged First Amendment Right to arbitrarily control access of we "natural" persons. Masnick believes not only that The Public can be denied access, but, since Google controls search, that it can effectively "hide" speech even on the alternative smaller outlets you’re forced to use. — Masnick uses "hiding" right here to disadvantage dissenters until they give up and quit commenting. He can thereby claim he doesn’t censor.

Corporatists are going for TOTAL control over "natural" persons, period.

Anonymous Coward says:

Re: Re: I'm not Wikipedia. Try there: it's reliable for such.

Please define “common law” in clear and concise terms so that we may all understand what in the blue hell you are talking about.

Common law is known TO YOU. Here you’re just pretending to question while dodging mine.

This piece I’ve long had ready, and it shows EXACTLY that YOU know full well about "common law":


Here you are admitting that after harassing me for months, you KNEW The Law:

https://www.techdirt.com/articles/20180130/19040439126/court-shuts-down-troopers-attempt-to-portray-new-ish-minivans-with-imperfect-drivers-as-justification-traffic-stop.shtml#c868

Me: You dodged both "natural" person and "must" terms.

You: I didn’t address the "natural" "person" shit because SovCits and their lingo should only ever be mocked.

You: As for "must": Technically, no, one does not need a license to drive a car. Then again, if you get caught driving without a license, your ass is in trouble no matter how much you stress that you are an individual who is traveling under the auspices of common law as determined by the Founding Fathers under a gold-fringed American flag.

TECHNICALLY IS THE LAW. YOU BELIEVE EXACTLY WHAT "SOVCITS" DO.

You admit that, even mention "traveling", then without apology try to divert by advocating obedience to the police state and more mockery.

Indeed, you implicitly state that "SovCits" are right to worry about an oppressive police state that violates the letter of the law.

I say that you had the rare, precise, TRUE knowledge used for "outing" a Constitutionalist such as an Undercover Agent for the FBI would. — Or other intelligence agency / secret police.
Last bit of evidence is that you only mocked that charge, didn’t deny it. That’s a tactic indicative of those who might later have to admit a lie.
So, AGAIN, you are DODGING AND LYING.

Oh, and you’re arguing with one you say is an idiot troll. So WHY on that, too?

Stephen T. Stone (profile) says:

Re: Re: Re:

you’re arguing with one you say is an idiot troll. So WHY

Anderson: Who are we?
Iscariots: The necessary evil!
Anderson: Why are we necessary?
Iscariots: To purge the world of evil worse than man!
Anderson: And why are we God’s chosen few, ordained to undertake this unholy task?
Iscariots: Because no one else will!
Anderson: And because it’s fuckin’ fun!

Anonymous Coward says:

Re: Why so interested and monitoring site closely late at night?

"Christenson"? You’re in six minutes after mine. Sheerly for interest of those who’d like to so closely monitor Techdirt, how EXACTLY do you DO that? Hit "refresh" every minute?

Flagged again, re-posting the rant without a concise re-write means only the dedicated are willing to read you!

Oh, so YOU set the standards for writing here. Where exactly is your guideline so I won’t run afoul of it again?

Your "reasons" to "flag" sound specious, trivial at best, and your quickness to comment and attack look a whole lot like ASTRO-TURFING.

Christenson says:

Re: Re: Why so interested and monitoring site closely late at night?

Your imagination is feverish….hitting refresh after finishing a reply to you that took a little while to write isn’t exactly difficult or anything!….

And no, I don’t set the standards here. Maybe Techdirt will reply with what it takes to get hidden by flagging, but I suspect it’s more than one of us getting a bad opinion of your post and raising up and hitting that little red button. I write even this much at significant risk of getting flagged myself; I’m OK if that happens.

Let me re-state the reasons for my flag: You re-posted exactly what you said before. That’s generally a no-no, it irritates almost everyone. I think your post is too long, too. So I’m trying to tell you that you will get further with everything if you calm down and shorten what you have to say, and take the time to say it well.

Anonymous Coward says:

Re: Re: Re: Why so interested and monitoring site closely late at night?

You re-posted exactly what you said before.

That’s my protest.

BUT NOW you’re dodging having stated that you "flagged" it the first time! — WHY? — 5 posts separate it into areas, my preference. Same icon, so anyone can skip over it with two Page Downs? Not permitted by YOU? And where is YOUR authority written, again? Because you seemed to speak with authority, as if you have actual control, and you were rather gleeful at "flagging"! — As if you knew that those would soon be hidden.

Keep writing. ALL responses help me; you’re bound to slip if as I believe, more than an "AC"…

Anonymous Coward says:

What are you afraid of?

I read the whole article, I read all the hundred plus comments a few hours ago. Nothing was censored. And it was completely boring, almost all the comments were by Techdirt supporters or employees lip syncing with the article. Very little debate, very little contention.

Now I look again, and I see whole swaths of comments censored (hidden). As I open them, I read much more interesting content, another point of view that is well expressed and interesting, albeit in direct opposition with the article.

Why are you so afraid of these views? Is it because you know they are persuasive? Do you think hiding them serves your cause, or amplifies your fear for the whole world to see? Are you peeing in your pants?

Personally, I think the whole tyrannical censorship binge on Facebook, Twitter, YouTube and Techdirt is going to backfire. You all look like disingenuous, dishonest, fearful cowards quivering in your censored safe spaces. You look weak and pitiful, and unworthy of holding any power at all. The more you hide the words of others, the more you amplify their credibility.

What are you afraid of?

Anonymous Coward says:

Re: What are you afraid of? -- I'm commenting only to

show I’m currently on another IP address. Of course, that doesn’t mean it isn’t ME on another computer, but it’s NOT. Real comment, real person, really PUZZLED by the fake "free speech" at Techdirt.

You must be new here, AC. Comments survive only if they agree with Techdirt; that’s the key point. — And done in a piece where I’m EXACTLY on topic!

Anonymous Coward says:

Re: Re: What are you afraid of? -- I'm commenting only to

OKAY, AC. There ya go! Now YOUR comment is censored!

I’m one of two who KNOWS that isn’t ME. I raised the possibility, for what little it’s worth here at this den of astro-turfing.

But anyone new (IF there are any!) reading will see what happens here on "free speech" Techdirt.

You CANNOT dissent. Period. Expand this soft, safe, sane corporate control, with its excuse of "for the public good", to all who question The Establishment, and you’ll see that even mild questions will be erased, too dangerous to masnicks!

Anonymous Coward says:

Re: Re: Re: What are you afraid of? -- I'm commenting only to

My point is that Techdirt appears to be characterizing itself as fearful of new ideas, which seems in direct contrast to the “progressive” stance it takes. Hiding posts that are well reasoned, well explained and interesting speaks for itself to any critical reader. It looks like Fearful Tyranny in action. What else could it be?

Stephen T. Stone (profile) says:

Re:

Why are you so afraid of these views?

Nobody here is afraid of those alleged “views”. Everyone here who is not one of our typical trolls is tired of seeing the same anti-Techdirt screed over and over again. The trolls never make any compelling arguments nor engage in on-topic, on-point, good faith debate. They never offer anything insightful or amusing; they offer bile and insults and, in the case of “Hamilton”, empty rhetoric designed to make them look “reasonable” while they ultimately say a lot of nothing.

If you do not want to get flagged, engage with the commenter community here on good terms and in good faith. You can disagree with people here and express views that go against the supposed orthodoxy of Techdirt without being a shithead about it.

Anonymous Coward says:

Re: Re: Re:4 Re:

A rather circular argument, don’t you think? Your bad faith is based on your perception of others’ bad faith, and is therefore reasonable and acceptable. Can’t other people apply this argument just as well? I mean, isn’t this an empty argument without any merit at all?

Anonymous Coward says:

Re: Re: Re:3 Re:

“You can disagree with people here and express views that go against the supposed orthodoxy of Techdirt without being a shithead about it.”

This is your statement, right, Stephen? Do you feel a little silly now? Or do you feel justified and reasonable?

Because you look ridiculous.

Stephen T. Stone (profile) says:

Re: Re: Re:4

This is your statement, right, Stephen?

I figured out this gimmick long ago. It will not work on me.

Do you feel a little silly now?

Not really. You never had any intention of discussing the issues laid out in this article or in the serious comments posted before you showed up to troll—well, not in good faith, anyway. I have no problem with mocking you because you did nothing to prove you deserve being taken seriously.

Anonymous Coward says:

This is a sincere (without wax) question

If you don’t want to hear opinions that you don’t like, why not just make this a “members only” site? You seem to give the appearance of wanting the public to comment here, but then you silence that very same public. Why not just “police” comments in advance by only allowing members to post?

Really, this is a sincere question. You are doing a lot of policing, why not just restrict membership to people you trust and agree with? What is the point of “appearing” to be open to opinion, and then silencing opinion?

Leigh Beadon (profile) says:

Re: This is a sincere (without wax) question

As I suspect you’ve noticed, the vast majority of comments on this post have not been flagged – and a significant portion of them are from anonymous commenters. We like our open comments. Lots of great discussion takes place in them. It has nothing to do with “appearing open” – this is our community and we like it, including the feature that allows for the flagging of posts from rambling, disingenuous trolls.

(And by the way, when you have to repeatedly state that your question is sincere, it’s a rather good tipoff that you know it’s not.)

Mike Masnick (profile) says:

Re: This is a sincere (without wax) question

If you don’t want to hear opinions that you don’t like, why not just make this a “members only” site?

We actually like and appreciate opinions we don’t like. But, what the community clearly does not like are disingenuous trolls.

There are lots of conversations that we have on the site where people disagree in respectful, non-disingenuous ways. And those don’t seem to get flagged.

Comment is free here! -- So is the Censoring! says:

Re: Re: This is a sincere (without wax) question

Masnick, you label just so you can skip the points raised. That’s a key problem in this topic.

I’m not disingenuous, just stating my views. That YOU and your wrong views come in for reference is unavoidable.

My views are based in reality and shared by large numbers of persons, including Congress and the Supreme Court. I point you again to the Sandvig v Sessions decision, from page 7 on, "A. THE INTERNET AS PUBLIC FORUM". You need to read the whole thing, of course, but it’s clear just from these quotes that the Supreme Court holds nearly exactly the view that I state in the posts which you censored. (By the way, I say censored because it’s clearly deliberate and targeted viewpoint discrimination. You have not responded to dozens of questions asking whether an Administrator is involved, so I conclude that one is.)

The discussion is businesses versus "natural" persons.

(page 8) Only last Term, the Supreme Court emphatically declared the Internet a primary location for First Amendment activity: "While in the past there may have been difficulty in identifying the most important places (in a spatial sense) for the exchange of views, today the answer is clear. It is cyberspace…."

(page 9) The Internet "is a forum more in a metaphysical than in a spatial or geographic sense, but the same principles are applicable."

Key point: "the same principles are applicable." — Again, that applies to "natural" persons who in the instant case are accessing web-sites against TOS and corporate wishes, which of course is EXACTLY apposite to using forums and requiring them to be NEUTRAL.

Nothing in The Constitution supports the Corporatist view that "platforms" under CDA 230 are authorized to discriminate against viewpoints rather than on common law terms. And the CDA always requires "good faith", as I show in the already-censored comments.

The bottom-line question for readers: Do YOU want to be SUBJECT to Corporate Control? — If so, just follow Masnick blindly, he’ll lead you into the high-tech prison!

Anonymous Coward says:

Re: Re: Re: This is a sincere (without wax) question

Very interesting legal and Constitutional analysis. I would submit that perhaps Honest Abe Lincoln did quite a good job of summarizing why censorship and legal shenanigans would eventually collapse onto their own infirm foundations. It had to do with America’s ability to think. We’ve been doing it a long time. That’s what the founders counted on more than anything else. As long as we are a thinking culture, and can find a way to connect with each other, we recognize tyranny and lawlessness and will never cede control to those who would employ either. We invented common sense in politics at a time when it was new.

Anonymous Coward says:

Re: Re: Re: This is a sincere (without wax) question

The Internet “is a forum more in a metaphysical than in a spatial or geographic sense, but the same principles are applicable.”

The Internet is a collection of websites, and so long as you can create and control your own website, you are part of that forum. That is, you can set up your own room in the common forum, and if nobody enters, that is not a free speech issue, but rather people deciding they do not want to hear what you have to say. If you have that problem, you are not entitled to barge into somebody else’s room and try to take over their audience.

Anonymous Coward says:

Re: Re: Re:2 This is a sincere (without wax) question

The Internet is a collection of websites, and so long as you can create and control your own website, you are part of that forum.

Most people these days use a hosting company for their website. So, should hosting companies be considered common carriers?

47 USC § 201

(a) It shall be the duty of every common carrier engaged in interstate or foreign communication by wire or radio to furnish such communication service upon reasonable request therefor . . .

47 USC § 202(a)

It shall be unlawful for any common carrier to make any unjust or unreasonable discrimination in charges, practices, classifications, regulations, facilities, or services for or in connection with like communication service, directly or indirectly, by any means or device, or to make or give any undue or unreasonable preference or advantage to any particular person, class of persons, or locality, or to subject any particular person, class of persons, or locality to any undue or unreasonable prejudice or disadvantage.

As the D.C. Circuit explained in Verizon v FCC (2014), those provisions are at the core of the concept of common carriage.

Although the nature and scope of the duties imposed on common carriers have evolved over the last century, see, e.g., Orloff v. FCC (D.C.Cir.2003) (discussing the implications of the relaxation of the tariff-filing requirement), the core of the common law concept of common carriage has remained intact. In National Association of Regulatory Utility Commissioners v. FCC (D.C.Cir.1976) ("NARUC I"), we identified the basic characteristic that distinguishes common carriers from "private" carriers — i.e., entities that are not common carriers — as "[t]he common law requirement of holding oneself out to serve the public indiscriminately."

(Emphasis added; pincites omitted.)

Should internet website hosting companies be classified as common carriers?

How else would you ensure that “you can create and control your own website” ?

Anonymous Coward says:

Re: Re: Re:4 Re:

How else would you ensure that “you can create and control your own website”?

Buying your own servers.

Incidentally, I don’t agree with Supreme Court nominee Kavanaugh’s position that a showing of market power is necessary for common carriage determination. As a matter of Supreme Court precedent, I think he’s wrong. Otoh, he’s the nominee and I’m not.

All the same, here, in this context, I’d agree that the market for internet website hosting providers isn’t that concentrated, and there are plenty of alternative providers. And buying your own servers is a feasible option, as opposed to renting.

But what about Cloudflare?

Unfortunately, these days, DDoS protection is probably essential for a modern website of any prominence. Is Cloudflare an essential facility for practical website hosting?

Anonymous Coward says:

Re: Re: Re:3 This is a sincere (without wax) question

The ISPs are common carriers, although the fight over net neutrality is about them fighting that designation so that they can exercise control over which sites you visit, and/or how much traffic from those sites you can consume. So long as they will provide you with a connection and a fixed IP address, for which you may have to pay extra, you have a means of publishing to that big forum called the Internet.

Note: free speech means freedom to have your say; it does not mean that you can distribute your speech for free. Other than standing on the street corner, distributing speech has always cost the speaker money, unless a publisher decided that what they had to say was worth publishing.

That One Guy (profile) says:

Re: Re: Re:5 Of course he did

The U.S. Court of Appeals for the D.C. Circuit disagrees.

At this point I’m pretty sure if he thought it would be beneficial to them(and he could get away with it) he’d be willing to classify them as planets, so the fact that he says they aren’t isn’t really that surprising.

Adding to the humor the companies themselves have a rather bi-polar view on the subject, where they want to be considered as falling under Title II when it benefits them, but object when it doesn’t.

Anonymous Coward says:

Re: Re: Re:3 Sockpuppetry [was Is it possible to create...]

Wow, you seem upset. Let me explain my reasons.

If you go back in time a little, there was some powerful mathematical writing by a fellow named Goedel or Godel, depending on who cites him. He explained something notable – that mathematical systems that refer to themselves are inherently unsound. Also, that mathematical systems that do not refer to themselves are not powerful enough to express anything interesting.

This same truth was expressed by Bach in his “endlessly rising canon”, and by Escher in artistic form. Self reference is just the most expressive medium in which to express a profundity.

That is, when you express something in literature by using the two person conversational form, you can say much more with many fewer words. Same for Bach, same for Escher, both of which are widely acclaimed for this fundamental expression of truth.

I was using shorthand to get my idea across. Sorry if that got your knickers in a knot.

That One Guy (profile) says:

Re: Re: Re: TD Comment Theater presents: 'Me, Myself and I.'

Aw, you told them how to spot the same person across multiple comments, I was waiting to see if a few more ‘people’ would chime in to talk about how that first person was absolutely right (not that it’s at all hard to spot them even without the gravatar).

Anonymous Coward says:

Re: Re: Re:2 TD Comment Theater presents: 'Me, Myself and I.'

Aw, you told them how to spot the same person across multiple comments…

Oh, it’s easy to spot me across multiple comments even crossing multiple articles— if I’m not the absolutely worst offender in following up on my own comments, then I’m certainly in the top two or three in that regard. You can spot me every time like that.

In my defense, I’ll tell you that it often takes me considerable time to put together a short note with well-researched supporting hyperlinks. How much time should it take to put together a mere comment? And then it often turns out that the short note requires correction, clarification or other follow-up. I apologize for the error. Frequently.

Or I may be closely following an evolving story in the news… In that case, it’s like putting together the pieces of a not-quite-well-fitting puzzle — a puzzle which fucking mutates and grows tendrils as its picture takes shape.

Christenson says:

Re: Re: Re:3 TD Comment Theater presents: 'Me, Myself and I.'

Presenting yourself as an anonymous coward, and replying to yourself intelligently as the puzzle mutates and sends out its tendrils isn’t at all the same thing as trying to deceive the commentariat into thinking you are three different people, only one of whom is a disingenuous (and angry) troll.

And by the way, just from the tone and the timing, I suspected the sock-puppetry.

Anonymous Coward says:

Re: Re: Re:4 TD Comment Theater presents: 'Me, Myself and I.'

Hmm.. Are you calling someone a disingenuous and angry troll? How does that align with “people here are in the habit of thinking you are guilty of what you accuse others of”? Does that make YOU a disingenuous and angry troll? I think so, by your own logic.

No one was trying to deceive anyone, it was a literary form, as I clearly explained. Everyone can see that posts are marked with an icon that identifies the writer by IP address. Just reset your router to change the icon. Easy peasy.

If we could all live by the logic you propose, that ad hominem attacks are shameful to the writer and not their target, this would be a considerably improved forum.

Anonymous Coward says:

Re: Re: Re:5 TD Comment Theater presents: 'Me, Myself and I.'

Are there any Techdirt readers or commenters who would support the idea of establishing some publicly pronounced, accepted norms of behavior? For example, that ad hominem attacks are censored? That is a simple public policy that everyone could understand, and it would substantially improve the quality of the discussions that take place here. No attacking the writer or their history; stick to the topic.

Anyone? Or is silence and submission the only acceptable virtue signaling of group membership for this community? I suspect the latter, and those who do not want to be abused by the Techdirt mob know they cannot safely respond.

Anonymous Coward says:

Re: I think the age of anyone caring about comment policing is past

It is past because the war is over. While Obama was in power, he did his level best to destroy America as it was historically defined, and to create a new America based on the minorities abusing the majorities. He did this openly; he called it “wealth redistribution”.

Up until the election of Donald Trump, the left had all societal tools at their disposal: the news media, the print media, and electronic media, with almost no exceptions. He steered America towards its own demise and replacement with a revolution of the minorities who felt disenfranchised, at the expense of all the rest of us.

This revolution failed. Even with all the tools of power at his disposal, Trump was elected. He was so radical in his implementation, corrupting the DOJ, the FBI, the IRS, schools, Universities, health care, foreign policy, etc., that Trump looked like the best option to the American people. Trump, hairy head and all.

The time to care about policing social media is past. No one cares. From this point forward, the conservative side of American politics will dominate, I predict for a hundred years or more. Now students are being schooled in real American history again. Universities will follow, as will the DOJ, the FBI, the IRS, all of it. It was saved from destruction by the American people, who will never relinquish power again.

No one cares who polices this stuff. The American People have already won the war, and the left will now simply eat its own children. It’s not important.

Witness Donald Trump. Welcome to the Century of Conservative Rule.

Anonymous Coward says:

Re: Re: I think the age of anyone caring about comment policing is past

Perhaps my point was not as clear as it could have been. Allow me please to phrase it another way, perhaps suggesting that American society has developed on a one-way trip from which it will never return.

America has been refining its idea of itself for over 200 years. We have, through our Constitution, incredible flexibility as a society. Not unlimited flexibility, of course, but as long as people consent to new, creative approaches to solving problems by voting for their favorite political candidate, we have wandered far and wide over the ideological map.

What has changed now is the speed at which we can conceive, communicate, challenge and discuss. In the last few years, we have heard every argument from the left, and we have heard every argument from the right. The speed at which these arguments can be made, distributed, endorsed or condemned has been accelerated far beyond what Hamilton and the other Federalist Papers authors could have anticipated.

In Hamilton’s time, it took days or weeks or months for arguments to be presented and either endorsed or rebuffed. Now it takes seconds. We have heard all the important arguments. They fly by in the wink of an eye now. The speed of the opinion pendulum of American Voters is no longer delayed by Pony Express.

And we have decided, as a society, once and for all. This is not the past, this is the future. Communication is faster, resolution is faster, and America chose Donald Trump as an implementer of the American Dream, which he spoke to directly, repeatedly. We all know what we want him to do, and he is doing it. To the majority of the American people, that is how they see the America of today and tomorrow.

Nothing is going to change for the foreseeable future. The same censorship is going to take place, the same anarchists are going to espouse the same positions they have since Soviet Russia; there really is nothing new under the sun. We’ve seen and read it all. Several times.

And Americans have decided. There will be no pendulum swinging back any time soon, not until there are creative ideas to drive it. New ideas that have not yet been presented. Until then, Welcome to the Century of Conservatives.

Your only hope is to stop censoring and start creating. Censoring doesn’t work for long, and in today’s world, by demonstration, it doesn’t work at all.

Anonymous Coward says:

Re: Re: Re:2 I think the age of anyone caring about comment policing is past

Ok, I see I have not yet reached you, so perhaps a third try: Censorship is a tool of the party in power. Censorship is used to maintain power. The American Left is the first to try to use censorship while they are out of power. They are using their corporatist legal analysis to justify censorship, but to what end? They have carefully engineered their platforms to give them that power. They are applying every technological tool of tyranny they can muster, and are well funded to defend their legal position.

How is it going? Well, they are getting the “participation” trophy, and they are giving it to themselves. In electronic media, it is a very one-sided discussion in their favor. In the “real world”, to address your perception of reality and mine, they are losing every important election. Going forward, they will just lose more.

That is, in reality, the left are a bunch of losers, bigly. The rest of us are getting promotions, buying nice things, paying off our debts, and saving for our futures with stock accounts that are up 30%+. Trump’s popularity is nearly double what it was on Election Day. Real enough for you?

Stephen T. Stone (profile) says:

Re: Re: Re:3

In the “real world”, to address your perception of reality and mine, they are losing every important election.

They lost in 2016 because the Republican presidential candidate was a White man who used xenophobia and racism as his primary campaign promises (building the wall, dismantling Obamacare because fuck Obama, only letting the “right” immigrants in, the Muslim travel ban) and Republican voters seized on the chance to slow the so-called “browning of America”.

The rest of us are getting promotions, buying nice things, paying off our debts, and saving for our futures with stock accounts that are up 30%+

Define “rest of us”, because I know people who live paycheck-to-paycheck hoping that nothing major happens to drain what little money they have in their checking account. I doubt they are seeing the success you claim they are.

Tim R says:

I don’t know if this has been brought up yet, but I had the opportunity to peruse the fine piece of legalese that is the InfoWars Terms Of Service, and for some reason, the one part that jumped out at me was in section 14 (already way too many sections for a document that should be accessible and interpretable by a layperson):

14. BREACH, REVOCATION AND CANCELLATION.
14.1. In the event that you breach any provision of this Agreement, you agree that we may immediately terminate your use of our Services and System.
14.2. In the event such a breach occurs by you, we may post on the Website that you have violated our terms and conditions of service.

So, according to InfoWars, they reserve the right to name and shame you on their web site for violating the Terms of Service. While I’m sure that’s perfectly legal, there just seems to be a juvenile element to it, especially considering that they feel the need to explicitly point it out.

Anonymous Coward says:

Re: Any breach of service must be due to common law cause.

That’s actually the key point of the topic: Masnick says corporations have absolute and arbitrary power, while I and “conservatives” (I bet Alex Jones views it this way too) believe that only common law causes — which well-known cases up to the Supreme Court have defined — apply, such as explicitly advocating violence against a person, not just saying it can be used at some point.

Corporatist Masnick is explicitly turning even the First Amendment, the prime guarantee of free speech, into a means for corporations to regulate speech that he doesn’t like. It’s not going to please you if you’re suppressed by a corporation rather than the gov’t.

Read my reply to Masnick above, which references recent Supreme Court decision strongly implying that internet forums are the new public forums, to be protected against arbitrary denial of service.

Anonymous Coward says:

Re: Re: Any breach of service must be due to common law cause.

What you explain sounds intuitively right to me. I believe a common reader that comes to Techdirt believes he is participating in a public forum being held in a public square, and that part of Techdirt’s strategy is to promote that appearance. The unwritten and unsaid censorship policy here is an abomination to any student of history, as Mason Wheeler rightly pointed out at the top. It all seems strange to me that a party out of power wants to censor and malign potential new members. You would think they would be welcoming anyone they could find, they need the numbers, and doing their best to promote new ideas, not bury them. They seem to be stuck in the past, wasting time trying to impeach a successful president, and howling at the moon.

Anonymous Coward says:

Re: Re: Re: Any breach of service must be due to common law cause.

I believe a common reader that comes to Techdirt believes he is participating in a public forum being held in a public square, and that part of Techdirt’s strategy is to promote that appearance.

There’s an interesting suit that has been allowed to go forward against Twitter (I believe) which is based on exactly that being fraud. Inviting people in, saying you’re for free speech even when it’s difficult to bear, that "comment is open to all" — and here on Techdirt there are ZERO commenting guidelines, nor any warning that comments can be edited (which "hiding" is: adding an editorial comment that they’re too dangerous to view without a warning) — and then discriminating against viewpoints: that’s civil FRAUD.

By the way, Masnick trots out the continuing LIE that it’s all the mysterious "community" with an opaque "voting system", and won’t state whether an Administrator is ever making a decision. The observable fact right above is that "Stone" gets to make vile senseless one-liner ad hominem comments, and jeer that he can do more, while my on-topic lengthy comments, into which I put time and thought, are censored.

Anonymous Coward says:

Re: Re: Re:2 Any breach of service must be due to common law cause.

PS: with vile fanboys (who may well be astro-turfing; just look how “Christenson” above ardently defends the site and appears to speak with authority and foreknowledge until I make him backtrack), Techdirt is usually successful at driving away serious people. That’s its key weapon. I advise you to leave, as you won’t change minds here. I stay only for personal fun and to warn others that Techdirt is not all that it appears to be, and that what you don’t see in topics and views is far more important…

Anonymous Coward says:

Re: Re: Re:3 Re:

I’m totally with you on your “fraud” analysis, I see it the same way. They want to give the appearance of a legitimate news and opinion site, but they literally censor and attack any “invaliders” that hold different opinions than their own. They use a lot of fake IDs to make a single voice appear as many, and though they have openly admitted that some posts are written by paid employees, they steadfastly refuse to identify them. It’s pretty much bullshit from pig to post. On the other hand, sometimes it is entertaining to help them work through their reasoning process and put it on public display. Stephen is especially fun. I think he is more clever than he appears, and is actually enjoying the dance as much as we are.

Christenson says:

Re: Re: Re:4 Re:

Look in a mirror…people here are in the habit of thinking you are guilty of whatever you accuse others of.

And Terms of Service?? It’s a sophisticated crowd here, encouraged by “Funniest/Most Insightful Comments of the Week”, so, sorry if you can’t find explicit terms of service.

The terms amount to

“Don’t be an ass”.

Christenson says:

Re: Re: Re:6 Re:

You sure you want to do this???? I assure you there are more horse’s asses in the world than horses!

It’s always good to look in the mirror and ask:
“Am I being an ass???” “How could I be less of an ass?”

And BTW, I stole that rule from the very same website with “auto-nanny”, which shadow-banned 7 words but not nigger or faggot or queer! lol.

Jerry (profile) says:

shadowbanning etc.

shadowbanning is a problem because it’s covert and not public.

AS for definitions:
Hate Speech = SPEECH I HATE!
Fake News = FALSEHOODS THAT DISAGREE WITH MY RIGHTEOUS BELIEFS!

Good, thought-provoking article. One major problem is the tyranny of the masses: Group Think has led to all sorts of weirdness, e.g. cultural appropriation. IMHO the idiots should be free and untrammeled when they make it evident that they are idiots!

No compulsion in the world is stronger than the urge to edit someone else’s document.
H. G. Wells

Rezzo Lute says:

In the censored comments, anyone see outside common law?

Of course not. So why censored?

Because Masnick doesn’t want actual debate, only to spread the poisonous corporatist propaganda that we "natural" persons MUST accept this new method of censorship.

The real KEY is not a gov’t / corporate divide, though; it’s that BOTH gov’t and corporate methods serve The Rich in their aim for total control.

Again, you are eventually not going to like corporatized censorship, even though you DO like it currently targeting Alex Jones. The system is made to control you, too. It’ll always need new targets, so if it suppresses "conservatives" now, then you’re next.

Corporations have no ideology except power.

Anonymous Coward says:

Re: In the censored comments, anyone see outside common law?

Yes of course you are correct in saying that corporations have no ideology except power, specifically, in the form of money. That is the purpose of their existence, and that’s ok as long as there are many corporations contending for that power, each outdoing the other in the products it offers, with a discerning public free to choose between them.

Go ahead and target Alex Jones – he has gotten more publicity in the last few days than he ever got before. Have at it, maybe he will be elected as the next president if they censor him enough – witness Hitler, as Mason Wheeler aptly pointed out at the top of this section.

Americans are capable of sorting this all out for themselves, as they publicly demonstrated with the election of Donald Trump. The left’s control of YouTube, Facebook, Twitter and Techdirt did them no good at all during the last election, and I guarantee you that their power will continue to diminish.

Nearly all men can withstand adversity, but if you want to test a man’s character, give him power. Witness the tyranny of Obama, Zuckerberg, Masnick and others, and compare them to Trump. Trump has more character than all those leftist tyrants combined, by demonstration. Don’t expect Masnick to change, and don’t try to take away what incredibly feeble power he wields. He is performing a public service by demonstrating the results of a tyrant in a teapot. A very small teapot.

Anonymous Coward says:

Re: Re: In the censored comments, anyone see outside common law?

Mr. Masnick, if you would, regale us with a testament to your own character. Tell us how you wield Techdirt Editorial Power for the betterment of your fellow man, and not just for your own selfish gain and that of the few who bow to your power. Explain your philosophy of censorship and how it makes the world a better place for others, and not just for yourself and your associated retards.

Defend your public validation and endorsement of convicted traitors, and explain how you are helping to build a better world for everyone, not just the criminal few who set out to damage America. It is difficult to comprehend that you are as shallow, conniving, fearful and hateful of others as you seem; probably you have a reason for your rampant censorship of the educated while promoting the hateful and disgusting rhetoric of the ignorant and retarded.

Give us a few words, Mr. Masnick, and bring yourself into a perspective that normal people can understand. You are a talented writer, express your character as Lincoln suggested. You certainly have the power to do it, there is no denying that by anyone, this is your site, you can do whatever you want, legally, as you laid out many times. You have the ultimate corporate power here. Expose your character, we are all interested.

Anonymous Coward says:

Re: Re: Re: In the censored comments, anyone see outside common law?

I understand, Mr. Masnick, in this forum shunning is considered an appropriate alternative to public discussion. How long have you been at this? Have you ever wondered why it is becoming harder and harder to finance your enterprise? Have you planned your retirement yet? I don’t think you are going to make that money here. And if shiva is successful, your self-made future is bleak indeed.

Anonymous Coward says:

Re: Re: Re:2 In the censored comments, anyone see outside common law?

in this forum shunning is considered an appropriate alternative to public discussion.

Walking away from someone spouting nonsense or hatred has always been part of public discussion. If people are turning their backs to you, it is your problem for driving them away.

NaBUru38 (profile) says:

Dear Mike, I share some of your concerns.

Hate speech is a vague concept. Censoring it would be a daunting task. There would be false negatives and false positives, which would cause severe problems.

I also think that online platforms have too much power in controlling speech. Google, Facebook and Twitter can prevent you from reaching the audience. Microsoft and Amazon can shut down your website.

If companies must comply with government censorship, it’s totalitarian. And if they censor of their own free will, it’s terrible too.

Yet with no censorship at all, liars are spreading their preaching more easily than ever. Finding solutions isn’t easy, especially when one half claims censorship is too hard and the other half that it’s too soft.

Anonymous Coward says:

Re: Re:

Keep in mind, amidst all this chaos and these difficult choices, we have elected President Donald J. Trump. What does that tell you? Every internet company I can think of (YouTube, Facebook, Twitter, Techdirt and others) censors conservative speech and promotes socialist speech. And the result? Look around! America is more conservative than ever!

Unlike Globalist Socialist Zombies, Americans, at their foundation, make their own decisions. Americans understand tyrannical attempts to control them. The more tyrannical the platform, the more Americans HATE them and elect their nemesis.

100 years of Conservative Rule. That’s my prediction. We have entered the ideological equivalent of WW I where everyone is dug into their trenches, and the line does not move. Short of a nuclear attack (which is in Trump’s full control) nothing is going to change for the foreseeable future.

Censorship in America is counterbalanced by American beliefs and ideology. Anyone who thinks otherwise is no longer in power anyway, so who cares? When I was in grade school, we had lessons in critical thinking. They never really wear off.

Except for the retards, that is.

Stephen T. Stone (profile) says:

Re: Re:

Hate speech is a vague concept. Censoring it would be a daunting task. There would be false negatives and false positives, which would cause severe problems.

This is why American law does not recognize “hate speech” as a form of unprotected/illegal speech. No court has yet come up with a working definition of the term that could strike down as narrow a selection of speech as possible while leaving room for, say, social commentary or satire.

I also think that online platforms have too much power in controlling speech.

Depends on what you mean by control.

Google, Facebook and Twitter can prevent you from reaching the audience.

If you want to discuss concerns about how Twitter, Facebook, etc. have “too much” cultural influence and what not? Go right ahead. But please avoid implying that those companies owe you an audience for your speech. You are not legally entitled to an audience of any size.

Microsoft and Amazon can shut down your website.

That is a legitimate and pressing concern, as are situations such as Cloudflare’s refusal to serve Stormfront and the ensuing fight over Stormfront’s domain names. (Even the Cloudflare CEO said as much.) Being unable to reach an audience on a specific service is not something I worry about. Being unable to reach any audience at all because someone said I could not? That should worry the shit out of everybody, regardless of whether that someone is a government agent or a corporation CEO.

(And yes, while I would typically refer to something like Amazon kicking your site off its service for violating the terms of service as “moderation”, it can also be censorship. Nuance!)

with no censorship at all, liars are spreading their preaching more easily than ever. Finding solutions isn’t easy, especially when one half claims censorship is too hard and the other half that it’s too soft.

There is no ultimate “solution”—no magic problem-solver, no algorithm, nothing. Any moderation of a given platform is bound to run into mistakes and make bad calls. Automated tools are only as good as the people who make them, and human beings are irrational creatures. Those tools—and sometimes those people—might ignore the context of an “offensive” statement and smack down someone who, for example, uses “queer” as a self-identifying descriptor instead of an anti-gay slur.

We will always make mistakes. The goal, then, is to improve the systems we have so we can do better in the future. Better rules, better tools, better understandings of context—all things we can put into practice so we can make those systems better. We can never make a system “perfect”, but as an old aphorism says: “Perfect is the enemy of good.”

Christenson says:

Re: Re: Re:

I take the following as true:
Alex Jones and Infowars, through Facebook and Twitter, have done some serious harm that is worth addressing. The Atlantic has reported on racial violence in India that is driven almost entirely by Facebook, and on racial tension over a screw-up in a restaurant where the restaurant and the customer didn’t speak the same language.

Censoring that idiot Alex Jones only Streisands him and validates him… just as it did Mr Hitler before he started the Second World War. Censoring screeds might have been a short-term victory, but it is a long-term failure. It’s similar with those we down-moderate here in the comments.

Many feel that Twitter and Facebook, being of great influence, free, and open to everyone who is not “ill-behaved”, are in fact public accommodations that should not be arbitrarily censoring users. Cloudflare’s CEO said it well when he dropped Stormfront, noting that while he had every right to reject them as a customer because he woke up in a bad mood, that didn’t make it a good thing. Other commenters have noted that “nondiscriminatory”, as originally intended, did not involve a protected class… it meant everyone. Otherwise it gave, for example, the telegraph operator or railroad (or city garbage collector) inordinate power.

Moderators disagree over the exact same content, and moderating content is very context sensitive.

The normal speed of the internet is a factor in all of this.

******
From a prophetic standpoint, congress is jumping up and down saying “Do Something!…Do Something!” in a panic. Something will change, the question is how to not end up with bigger problems than we started with.

From the evidence above, which is on reasonably firm ground, I take the following:

The difference between Techdirt and Facebook is one of scale. With user generated content, both decide which content will be most easily seen; that is speech by the website and not by the users, and currently protected by CDA 230.

One possible way to legally distinguish Techdirt and Facebook is the degree to which Facebook users live in a “filter bubble”. Essentially everyone sees the same Techdirt. No two people see the same Facebook or Twitter, and you never see the same Facebook twice. The choices on Facebook are out of the end-user’s control.

I seriously don’t think removing Infowars (or any other post) actually helps with the underlying problems, which I think involve a sense of panic, a degree of credulity, and a lack of reflection. Recall that the US has recently made gay marriage a thing; while the liberals said “it’s about time!”, the conservatives felt like it was legislated immorality.

Perhaps speech in the opposite direction might make more sense.

I have an acquaintance who has been going on about #qanon for months, and another who thinks Antifa is an actual organized group. Not sure how to reach these folks, but prying the garbage from their cold, dead hands is just gonna make them grab it harder.

Stephen T. Stone (profile) says:

Re: Re: Re: Re:

The tricky thing about the Alex Jones situation lies largely with the Outrage Machine mentality of most social interaction networks: Expressing anger about something is both easier and more eyeball-grabbing than expressing positivity. (The exception is cute animal videos.) Jones, and the current POTUS to a similar-yet-larger extent, feed the Outrage Machine by saying a bunch of stupid bullshit and letting human nature take its course.

Now that Jones is gone from most of the high-profile SINs, his feeding of the Outrage Machine is stalled. Yes, he still says dumb bullshit on both Twitter and his own platform, but his reach and profile will most likely suffer as a result. Not all Streisanding leads to positive results for the person who would nominally benefit from one. Milo Yiannopolous went from being a high-profile name in the so-called “alt-right” movement to being a nothingburger practically overnight; that happened because he was deplatformed by Twitter for being a rulebreaking ass…and by several outlets to which he was contributing, although that was for his appearing to defend pedophilia. His profile dropped so hard and so fast that he is barely even a presence on either side of the Outrage Machine.

In regards to the filter bubble talk: Yeah, I got nothin’ on that one. A big problem with hyper-partisan news sites and news networks is the presenting of a specific set of facts under a specific lens that paints “the other side” as villains in a battle for the soul of humanity or the country or whatever. Fox News does this by presenting Democrats as monsters trying to get in the way of Republicans who absolutely want to do what is best for all Americans. MSNBC does much the same to Republicans, albeit to a somewhat lesser extent. We have to address both the hyper-partisanship of journalism and the filter bubble issue simultaneously, or else we will end up right back where we started…or worse.

(P.S. ~ That description of the difference between Techdirt and Facebook is outstanding.)

Anonymous Coward says:

Re: Re: Re:2 Re:

I think the best answer to these problems is to provide the tools for people to filter their own feeds. I would start by allowing them to have various groups for managing their view into the system:

1) Friends, where they can apply little or no filtering;
2) Acquaintances, where they can apply more selective and restrictive filters;
3) Commercial/business relationships where applicable, including companies that the person buys from;
4) Everybody else, where they can be more selective, including blocking identified sources.

The site should not be applying any filters other than blocking outright spam and phishing attacks.

Making the users responsible for their own moderation eliminates the risk of activists abusing the moderation system to try and impose their morals and politics on society.
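
(A minimal sketch of what such user-owned filter groups might look like, assuming a simple post model and a site-supplied spam/phishing score; the tier names, thresholds, and keyword muting below are illustrative assumptions, not any real platform’s API.)

from dataclasses import dataclass, field
from enum import Enum

class Tier(Enum):
    FRIEND = "friend"                # little or no filtering
    ACQUAINTANCE = "acquaintance"    # more selective filtering
    COMMERCIAL = "commercial"        # businesses the user deals with
    EVERYONE_ELSE = "everyone_else"  # most restrictive; supports outright blocking

@dataclass
class Post:
    author: str
    text: str
    spam_score: float  # 0.0-1.0, from whatever spam/phishing detector the site runs

@dataclass
class UserFilters:
    """Filter settings owned by the user; the site itself only blocks spam/phishing."""
    tiers: dict = field(default_factory=dict)        # author -> Tier
    blocked: set = field(default_factory=set)        # authors this user blocks outright
    muted_keywords: set = field(default_factory=set) # user's own stricter muting
    spam_threshold: float = 0.9                      # the only site-level filter

    def wants(self, post: Post) -> bool:
        if post.spam_score >= self.spam_threshold:   # site-wide spam/phishing block
            return False
        if post.author in self.blocked:              # user's own block list
            return False
        tier = self.tiers.get(post.author, Tier.EVERYONE_ELSE)
        if tier is Tier.FRIEND:
            return True                              # friends: little or no filtering
        # everyone further out: the user's stricter keyword muting applies
        return not any(word in post.text.lower() for word in self.muted_keywords)

# Usage: the feed is assembled per user, entirely from that user's own settings.
me = UserFilters(tiers={"alice": Tier.FRIEND, "acme_store": Tier.COMMERCIAL})
me.blocked.add("spammy_sam")
me.muted_keywords.add("outrage")
posts = [Post("alice", "lunch?", 0.1), Post("randomer", "daily outrage thread", 0.2)]
feed = [p for p in posts if me.wants(p)]             # only alice's post survives

The point of the sketch is simply that every filtering decision after the spam check lives in the user’s own settings, so two users with different settings see different feeds from the same firehose, and nobody can change anyone’s feed but their own.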

Stephen T. Stone (profile) says:

Re: Re: Re:3

I agree for the most part; any service should have tools in place to help users mitigate abusive behavior, and users should be better curators of their own experience. I also believe the service operators should always step in to moderate and guide the community by getting rid of toxic elements and applying consequences to bad behavior. If that means banning people on the basis of speech, so be it—Twitter, Facebook, etc. still have that right regardless of their size and cultural influence.

Anonymous Coward says:

Re: Re: Re:4 Re:

The problem is that once you get to the size of Facebook and Twitter it is no longer possible to manually moderate the site. One advanced option to deal with this is perhaps to allow people to set up and join moderated groups on the site. Such groups may be closed or open, with the latter effectively being people maintaining a filter on content and/or user id. Also, the site could possibly offer a blacklist which users can turn on or off at a global level, or at an individual id level.

That is, always ensure that people can choose whether to see a moderated site or carry out their own moderation. I always feel uncomfortable about driving those with extremist views underground, as that only feeds their extremism, while leaving them to operate in public view does allow the less extreme to try to modify their views, even if most of society decides to turn its back on them and leave them to their little corner of a site.
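
(A rough sketch of that opt-in model, assuming nothing about any real site’s data model: shared blocklists, maintained by the site or by a moderated group’s volunteers, exist, but each user decides whether to subscribe to them, can switch a list off globally, or can carve out per-id exceptions. All names here are hypothetical.)

class SharedBlocklist:
    """A blocklist maintained by the site or by a moderated group's volunteers."""
    def __init__(self, name, blocked_ids):
        self.name = name
        self.blocked_ids = set(blocked_ids)

class UserModeration:
    """Per-user moderation choices layered on top of any shared lists."""
    def __init__(self):
        self.subscribed = {}       # list name -> SharedBlocklist the user opted into
        self.exceptions = set()    # ids the user wants to see despite subscribed lists
        self.personal_blocks = set()

    def subscribe(self, blocklist):
        self.subscribed[blocklist.name] = blocklist

    def unsubscribe(self, name):
        self.subscribed.pop(name, None)   # the global on/off switch for that list

    def allow_anyway(self, author_id):
        self.exceptions.add(author_id)    # per-id override of any subscribed list

    def hides(self, author_id):
        if author_id in self.personal_blocks:
            return True
        if author_id in self.exceptions:
            return False
        return any(author_id in bl.blocked_ids for bl in self.subscribed.values())

# Usage: opt in to a community blocklist, but keep one per-id exception.
community_list = SharedBlocklist("community-spam", {"spammer1", "spammer2"})
mine = UserModeration()
mine.subscribe(community_list)
mine.allow_anyway("spammer2")
visible = [a for a in ["spammer1", "spammer2", "alice"] if not mine.hides(a)]
# visible == ["spammer2", "alice"]

The design choice being illustrated: the shared lists do the heavy lifting that individual users cannot, while the final decision about whether any list applies stays with each user.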

Stephen T. Stone (profile) says:

Re: Re: Re:5

That is, always ensure that people can choose whether to see a moderated site or carry out their own moderation. I always feel uncomfortable about driving those with extremist views underground, as that only feeds their extremism, while leaving them to operate in public view does allow the less extreme to modify their views, even if most of society decides to turn its back on them and leave them to their little corner of a site.

On one hand, I understand the logic behind this thought—“better to shine a light on evil than to let it fester in the darkness”, or something like that. In an ideal world, I would eagerly agree.

On the other hand, this is not an ideal world. By treating racist, sexist, and anti-LGBT speech and ideas (amongst other things) as merely “something to debate”, platforms give those ideas credibility and help move the Overton Window a little closer to normalizing those ideas. Some ideas have no place in the public sphere; no privately-owned platform has an obligation to give those ideas an audience.

Yes, at a certain size, a platform like Twitter can no longer effectively moderate the service through people alone. The solution should not be to force free speech absolutism upon Twitter at the expense of making the user 100% responsible for their Twitter experience. (As I said before, users have a responsibility to curate their experience—but the service has a responsibility to watch out for and punish abuse carried out by bad actors.) A better solution would be to never let a single platform get that fucking big in the first place. “Protocols over platforms” and decentralization and all that jazz has a point: We do not need to be connected with the entire world just to communicate with friends and find new people.

(Hell, for my money, getting rid of “instant gratification” social interaction networks like Twitter and Facebook by way of abandonment would be the best thing possible. Those services thrive on a reactionary state of mind, whereas all the old, “slow” forms of Internet communication like email and forum replies and blog posts do not require us to have our eyeballs glued to our timelines for the latest outrage…or the latest cat video.)

Anonymous Coward says:

Re: Re: Re:6 Re:

One of the reasons that the platforms have grown so big is network effects. One of the problems is that people will not do their own curation, but rather expect others to do it for them. There is also the problem of a very vocal minority who will go out of their way to find whatever they consider offensive and try to get it taken down. Such people will not do their own curation, but why should they be allowed to force curation onto others?

Christenson says:

Re: Re: Re:2 Re:

Some thoughts and issues:
0) One more way in which Techdirt differs: I can expect a ban, swift condemnation, and other action if I suggest someone do something bad to our favorite troll, OOTB. In fact, if a mob were to take up that banner, I would expect Techdirt to find itself in the position of the ACLU: defending OOTB, whom Techdirt likely detests, against his attackers in various ways.
1) Most users don’t even know what a “filter bubble” is…controlling your own experience is a lot of work!
2) If everyone controls their own filter, then there’s no way to stop the next cheetoh from feeding the outrage machine.
3) The hyper-partisanship was architected by Roger Ailes. Without a doubt, prior to Fox News, there was some liberal bias in the media because being afraid of that strange-looking, strange-talking person just doing their job is like a routine car accident: it’s just not dramatic enough to sell news. Bad things happening, typically to underdogs? That sells news through the outrage machine.

Quoting The Atlantic:
https://www.theatlantic.com/politics/archive/2018/08/the-battle-that-erupted-in-charlottesville-is-far-from-over/567167/
White nationalists win by activating white panic, by frightening a sufficient number of white people into believing that their safety and livelihoods can only be protected by defining American citizenship in racial terms, and by convincing them that American politics is a zero-sum game in which white people only win when people of color lose. While this dynamic has always been present in American politics, it has been decades since the White House has been occupied by a president who so visibly delights in exploiting it, aided by a right-wing media infrastructure that has come to see it as a ratings strategy.

I think we need to deal with that panic. These people are suffering from future shock. Most of them have no head for complex, nuanced reasoning. Hitler got his foothold when Germany was being made poor by England, and likewise these white nationalists are getting their footholds as the billionaires on wall street are making the rest of us poorer.

By the way, there’s a fire burning and getting close: climate change.

whodunnit (user link) says:

shadowbanning?

“other than shadowbanning, which some people pretend is somehow evil and unfair”

Um, it IS evil and unfair, because you’re deceiving people. They go about their business posting, uploading, etc., not realizing that fewer people (or none at all) are seeing them. At least they used to; at this point most people can figure out in relatively short order whether they’ve been shadowbanned. Morality aside, it creates yet another layer of waste in the economy. Be forthright and timely with people about their behavior and they immediately have to adjust or put their energy towards other things altogether, whereas shadowbanning encourages people to keep pounding away at the keyboard for no reason.

Loupgarous (profile) says:

Mostly on target, "decentralization" an epic fail

In “Setting the Record Straight on Shadow Banning” https://blog.twitter.com/official/en_us/topics/company/2018/Setting-the-record-straight-on-shadow-banning.html, two lead Twitter staffers deny that Twitter shadow bans, which places them at odds with how Wikipedia defines the practice.

The staffers deny shadow banning Twitterers, but admit further in their post:

“We do not shadow ban. You are always able to see the tweets from accounts you follow (although you may have to do more work to find them, like go directly to their profile).”

Speaking in plain language: people who don’t follow you can’t see your Tweets at all, and even your followers may have to do an unusual amount of work to see them.

That’s shadow banning by the Wikipedia definition:
“By making a user’s contributions invisible or less prominent to other members of the service, the hope may be that in the absence of reactions to their comments, the problematic or otherwise out-of-favour user will become bored or frustrated and leave the site”

How does Twitter decide to do this?

“Here are some of the signals we use to determine bad-faith actors:

Specific account properties that indicate authenticity (e.g. whether you have a confirmed email address, how recently your account was created, whether you uploaded a profile image, etc)

What actions you take on Twitter (e.g. who you follow, who you retweet, etc)

How other accounts interact with you (e.g. who mutes you, who follows you, who retweets you, who blocks you, etc)”

Let’s focus on “who mutes you, who follows you, who retweets you, who blocks you, etc” for a moment.

“Decentralizing policy enforcement” in this case puts at least some of the power to shadow ban Twitter accounts in the hands of “who mutes you, who follows you, who retweets you, who blocks you, etc”.

Part of the marked discrepancy between conservatives and liberals on Twitter’s shadow ban list may be that liberal political organizations famously got an early start using social media for organizing of all sorts.

Given that the discrepancy exists, that few leftists are so banned, and that the criteria for shadow banning have been set forth by Twitter, it’s not unreasonable to suspect that the algorithms Twitter uses to decide who gets a shadow ban are being gamed. You wouldn’t have to have many followers to block or mute everyone on a list you draw up in order to get the algorithm to flag anyone you don’t like.

Such lists exist. I’m on two of them, curated by Marethyu (@scathachultor), helpfully labeled “Scum” and “Propagandists/Liars”. Given his apparent social skills, it’s perhaps unsurprising Marethyu only has 92 followers.

The question, I guess, is how many blocks or mutes do you have to have to get shadow banned on Twitter? Twitter won’t say.
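
To see why that worry is at least plausible, here is a deliberately naive derank rule in Python. This is emphatically not Twitter’s actual ranking code (which is not public); it only illustrates how unweighted block/mute counts could be gamed by a small coordinated list, and how weighting each blocker by its own standing would blunt that attack:

# Deliberately naive toy heuristic. It is NOT Twitter's real algorithm, which
# is not public; it only shows why raw block/mute counts are easy to game:
# a small coordinated group can push any target over the threshold.

DERANK_THRESHOLD = 50  # invented number, purely for illustration


def is_deranked(block_events):
    """block_events: list of account IDs that blocked or muted the target."""
    return len(set(block_events)) >= DERANK_THRESHOLD


def weighted_derank(block_events, reputation, threshold=50.0):
    """Mitigation sketch: weight each blocker by its own standing, so a burst
    of blocks from low-reputation or brand-new accounts counts for less."""
    return sum(reputation.get(acct, 0.1) for acct in set(block_events)) >= threshold


if __name__ == "__main__":
    # 60 accounts working from a shared list can "shadow ban" a target under
    # the naive rule, no matter how the wider audience feels.
    coordinated = [f"list_member_{i}" for i in range(60)]
    print(is_deranked(coordinated))                    # True

    low_rep = {acct: 0.1 for acct in coordinated}
    print(weighted_derank(coordinated, low_rep))       # False: 60 * 0.1 = 6.0 < 50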

czrpb says:

The Shibboleth of Choice

RE: “As for me, … we need to move to a world of protocols instead of platforms, in which transparency rules and (importantly) control is passed down away from the centralized service to the end users. … Ideally, Facebook (and others) should open up so that third party tools can provide their own experiences — and then each person could *choose* the service or filtering setup that they want.”

Did I miss where the dangers associated with this are either (1) mitigated by something, or (2) analyzed to be less dangerous than the current state?

(Basically, did you address the emergent segregation that can result from “choice”? https://en.wikipedia.org/wiki/Thomas_Schelling)
