Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions entail. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Knitting Community Ravelry Bans All Talk Supporting President Trump (2019)

from the knitting-politics dept

Summary: When people think of content moderation and political debates, they may not think about knitting. However, the knitting community at the online site Ravelry has become a fascinating place to explore content moderation questions. This actually goes back many years, as Ravelry's content moderation practices (handled by dozens of volunteer moderators) were studied for a PhD dissertation by Sheila Saden Pisa that was published in 2013, entitled "In search of a practice: large scale moderation in a massive online community."

Knitting and Ravelry have also been quite political at times. All the way back in 2009, a woman who was kicked off of Ravelry wrote a blog post saying she believed it was because of her conservative political views. After the election of Donald Trump in 2016, Ravelry was where the initial plans for the now famous "pussyhats" (for the Women's March protesting Trump's presidency) were first released and shared. Ioana Literat and Sandra Markus studied Ravelry's role in online participation, civic engagement, and "craftivism."

Still, it caught many people by surprise, in late 2019, when Ravelry declared a new policy, saying that it would no longer allow any posts supporting Donald Trump. From the announcement:

We are banning support of Donald Trump and his administration on Ravelry.

This includes support in the form of forum posts, projects, patterns, profiles, and all other content. Note that we will not destroy project notebook data. If a project needs to be removed from the site, we will make sure that you have access to your data. Even if you are permanently banned from Ravelry, you will still be able to access any patterns that you purchased. Also, we will make sure that you receive a copy of your data.

We cannot provide a space that is inclusive of all and also allow support for open white supremacy. Support of the Trump administration is undeniably support for white supremacy.

The Community Guidelines have been updated with the following language: "Note that support of President Trump, his administration, or individual policies that harm marginalized groups, all constitute hate speech."

The company noted that this was not a statement of support for other candidates, nor was it saying that it would ban people who (outside of Ravelry) supported Trump. It also made clear that it was not banning other political topics or statements in support of other candidates. Instead, it said: "We are definitely not banning conservative politics. Hate groups and intolerance are different from other types of political positions." The decision drew a great deal of attention, with many supporters and detractors.

Decisions to be made by Ravelry:

  • How do you decide when one politician's positions are so problematic to your community that you ban any support of that candidate?
  • How will this policy be enforced? Should it apply to earlier statements of support or just future ones?
  • How will attempts to get around the ban (such as with hints or euphemisms) be dealt with?
  • Can volunteer moderators be supporters of Trump?

Questions and policy implications to consider:

  • What are the pros & cons of banning support for Trump vs. banning all talk of politics?
  • Would other approaches — such as moving all political talk, or all talk about Trump, to a specific area — work as effectively?
  • If, as has been suggested in some Section 230 reform bills, the laws change to require "political neutrality" in content moderation, how will Ravelry's moderation practices be impacted?

Resolution: After Donald Trump left the White House on January 20, 2021, Ravelry reiterated that its policy remained the same, even though Trump was no longer president. A year and a half after Ravelry's decision, the New Yorker published a long and detailed article about the decision to ban Trump support on the site and how it is going, entitled "How Politics Tested Ravelry and the Crafting Community."

On the day of the ban, Kim Denise, one of the volunteer moderators, told me, "I was, like, I'm so psyched. I'm so proud to be part of Ravelry." Then the ban happened. "And it was, like, Oh, my God. I wish we'd thought this through." Right-wing trolls began signing up for Ravelry accounts and spamming threads with anti-Ravelry or pro-Trump sentiment. Denise described it as "hordes of screaming people lining up to sling feces at us. … It was terrible." Users scurried to help moderators by flagging posts for deletion. They recruited a retired moderator to help deal with trolls. Within a couple of months, most of the activity generated by the Trump ban had subsided. Conservative users banded together, in a movement hashtagged #RavelryExodus, deleting their accounts and shifting to other platforms to sell patterns.

The company?s founder also admitted that after the ban was announced, she realized the difficulty in figuring out the exact boundaries of enforcement:

Jessica admitted that Ravelry has struggled to pinpoint exactly what constitutes inappropriate content. "Some of this stuff is so nuanced," she said. "Think about what tweet got Trump banned. It was not about attending the Inauguration." She went on, "We went through some pretty crazy rabbit holes: 'O.K., this is an eagle, but it isn't really the Nazi eagle. Or is it?' It's just, like, ugh."

Originally posted to the Trust & Safety Foundation website.



Comments on “Content Moderation Case Study: Knitting Community Ravelry Bans All Talk Supporting President Trump (2019)”

Vidiot says:

No such content moderation worries for another group of textile-affinity hobbyists: the "Tiny Pricks Project" created a worldwide group of embroiderers who, presumably using tiny pricks, added inane quotes from The Orange One and his henchman to vintage pieces of embroidery they’d found at tag sales and thrift shops. At times, new submissions appeared within the hour of their utterance, mostly on Instagram… crafters working overtime to soothe their collective outrage.

Scary Devil Monastery says:

Re: Re: Not Really

"Why do you lie?"

Because he won’t get any sympathy if his argument of non-moderation is backed by what he really thinks: that he’s sad that nazis and Proud Boys are being kicked off Twitter and Facebook for chanting racist slogans, the same way they’re being booted out of most bars.

That One Guy says:

Re: Re: Re: Not Really

The sad thing is it’s not like he’d get less sympathy if he was at least honest about it. It’s not like he’s fooling anyone about who or what he’s really arguing for at this point, given the very telling silence anytime someone tries to pin him down on specifics about what exactly he thinks people are getting banned for that’s such a problem.

Scary Devil Monastery says:

Re: Re: Re: Re: Not Really

"The sad thing is it’s not like he’d get less sympathy if he was at least honest about it…"

Yeah, the problem the "alt-right" of today has is that they’re both stupid and cowardly. They are just bright enough to realize people as a whole don’t like their views, and too scared to fess up to what they really think. They’re so pathologically desperate for non-white-trash company that they wriggle away like bloody pill bugs every time someone shines a light on the topic.

The original nazis were at least monsters who inspired the design of Bad Guys in filmography for generations. They had the fsking chutzpah to stand for what they believed in. Along with their fashion sense, that’s just about the only admirable thing about them.

The neo-nazis of today? Utterly pathetic. Semi-literate chimps afraid of their own shadows, with Cadet Bonespur "leading" them by shouting encouragement from a safe distance. At least Hitler led his Beer Hall Putsch from the front.

Anonymous says:

When I’ve helped moderate topic-specific forums, our rule has been to delete all off-topic posts. I can understand moderators not deleting inoffensive posts, but if a gang of spammers offends the community, the moderators’ JOB is to STOP the spam, and to review the rules to make sure all moderators understand the problem.

If the posters had been doing nothing more aggressive than post patterns for trump-faced pumpkins (or vice versa) or MAGA caps, I am sure there would have been no rule changes. But that is not what happened. This isn’t about being political. This is about being deliberately offensive.

That One Guy says:

Re:

If the posters had been doing nothing more aggressive than post patterns for trump-faced pumpkins (or vice versa) or MAGA caps, I am sure there would have been no rule changes. But that is not what happened. This isn’t about being political. This is about being deliberately offensive.

I’ve no idea why you’d say that; as demonstrated post-ban, the poor victimized Trump cultists are the height of class and civility, something I’m sure was just as true before the ban as it was afterwards, which leaves the justification for said ban based upon nothing but hypotheticals with no basis in reality.

Anonymous says:

Re:

Why do stupid people make this argument? Masnick makes it too. He is also very stupid. Very stupid.

Let’s look up the definition of censorship:

censorship
noun
1a : the institution, system, or practice of censoring — They oppose government censorship.
b : the actions or practices of censors; especially : censorial control exercised repressively — censorship that has … permitted a very limited dispersion of facts — Philip Wylie
2 : the office, power, or term of a Roman censor
3 : exclusion from consciousness by the psychic censor

Censorship does not require that a person be unable to express an idea. If that were the case, then the only true censorship would be killing someone.

Removing posts from an online board is censorship. It does not matter if it is removing spam or removing content a person dislikes. It is censorship. It is also legal unless the government does it (some exceptions exist).

This is censorship. It is legal. The only reason you do not see it as censorship is because you are willfully stupid.

PaulT says:

Re: Re:

The reason why moderation and censorship are used differently is because they have different implications. Moderation implies that the owner of the property is enforcing house rules. Censorship implies greater restriction.

This is a useful distinction in many ways. If the BBFC cuts or refuses to certify a film, that’s censorship, since a BBFC rating is legally required to distribute a film in the UK. If the MPAA makes the same cuts or refuses to certify a film in the same way in the US, that’s not really censorship in the same way, since the film can still be legally distributed. Same action, perhaps, but it’s useful to apply different terms to describe the fundamental difference in what’s been done.

So it is with this discussion. Some people are desperate to pretend that getting their asses kicked off Twitter means that they’ve lost the ability to speak publicly and cry "censorship". Some of us prefer to leave that word for cases where someone has actually been prevented from speaking.

Anonymous says:

Re: "content moderation"

Really? You’re really bitching about them limiting Trump’s audience of the knitting community? I’m sure that’s what cost him the popular vote twice.

What in the fucking fuck do you think that ignorant piece of shit or his mail-order whore have ever knitted?

This is why people like myself think you’re fucking stupid.

Anonymous says:

If you ever find yourself being consistently moderated out of a particular venue, change your choice of topics, or your concept of an inoffensive approach to persuasion, or your venue … or expect everyone who knows you online to think you are stupider than yeast. At least yeast evolves to thrive in different environments, eh?

Your choice, your fate.
