Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs their decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Ask.fm Responds After A Teen's Suicide Is Linked To Bullying On The Site (August 2013)

from the difficult-content-moderation-questions dept

Summary: After a UK teen took her own life in response to bullying on the social networking site ask.fm, her father asked both the site and the UK government to take corrective measures to prevent further tragedies. This wasn’t an isolated incident: reports linked multiple suicides to bullying on the teen-centered site.

Ask.fm’s problems with bullying and other abuse appeared to be far greater than those observed on other social media sites. Part of this appeared to be due to the site’s user base, which skewed much younger than those of more-established social media platforms. This, combined with the option to create anonymous accounts, seemed to have made ask.fm a destination for abusive users. What moderation existed before these problems became headline news was apparently ineffective, resulting in a steady stream of horrific stories until the site began making serious efforts to curb a problem that had become too big to ignore.

Ask.fm’s immediate response to criticism from both the teen’s father and UK Prime Minister David Cameron (who called for a boycott of the site) was to point to existing moderation efforts meant to deter bullying and other terms-of-service violations.

After major companies pulled their advertising, ask.fm pledged to assist police in investigating the circumstances behind the teen’s suicide and to consult a law firm about whether its moderation efforts could be improved. It also hired more moderators and a safety officer, and made its “Report” button more prominent.

More than a year after ask.fm became the target of criticism around the world, the site implemented its first Safety Advisory Board. The group of experts on teens and their internet use was tasked with reducing the amount of bullying on the platform and making it safer for its young users.

More significantly, ask.fm’s founders — who were viewed as unresponsive to criticism — were removed by the site’s new owners, InterActiveCorp (IAC). IAC pledged to work more closely with US law enforcement and safety experts to improve moderation efforts.

Decisions to be made by ask.fm:

  • Should anonymous accounts be eliminated (or restricted by collecting IP addresses/personal info) to limit abusive behavior?
  • Does catering to a younger user base create unique problems not found at sites that skew older?
  • Would more transparency about moderation efforts/features nudge more users towards reporting abuse?
  • Should the site directly intervene when moderators notice unhealthy/unwanted user interactions?

Questions and policy implications to consider:

  • Given the international reaction to the teen’s suicide, does a minimal immediate response make the perceived problem worse?
  • Does having a teen user base increase the risk of direct regulation or unfavorable legislation, given the increased privacy protections for minors in many countries?
  • Are moderation efforts resulting from user reports vetted periodically to ensure the company isn’t making bullying/trolling problems worse by allowing abusive users to get others suspended or banned?

Resolution: When immediate steps did little to deter criticism, ask.fm formed a Safety Advisory Board and, ultimately, dismissed founders who appeared to be unresponsive to users’ concerns. The site made changes to its moderation strategies, hired more moderators, and made users more aware of the features they could use to report users and avoid unwanted interactions.

Companies: ask.fm


Comments on “Content Moderation Case Study: Ask.fm Responds After A Teen's Suicide Is Linked To Bullying On The Site (August 2013)”

Ann Brush (profile) says:

Has the bullying situation improved?

Following the suicide, some assessment of bullying and the reporting thereof would be very illuminating. I see what they have done, and some policy and position considerations, but what’s missing are metrics regarding whether the bullying situation improved in the time since the suicide, during which all these efforts were made. Is what they are doing making any difference?

