Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs their decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Bumble Shuts Down Sharon Stone's Account, Not Believing It's Really Her (2019)

from the bees-and-catfish dept

Summary: Almost any platform that allows users to create accounts eventually has to deal with questions of identity and impersonation. Many platforms set up systems like "verified" or "trusted" status for certain recognizable accounts. Others adopt real-name policies or try to verify all users. But services often discover the particular challenges that come with verifying celebrity users.

While it's one thing to offer verified accounts on platforms like Twitter, Facebook, or Instagram, which are often used for promotion and connection, verification on a dating site is different and more complicated. Setting up fake personas on dating sites to lure people into misleading relationships (for a wide variety of reasons) is so common that it spawned a whole new term: catfishing. Many dating sites now take user verification quite seriously, not just to avoid catfishing, but for the safety and protection of their userbase, who, by definition, are trying to meet someone new with the hope of getting together in person.

Bumble is a popular dating app built around the premise of being safer and more responsive to the needs of female daters. The app includes a verification feature that asks the user to upload selfies mimicking specific poses shown in sample photos the app sends, which are then reviewed by a team member. The idea is that if a user were faking images by pulling them from online profiles or generating them via AI, it would be much harder to produce a new image matching the requested pose.
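
As a rough illustration of the pose-challenge idea, here is a minimal Python sketch. Everything in it is assumed for illustration: the pose list, function names, and thresholds are hypothetical, and the pose- and face-matching models are stubbed out as plain numeric scores. Bumble's actual pipeline is not public.

```python
import random

# Hypothetical poses the app might ask a user to mimic.
POSES = ["peace_sign", "thumbs_up", "hand_on_chin", "wave"]

def issue_challenge() -> str:
    """Pick a random pose for the selfie challenge. Choosing the pose
    on the spot is what makes stolen or AI-generated photos hard to
    reuse: the faker would need a fresh image matching this pose."""
    return random.choice(POSES)

def triage(pose_score: float, face_score: float) -> str:
    """Route one verification attempt given two model scores in [0, 1]:
    pose_score - how well the selfie matches the requested pose
    face_score - how well the selfie's face matches the profile photos
    Thresholds here are illustrative, not real production values."""
    if pose_score >= 0.9 and face_score >= 0.9:
        return "auto_verified"
    if pose_score <= 0.3 or face_score <= 0.3:
        return "rejected"
    # Everything in between goes to a human reviewer, mirroring the
    # team-member review step described above.
    return "needs_human_review"

print(issue_challenge())   # e.g. "thumbs_up"
print(triage(0.95, 0.55))  # "needs_human_review"
```

Note that a borderline account reported by many users, as happened here, would likely land in the human-review queue either way; the tradeoff is how suspicious those reviewers should be by default.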

Apparently, however, this form of verification ran into a problem when the actress Sharon Stone decided to use Bumble to meet potential dates. Users who matched with her, perhaps understandably, had difficulty believing that a famous Hollywood star would be using a dating app like Bumble, and they reported the account. Staff reviewers at Bumble were (again, reasonably) equally suspicious of the account, leading them to suspend it.

Bumble quickly restored the account, and did so in a good-natured way, wishing her luck in "finding your honey."

Decisions to be made by Bumble:

  • What systems do you use to verify users are who they say they are?

  • How much weight should be given to user reports claiming that accounts they matched with are not real?

  • How do you handle celebrities, whose accounts people may not believe are legitimate?

  • What appeals process should there be for blocked accounts that were deemed to be fake?

Questions and policy implications to consider:

  • On dating apps in particular, user safety is key, so should sites default toward overblocking suspicious accounts rather than hesitating to act?

  • Are there other forms of verification that would alleviate problems similar to the one Stone faced here?

  • Stone was able to get her account reinstated quickly because of her fame; does the existing appeals process work as well for users who don't have that pull?

Resolution: Bumble was pretty quick to restore Stone's account after she tweeted about it, and major news organizations picked up the story. A few months later, Stone admitted that she suspected she had been reported by men who were upset she had turned them down on the platform.

"I think that I said no to a couple of people that thought that it would be a nice way to be not-so-kind back," she explained. "I think some people don't like to hear, 'No, no I don't want to go out with you.'"

She also noted that she has "made some nice friends" on the site.

In the meantime, some have argued that Bumble deliberately blocked Stone in order to generate publicity. Of course, that would only have worked if the company knew Stone would complain about the block publicly, which was certainly not guaranteed.

Originally posted to the Trust & Safety Foundation website.

Companies: bumble
