Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Studies: Coca Cola Realizes Custom Bottle Labels Involve Moderation Issues (2021)

from the content-moderation-is-everywhere dept

Summary: Content moderation questions can come from all sorts of unexpected places, including custom soda bottle labels. Over the years, Coca Cola has experimented with a variety of promotional efforts involving customized cans and bottles, not always without controversy. Back in 2013, as part of its “Share a Coke” campaign, the company offered bottles with common first names on the labels, which angered some who felt left out. In Israel, for example, people noticed that Arabic names were left off the list, although Coca Cola’s Swedish operation said that this decision was made after the local Muslim community asked not to have their names included.

This controversy was only the preamble to a bigger one in the summer of 2021, when Coca Cola began its latest version of the “Share a Coke” effort — this time allowing anyone to create a completely custom label up to 36 characters long. Opening up custom labels immediately raised content moderation questions.

People quickly noticed that some surprising terms and phrases were blocked (such as “Black Lives Matter”), while others were surprisingly allowed (like “Nazis”).

As CNN reporter Alexis Benveniste noted, it was easy to get offensive terms through the blocks (often with a few tweaks), and there were some eye-opening contrasts:

For example, “Black Lives Matter,” is blocked. But “White Lives Matter” isn’t. Coke included a special rainbow label for pride month, but you can’t write “Gay Pride” on the bottle. However, you can write “I hate gays.” “Hitler” and “Nazi” are banned, but users can customize bottles with the phrases, “I am Hitler” or “I am a Nazi.” — Alexis Benveniste

The fact that “I am Hitler” was allowed while “Hitler” by itself was not suggests that Coca Cola’s filter matched entire submitted phrases against a blocklist, rather than scanning for banned words anywhere in the text. Exact-phrase matching is easy to defeat with simple adjustments, which might also explain why “Nazi” was blocked but “Nazis” apparently was not.
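That pattern is consistent with an exact-match blocklist, where the submitted label is compared as a whole string against a list of banned entries. As a rough illustration (the blocklist entries and matching logic below are assumptions for demonstration, not Coca Cola's actual implementation), compare exact matching with substring matching:

```python
# Illustrative sketch of exact-match vs. substring blocklist filtering.
# The entries and logic are assumptions for demonstration only,
# not Coca Cola's actual filter.

BLOCKLIST = {"hitler", "nazi", "black lives matter"}

def exact_match_blocked(label: str) -> bool:
    """Block only if the whole normalized label equals a blocklist entry."""
    return label.strip().lower() in BLOCKLIST

def substring_blocked(label: str) -> bool:
    """Block if any blocklist entry appears anywhere in the label."""
    normalized = label.strip().lower()
    return any(entry in normalized for entry in BLOCKLIST)

for label in ("Hitler", "I am Hitler", "Nazi", "Nazis"):
    print(f"{label!r}: exact={exact_match_blocked(label)}, "
          f"substring={substring_blocked(label)}")

# Output:
# 'Hitler': exact=True, substring=True
# 'I am Hitler': exact=False, substring=True
# 'Nazi': exact=True, substring=True
# 'Nazis': exact=False, substring=True
```

Substring matching catches the trivial variants, but it over-blocks in its own way (the classic “Scunthorpe problem,” where innocuous text happens to contain a banned string), which may be part of why Coca Cola paired the automated filter with later human review.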

Coca Cola insisted that the automated blocks in its web tool were not the only layer of review, just a first filter before requests were passed on for production, and that “actual bottles are not made with words that are inconsistent with the program’s intent.”

Company Considerations:

  • What kinds of tools, systems, staff, and processes should be put in place to deal with potential “abuse” of a custom labels program?
  • How should the “intent” of the program be communicated to consumers who want their own bottles, but may ask for problematic content on the labels?
  • How could the website more clearly inform consumers that the final text will still be reviewed by staff before production, so the public does not assume that any word or phrase the web form accepts will actually be printed?

Issue Considerations:

  • Customization systems are often put in place because they are considered fun and engaging, and a way for consumers to connect with a brand. How should companies weigh such benefits against the likelihood of abuse?
  • How should companies wishing to use these types of customization options weigh the potential consumer backlash over what users perceive as both over-moderation and under-moderation?
  • As it becomes easier to mass produce customized products, how should companies set up campaigns to minimize possible abuses, while balancing the backlash if they disallow certain words or phrases that are important to certain groups?

Resolution: Coca Cola told CNN that the process was constantly being adjusted: “We’re continuously refining and improving our Share A Coke personalization tool to ensure it is used only for its intended purpose.” The company also noted that it had added language to the preview screen saying that “proposed language may require further review.” It did not explain why terms like “Black Lives Matter” were not approved.

Originally posted to the Trust & Safety Foundation website.

Companies: coca cola


Comments on “Content Moderation Case Studies: Coca Cola Realizes Custom Bottle Labels Involve Moderation Issues (2021)”

14 Comments
Anonymous Coward says:

Re: Now with 100% more Section 230

Disclosure: Not Koby, but I’ll give it a whirl…

What’s wrong with this? Coke is just exercising its property rights. Why should they allow just any message to be on their labels? Coke owns their labels.

If you want to print a message that Coke doesn’t like, open your own soda bottling company, create a recipe that is loved worldwide, establish your own shipping partners, and send out your own vending machines. Simple. Quit demanding that third parties exist in the same world with a message that they hate.

/s

That Anonymous Coward (profile) says:

Re: Re:

Hitler did nothing wrong.
Gushing Grannies.

See also: Know your Meme – Dub the Dew
I don’t want moderation so find your own damn links.

4chan did it, Bronies did it, 4chan drove the Bronies out of the top 10, then the actual hacking of the site adding a rickroll & 911 conspiracy theories.

Anyone who puts one of these things forward as a marketing idea needs to be fired.
The internet will always win, you aren’t smarter than the last 7 idiots who did this same thing & ended up shutting it down after Hitler came to town.
The execs who approved this going forward should have to return any bonuses they earned because this was stupid & all they needed to do was ask a 14 yr old to look for any flaws.

Anonymous Coward says:

Over 50 trademarks on Black Lives Matter, fighting to the death

https://www.worldtrademarkreview.com/brand-management/controlling-black-lives-matter-the-battle-trademark-movement

I think we see now why "Black Lives Matter" is banned. Maybe Techdirt can file a trademark on White Lives Matter to see how long the site takes to block it? (I don’t think it actually has to stand up in court … trademarks are all about the fight; there is no truth)
