Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Linkedin Blocks Access To Journalist Profiles In China (2021)

from the appeasement dept

Summary: A major challenge for global internet companies is figuring out how to deal with the different rules and regulations in different countries. This has proven especially difficult for internet companies looking to operate in China, a country in which many of the most popular global websites are blocked.

In 2015, there was an article highlighting how companies like Evernote and LinkedIn had avoided getting blocked in China, mainly by complying with the Chinese government’s demands that they moderate certain content. In that article, LinkedIn’s then-CEO Jeff Weiner noted:

“We’re expecting there will be requests to filter content,” he said. “We are strongly in support of freedom of expression and we are opposed to censorship,” he said, but “that’s going to be necessary for us to achieve the kind of scale that we’d like to be able to deliver to our membership.”

Swedish journalist Jojje Olsson tweeted the article when it came out. Six years later, LinkedIn informed Olsson that his own LinkedIn profile would no longer be available in China because it referenced the Tiananmen Square massacre.

In his tweet, Olsson explains that his LinkedIn profile mentions that, for his degree, he wrote an essay about the Tiananmen Square massacre. It quickly became clear that LinkedIn was in the process of blocking access to multiple journalists’ and academics’ accounts in China, including those of CNN Beijing bureau chief Steven Jiang and the editor-in-chief of the Taiwan Sentinel, J. Michael Cole. The Wall Street Journal found at least 10 other LinkedIn accounts that were blocked in China around the same time, and highlighted that LinkedIn officials had been reprimanded in March of 2021 for keeping certain accounts available in China.

China’s internet regulator summoned LinkedIn officials in March to tell them to better regulate its content, according to people familiar with the matter. The social-networking site was given 30 days to clean up the content and promised to better regulate its site going forward, the people said.

Shortly after, LinkedIn said in a statement on its website that it would be pausing new member sign-ups as the platform worked “to ensure we remain in compliance with local law.” — Liza Lin, Wall Street Journal

The NY Times report on that meeting noted that the 30-day pause on sign-ups was part of what Chinese officials ordered.

The users whose profiles were blocked received a notice from LinkedIn about the block, saying “We will work with you to minimize the impact and can review your profile’s accessibility within China if you update the relevant sections of the profiles,” while also noting that “the decision whether to update your profile is yours.” The notice also included this paragraph:

While we strongly support freedom of expression, we recognized when we launched that we would need to adhere to the requirements of the Chinese government in order to operate in China. As a reminder, your profile will remain viewable throughout the rest of the countries in which LinkedIn is available.

It appears that LinkedIn was also directly removing some specific content. Former journalist Peter Humphrey told Bloomberg News that LinkedIn informed him that it had completely removed certain comments he made criticizing the Chinese government.

Company Considerations:

  • How important is it to remain accessible in China?
  • What compromises are worth making to remain accessible in China or other countries?
  • If the company agrees to take down, or block access to, certain content to appease government demands, how should those decisions be communicated to impacted users?
  • Under what conditions, if any, will the company push back on overbroad demands to block content in China?

Issue Considerations:

  • Is it better for a censored, but still mostly available, US-based service to remain accessible in China than for the entire service to be blocked there?
  • Local regulations differ across every country. What kind of framework should a company use to determine where it draws the line, and what compromises it will agree to?

Resolution: Since the initial flurry of notices that drew attention from May through July of 2021, even more journalists appear to have found their profiles blocked in China. In September, Sophia Yan, the China correspondent for the UK’s Telegraph, noted that her LinkedIn profile was now blocked in China. Replying to Yan’s tweet, Liza Lin, the Wall Street Journal’s China correspondent and author of the article quoted earlier about LinkedIn officials being reprimanded by Chinese regulators, noted that her own profile had been blocked in China as well.

LinkedIn, for its part, has continued to make similar statements throughout, saying that it supports the principles of free speech but that in order to continue operating in China, it is required by the government to block access to these accounts.

Update: Just weeks after this case study was originally published, and after LinkedIn was called out for even more of this activity, the company announced that it was mostly exiting the country, as the demands for censorship had become too much.

Originally posted to the Trust & Safety Foundation website.

Companies: linkedin, microsoft


Comments on “Content Moderation Case Study: Linkedin Blocks Access To Journalist Profiles In China (2021)”

1 Comment
Tanner Andrews (profile) says:

Good Luck Making Money in Red China

I am not entirely sure that Linkedin has a good plan for Red China anyway. The costs of complying with government censorship requirements have, as was inevitable, become onerous.

However, assume that they can find a path to operate there with a “limited” site. How does Linkedin get real money, and how do they get it back to the main office in the States? The natives have, if anything, whatever is used in place of money over there.

Ultimately, the Linkedin stockholders are going to want actual US dollars. They would certainly prefer that their profits not be subject to nationalization by foreign powers, too.
