Before Demanding Internet Companies 'Hire More Moderators,' Perhaps We Should Look At How Awful The Job Is
from the sacrificial-lambs dept
The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”
While it was a powerful and wonderfully written piece, as we noted in February, this wasn’t new to people following the space closely. There had been previous academic papers, a documentary, and even a big guest post here at Techdirt that highlighted some of the working conditions concerns of those in content moderation jobs.
Well, now, Newton is back with another powerful and heartbreaking story of more (former) Facebook moderators revealing the truly awful working conditions they faced. It opens with the story of a content moderator who died of a heart attack on the job at age 42, then goes into details revealed by many more content moderators, all of whom broke the NDAs they signed in order to tell this story (good for them; such NDAs should not be allowed):
More than anything else, the contractors described an environment in which they are never allowed to forget how quickly they can be replaced. It is a place where even Keith Utley, who died working alongside them, would receive no workplace memorial — only a passing mention during team huddles in the days after he passed. “There is no indication that this medical condition was work related,” Cognizant told me in a statement. “Our associate’s colleagues, managers and our client were all saddened by this tragic event.” (The client is Facebook.)
There are all sorts of reasonable responses to this, and most of them amount to horror. Horror that Facebook doesn't have better control over the outside contracting firms it hires to staff its content moderation team. Horror that Facebook is outsourcing this work in the first place. Horror at the sweatshop-like setup of the content moderation operation.
Marcus was made to moderate Facebook content — an additional responsibility he says he was not prepared for. A military veteran, he had become desensitized to seeing violence against people, he told me. But on his second day of moderation duty, he had to watch a video of a man slaughtering puppies with a baseball bat. Marcus went home on his lunch break, held his dog in his arms, and cried. I should quit, he thought to himself, but I know there’s people at the site that need me. He ultimately stayed for a little over a year.
Cognizant calls the part of the building where contractors do their work “the production floor,” and it quickly filled with employees. The minimum wage in Florida is $8.46, and at $15 an hour, the job pays better than most call center work in the area. For many content moderators — Cognizant refers to them by the enigmatic title of “process executive” — it was their first real job.
In its haste to fill the workplace, Cognizant made some odd staffing decisions. Early on, the company hired Gignesh Movalia, a former investment advisor, as a moderator. Cognizant conducts background checks on new hires, but apparently failed even to run a basic web search on Movalia. Had they done so, they would have learned that in 2015 he was sentenced to 18 months in prison for his involvement in a $9 million investment fraud scheme. According to the FBI, Movalia had falsely claimed to have access to shares of a fast-growing technology startup about to begin trading on the public market.
The startup was Facebook.
But part of the blame has to go back to everyone demanding that these companies must be the arbiters of truth and what’s okay and not okay online. Remember, just a couple months ago, a lot of people were totally up in arms over the fact that Facebook (and YouTube and Twitter) didn’t automagically delete anything having to do with the Christchurch shooting, as if it was easy to snap your fingers and get rid of all that content. The only way to do it is to hire a ton of people and subject them to absolutely horrific content over and over and over again. Just to keep everyone else pure.
That, of course, is unrelated to the horrible working conditions within these facilities. The two things need not go hand in hand, and there's no reason Facebook can't create or demand better overall working conditions for the content moderators it employs or contracts out to. However, the insane demand that social media platforms somehow be perfect, in which every error is held up as some sort of moral failing by the companies, pressures those companies to hire more people, more rapidly, leading to the kinds of awful situations Newton describes in his article.
The result is a raucous workplace where managers send regular emails to the staff complaining about their behavior on the site. Nearly every person I interviewed independently compared the Tampa office to a high school. Loud altercations, often over workplace romances, regularly take place between co-workers. Verbal and physical fights break out on a monthly basis, employees told me. A dress code was instituted to discourage employees from wearing provocative clothing to work — “This is not a night club,” read an email to all employees obtained by The Verge. Another email warned employees that there had been “numerous incidents of theft” on the property, including stolen food from the office refrigerator, food from vending machines, and employees’ personal items.
Michelle Bennetti and Melynda Johnson both began working at the Tampa site in June 2018. They told me that the daily difficulty of moderating content, combined with a chaotic office environment, made life miserable.
“At first it didn’t bother me — but after a while, it started taking a toll,” Bennetti told me. “I got to feel, like, a cloud — a darkness — over me. I started being depressed. I’m a very happy, outgoing person, and I was [becoming] withdrawn. My anxiety went up. It was hard to get through it every day. It started affecting my home life.”
Both of these things can be true:
- Facebook should do a much better job with the working conditions for its moderators… and
- Continually demanding that Facebook (and others) somehow present only a perfect, squeaky-clean internet is creating incentives for a rapid and chaotic hiring spree that generates more and more problems.
The worst, however, are the people who keep demanding that Facebook “do more” while also attacking the company for these practices, as if the two things are not related. If we want the big internet companies to be the online morality police, that's one thing. But if that's going to be the case, then we need a much bigger and much more open discussion about what that means, who is involved, and how it will all work. Because if protecting “the public” from “bad” content online means subjecting tens of thousands of low-wage workers to that content all day, every day, we should at least consider whether that's a reasonable and acceptable trade-off.
Employees told me about other disturbing incidents at the Tampa site. Among them:
- An employee who used a colostomy bag had it rupture while she was at work, spilling some waste onto the floor. Senior managers were overheard mocking her. She eventually quit.
- An employee who threatened to “shoot up the building” in a group chat was placed on paid leave and allowed to return. He was fired after making another similar threat. (A Cognizant spokesperson said the company has security personnel on site at all hours. “Our goal is to ensure that our employees feel assured that they work in a safe environment,” he said.)
- Another employee broadcast himself on Facebook Live talking about wanting to bash a manager’s head in. Another manager determined that he was making a joke, and he was not disciplined.
Yes, almost everything in the article is horrific. And, yes, we should all demand that Facebook do a better job. But that just treats the symptoms rather than the larger cause. We can't keep sweeping all of this under the rug with a simple “well, the platforms need to do better.” If you're demanding that internet services protect you from “bad content,” we should have a much more open discussion about what that means, and what it will mean for the people doing the work.