Facebook Content Moderators Around The World Reportedly Suffer Severe PTSD From Being Exposed To Insanely Disturbing Content On A Daily Basis


Guardian - More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.

The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm and were found to have PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD), by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.

The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

The images and videos including necrophilia, bestiality and self-harm caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.

Damn. What do you think the vibes are like on a typical workday at Samasource Kenya HQ? Probably so bad. Sometimes I feel like I need a cold shower after scrolling the Twitter 'For You' page. And that's the stuff deemed acceptable by our brave content gatekeepers. I can't even fathom how large of a 151 & Xanax cocktail I would need to take down every morning to survive a 10-hour shift as a Kenyan Facebook content moderator. 

Facebook went through the same thing with American employees back in 2018, and eventually had to pay out $52M to content moderators in a class-action lawsuit.

BBC - Facebook has agreed to pay $52m (£42m) to content moderators as compensation for mental health issues developed on the job.

In 2018, a group of US moderators hired by third-party companies to review content sued Facebook for failing to create a safe work environment.

The moderators alleged that reviewing violent and graphic images - sometimes of rape and suicide - for the social network had led to them developing post-traumatic stress disorder (PTSD).

That's such a fucked up job no matter what country you're doing it in. And it's not something that requires a unique set of skills, so these companies don't have to pay much. I'm sure when employees take that job they think they'll be able to handle it no problem. People who are really good at compartmentalizing things, who are somehow able to remove themselves from their own brain while at work, probably do. Others probably feel like they're handling it fine until they randomly wake up screaming in the middle of the night, or start sleep-beating their wives for no apparent reason. Or they don't put two and two together that their heroin addiction perfectly coincides with when they started watching rape videos professionally.

I totally get sedating yourself to get through that job. Trust me. But a specific part of this paragraph about the Kenyan moderators was crazy to me.

Guardian - At least 40 of the moderators in the case were misusing alcohol, drugs including cannabis, cocaine and amphetamines, and medication such as sleeping pills.

"Fuck another day at the hell factory. Time to get JACKED THE FUCK UP and really focus in on this bestiality."

Would be really nice if Meta, a trillion-dollar company, would be slightly less terrible and actually pay some employees a healthy wage to do this work, instead of outsourcing it all to cheap third-party companies. They wouldn't even have to pay anything crazy. But maybe instead of $60k per year (or less), they could pay them $120k. They'd get better employees who are probably more fit to actually handle the job. Or at least employees who have the resources to be able to step away from the job if they feel like they're starting to lose their fucking minds. It really shouldn't be anyone's full-time job. It feels like a job that nobody should be allowed to do for more than a week at a time. Maybe that's a solution. Let people sign up to work a week at a time. Pay them a couple grand for 40-50 hours of work. They'd have plenty of people sign up, and nobody would have to be exposed to that heinous stuff for an extended period of time.

But that's probably unrealistic. It's insane to suggest that a company like Meta would do something like overpay lower-level employees when they don't absolutely have to. Idk why I even had that thought.

A lot of people have negative opinions of AI. They're scared that someday it will grow too powerful and take over the world. Or even worse, take their job. And I agree that in a lot of cases, when it comes to AI, we point our research in the wrong direction. So to all the AI engineers out there who've dedicated their lives to research that will someday allow us to type "Alec Baldwin fucking a cactus" into ChatGPT and have it spit out an "Alec Baldwin fucking a cactus" video that's indistinguishable from real life: maybe redirect your focus to teaching your robots to identify necrophilia and child porn, so that we don't have to pay people $60k per year to do a job that costs $40k per year in drugs to do without killing themselves.