Trigger warning: This article contains descriptions of violent and abusive videos. Reader discretion is advised.
“The devil of this job is that you get sick slowly—without even noticing it… You think it’s not a big deal, but it does affect you.”
These are the words of Wisam, a former content moderator, describing his experience working for TikTok to the online news site Business Insider.
The critical job of overseas moderators
When you watch a video on Facebook (whose parent company is now called Meta), Instagram, TikTok, YouTube, or any other prominent social media platform, you don’t have to worry about seeing a group of teenagers beat an old man with an axe, a man shoot himself with a hunting gun, or someone violently kill a cat.
The reason you don’t have to is that people like Wisam and his coworkers are doing it for you.
But why? Why can’t social media sites use cool tech like artificial intelligence and machine learning to moderate social media content?
While TikTok and other social media sites do use artificial intelligence to help review content, the technology isn’t always perfect—especially when it comes to videos that feature languages other than English.
For this reason, humans are still used to review a majority of the most horrifying videos on the platform. And their work is essential, ensuring that advertisements from reputable companies like Nike don’t appear alongside porn or violent material.
The moderating experiences of Imani and Samira
Imani was 25 when, in September 2020, she took a job as a TikTok content moderator through the temporary staffing agency Majorel. Living in a one-bedroom apartment in Casablanca, Morocco, she was brought on to help support the company’s growth in the Middle East.
Although she had a bachelor’s degree in English, she took the $2-an-hour job because she was struggling to find work during the early part of the pandemic and because her husband, a technician, could not support their infant daughter on his own.
The work was so mentally distressing that she left the job almost as quickly as she had started it, and she says she is still dealing with the effects roughly two years later.
Imani is not alone in her story.
Nine current and former content moderators in Morocco who worked for TikTok through Majorel described to Business Insider experiences of severe psychological distress as a result of their jobs.
Samira, 23, was one of them. In addition to the distress of viewing gruesome content, Samira said that she and her colleagues were treated like “robots.”
She was tasked with reviewing 200 videos every hour while maintaining an accuracy score of 95%. The score was calculated by how closely her tags matched those of more senior content moderators who reviewed the same videos.
However, three months into the job, the goalposts shifted: her manager bumped the quota up to 360 videos per hour. In other words, Samira had just 10 seconds per video while still being held to an extremely demanding accuracy requirement. Understandably, she ended up leaving, too.
It’s a sad truth, but it seems the cost of keeping social media sites safe is victimizing others in the process. Just ask Imani and Samira.
Why is toxic content uploaded to sites like TikTok?
It’s pretty simple, actually. This content exists because there’s a massive demand for it.
In particular, one type of content that tends to be heavily demanded is “underage nude children,” according to Ashley Velez, a Las Vegas-based moderator. Unfortunately, Velez’s account of what she saw while moderating is supported by plenty of other sources.
Forbes identified the social platform as a “magnet for predators,” while the National Center on Sexual Exploitation named TikTok to its 2020 Dirty Dozen List because the platform made it easy for strangers to access, groom, abuse, and traffic children.
The BBC also did a separate investigation into TikTok’s toxic sexual content. They found whole communities that encouraged “soliciting images of boys and girls” and hashtags specifically for sharing nudes. In addition, the investigation revealed hundreds of sexually explicit comments on videos of children as young as nine years old.
Clearly, this is a massive issue, especially given how much porn does to normalize, promote, and facilitate the persistent problem of extreme content online.
Why this matters
Whether it’s gory, sexually violent, or otherwise exploitative, user-generated content on these platforms leaves room for unthinkable, illicit, explicit material. As TikTok and content platforms like it grow, the graphic content will, too, and hopefully the moderating force with it.
But for the moderating force to grow and improve, will the mental and emotional well-being of exploited individuals be sacrificed in the process?
What it comes down to is this: if there was no demand for explicit content, it wouldn’t be uploaded, and if it wasn’t uploaded, people like Imani and Samira might still have their jobs without the negative mental health effects.
To learn more about how much porn is on each social media platform, click here.
Your Support Matters Now More Than Ever
Most kids today are exposed to porn by the age of 12. By the time they’re teenagers, 75% of boys and 70% of girls have already viewed it (Robb, M. B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense), often before they’ve had a single healthy conversation about it.
Even more concerning: over half of boys and nearly 40% of girls believe porn is a realistic depiction of sex (Martellozzo, E., Monaghan, A., Adler, J. R., Davidson, J., Leyva, R., & Horvath, M. A. H. (2016). “I wasn’t sure it was normal to watch it”: A quantitative and qualitative examination of the impact of online pornography on the values, attitudes, beliefs and behaviours of children and young people. Middlesex University, NSPCC, & Office of the Children’s Commissioner). And among teens who have seen porn, more than 79% use it to learn how to have sex (Robb, M. B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense). That means millions of young people are getting sex ed from violent, degrading content, which becomes their baseline understanding of intimacy. Of the most popular porn videos, 33%-88% contain physical aggression and nonconsensual violence-related themes (Fritz, N., Malic, V., Paul, B., & Zhou, Y. (2020). A descriptive analysis of the types, targets, and relative frequency of aggression in mainstream pornography. Archives of Sexual Behavior, 49(8), 3041-3053. doi:10.1007/s10508-020-01773-0; Bridges, A. J., et al. (2010). Aggression and sexual behavior in best-selling pornography videos: A content analysis. Violence Against Women).
From increasing rates of loneliness, depression, and self-doubt, to distorted views of sex, reduced relationship satisfaction, and riskier sexual behavior among teens, porn is impacting individuals, relationships, and society worldwide (Fight the New Drug. (2024, May). Get the Facts [series of web articles]. Fight the New Drug).
This is why Fight the New Drug exists—but we can’t do it without you.
Your donation directly fuels the creation of new educational resources, including our awareness-raising videos, podcasts, research-driven articles, engaging school presentations, and digital tools that reach youth where they are: online and in school. It equips individuals, parents, educators, and youth with trustworthy resources to start the conversation.
Will you join us? We’re grateful for whatever you can give—but a recurring donation makes the biggest difference. Every dollar directly supports our vital work, and every individual we reach decreases sexual exploitation. Let’s fight for real love: