
Twitter Sued for Reportedly Distributing and Profiting from Child Abuse Images

Plaintiff John Doe was horrified to find out explicit videos of himself—made at age 13 under duress by traffickers—were posted to Twitter.

Portions of the following article were originally shared in a press release by the National Center on Sexual Exploitation.

Disclaimer: Fight the New Drug is a non-religious and non-legislative awareness and education organization. Some of the issues discussed in the following article are legislatively-affiliated. Including links and discussions about these legislative matters does not constitute an endorsement by Fight the New Drug. Though our organization is non-legislative, we fully support the regulation of already illegal forms of pornography and sexual exploitation, including the fight against sex trafficking.

The National Center on Sexual Exploitation Law Center (NCOSE), The Haba Law Firm, and The Matiasic Firm have jointly filed a federal lawsuit against Twitter on behalf of two minors who were trafficked on the social media platform, which boasts over 330 million users.
 
The plaintiffs, John Doe #1 and John Doe #2, both minors, were reportedly harmed by Twitter’s distribution of material depicting their sexual abuse, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified by the plaintiffs and the plaintiffs’ parents. The case, John Doe v. Twitter, was filed in the United States District Court for the Northern District of California.

Related: Apple Fights Child Abuse Images By Scanning Users’ Uploaded iCloud Photos

John Doe #1, now 18, and John Doe #2 say they were 13 years old when a sex trafficker posing as a 16-year-old girl tricked them into sending pornographic videos of themselves through the social media app Snapchat. A few years later, in January 2020, when they were in high school, links to those videos began appearing on Twitter.


The plaintiffs say they alerted law enforcement about the tweets and urgently requested that Twitter remove them. Through Twitter’s reporting system—which, according to its policies, is designed to catch and stop the distribution of illegal material like child sexual abuse material (CSAM)—the Doe family verified that John Doe was a minor and that the videos needed to be taken down immediately.

Instead of removing the videos, NCOSE reports, Twitter did nothing—even reporting back to John Doe that the video in question did not, in fact, violate any of its policies.

Reportedly, Twitter refused to take down the content until nine days later, when a Department of Homeland Security agent contacted Twitter and urged action. By that point, the lack of care and proper attention had resulted in the posts receiving 167,000 views and 2,223 retweets, according to the lawsuit.

“As John Doe’s situation makes clear, Twitter is not committed to removing child sex abuse material from its platform. Even worse, Twitter contributes to and profits from the sexual exploitation of countless individuals because of its harmful practices and platform design,” said Peter Gentala, senior legal counsel for the National Center on Sexual Exploitation Law Center. “Despite its public expressions to the contrary, Twitter is swarming with uploaded child pornography and Twitter management does little or nothing to prevent it.”

The John Does are now suing Twitter for its involvement in and profiting from their sexual exploitation, which violates the Trafficking Victims Protection Reauthorization Act and various other protections afforded by law.

In August 2021, a federal judge found Twitter may have benefitted financially from ad revenue generated by tweets containing child sexual abuse material.

Read NCOSE’s full press release here.


Is this the first time Twitter has shared child abuse images?

Accessing CSAM used to be difficult, like finding a needle in a haystack. Today, child exploitation is shared through P2P (file sharing) networks, encrypted messaging applications like WhatsApp, social media, adult pornography sites, and even suggested as a search option on Microsoft Bing. It’s even easily accessible on Twitter these days, as this lawsuit clearly shows.

It seems obvious that such abuse should be eradicated. The question is, how? Is such a mission even possible? And if so, whose responsibility is it to end child porn?

These are urgent questions, made more pressing not only by child abusers and exploiters sharing CSAM on the platform, but by the adult industry at large.

Related: How Mainstream Porn Is Connected To Arrests For Child Abuse Images

Since the start of the COVID-19 pandemic, the rise of the “not-safe-for-work” subscription-based site OnlyFans has shown just how prolific child exploitation images are on Twitter, specifically.

Followers on OnlyFans pay sexual content creators a monthly subscription fee ranging anywhere from $4.99 to $49.99. Creators can also charge tips or paid private messages starting at $5, which are the real moneymakers for those with a loyal subscriber base. And while OnlyFans has an age verification system meant to ensure sexual content creators are over 18, it can be easily bypassed.

Many OnlyFans creators use Twitter to advertise selling nudes and drive traffic to their profiles—particularly through trending hashtags like #teen and #barelylegal. And while there clearly are underage creators on OnlyFans, on the flip side, many adult creators give the illusion of being under 18 to grow their fan base.


In the BBC’s 2020 documentary about underage content on OnlyFans, Yoti—a platform that helps individuals prove their identity online—scanned 20,000 Twitter accounts using the hashtags #nudesforsale and #buymynudes, which are commonly used to direct followers to OnlyFans, to detect how many users were underage. In just one day, out of 7,000 profiles where faces could be detected and analyzed, they found that 33%, or over 2,500 profiles, were very likely underage.

Related: Report Reveals One-Third Of Online Child Sex Abuse Images Are Posted By Kids Themselves

Clearly, the rise in popularity of OnlyFans is causing an influx of underage content generation—legally defined as child exploitation imagery—even outside of the platform itself.

While this data helps us understand the scale of underage girls being drawn to and exploited on these platforms, it’s clearly just the tip of the iceberg.

Click here to learn how to report child sexual abuse images.

Your Support Matters Now More Than Ever

Most kids today are exposed to porn by the age of 12. By the time they’re teenagers, 75% of boys and 70% of girls have already viewed it (Robb, M.B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense)—often before they’ve had a single healthy conversation about it.

Even more concerning: over half of boys and nearly 40% of girls believe porn is a realistic depiction of sex (Martellozzo, E., Monaghan, A., Adler, J. R., Davidson, J., Leyva, R., & Horvath, M. A. H. (2016). “I wasn’t sure it was normal to watch it”: A quantitative and qualitative examination of the impact of online pornography on the values, attitudes, beliefs and behaviours of children and young people. Middlesex University, NSPCC, & Office of the Children’s Commissioner). And among teens who have seen porn, more than 79% use it to learn how to have sex (Robb, M.B., & Mann, S. (2023). Teens and pornography. San Francisco, CA: Common Sense). That means millions of young people are getting sex ed from violent, degrading content, which becomes their baseline understanding of intimacy. Among the most popular porn, 33%–88% of videos contain physical aggression and nonconsensual, violence-related themes (Fritz, N., Malic, V., Paul, B., & Zhou, Y. (2020). A descriptive analysis of the types, targets, and relative frequency of aggression in mainstream pornography. Archives of Sexual Behavior, 49(8), 3041–3053. doi:10.1007/s10508-020-01773-0; Bridges et al. (2010). Aggression and sexual behavior in best-selling pornography videos: A content analysis. Violence Against Women).

From increasing rates of loneliness, depression, and self-doubt, to distorted views of sex, reduced relationship satisfaction, and riskier sexual behavior among teens, porn is impacting individuals, relationships, and society worldwide (Fight the New Drug. (2024, May). Get the Facts (series of web articles)).

This is why Fight the New Drug exists—but we can’t do it without you.

Your donation directly fuels the creation of new educational resources, including our awareness-raising videos, podcasts, research-driven articles, engaging school presentations, and digital tools that reach youth where they are: online and in school. It equips individuals, parents, educators, and youth with trustworthy resources to start the conversation.

Will you join us? We’re grateful for whatever you can give—but a recurring donation makes the biggest difference. Every dollar directly supports our vital work, and every individual we reach decreases sexual exploitation. Let’s fight for real love: