
Why Did Twitter Refuse to Remove Videos of This Sex Trafficked Minor?

A lawsuit alleges Twitter made money off of child sexual abuse material of a trafficking victim after the platform refused to take down the explicit content. More specifically, the suit alleges the images and videos were permitted to remain up on the site after a Twitter investigation “didn’t find a violation” of the company’s “policies.”

January 27, 2021

On January 20, 2021, a federal suit was filed by a sex trafficking victim and his mother in the Northern District of California against social media giant Twitter.

According to the suit, Twitter made money off of child porn (also called “child sexual abuse material,” CSAM for short) images and videos of the trafficking victim after the platform refused to take down the explicit content. More specifically, the suit alleges the images and videos were permitted to remain up on the site after a Twitter investigation “didn’t find a violation” of the company’s “policies.”

Here’s what happened.

A New York Post article detailing the events that led up to the suit shared that the teen in the images and video, now a 17-year-old living in Florida, was between 13 and 14 years old when sex traffickers posing as a 16-year-old female classmate began chatting with him using Snapchat. They used this ruse to get him to share explicit photos of himself before they eventually blackmailed him.

If that’s not bad enough, the story only gets worse from there.

Before we go on, though, we want to highlight the two big issues at play here: first, the issue of sextortion itself, and second, why Twitter would refuse to remove videos of a minor’s sexual abuse. We’re going to address both of these issues in this article.

Let’s get started with the sextortion piece of the equation.

Related: Twitter Sued By Trafficking Survivor For Distributing And Profiting From Child Abuse Images


What is “sextortion?”

Sextortion occurs when non-physical forms of coercion are used to acquire sexual content from, engage in sex with, or obtain money from the victim. And, as exhibited by the story we’re sharing here, it’s not as uncommon as one might think.

The teen in this story, called “John Doe” in the suit to protect his identity, allegedly exchanged a number of nude photos with the traffickers before the traffickers began to blackmail him. They told him that if he didn’t send more sexually graphic photos and videos, they would share the nude photos he’d previously sent to them with his “parents, coach, [and] pastor,” among others.

With nowhere else to turn, Doe complied with the traffickers’ requests for more explicit content of himself and even provided requested content of himself with another child.

When the teen attempted to distance himself from the traffickers by blocking them, they stopped harassing him. However, shortly after that, the images and videos he’d sent them showed up on Twitter under two accounts that were notorious for sharing CSAM.

Related: The Tragic Case Of Tevan: Why Digital Sextortion Cases Are On The Rise

The revenge tweets were brought to Doe’s attention in January 2020 by classmates who teased, harassed, and bullied him mercilessly for his role in them. It’s no surprise that court records also showed this treatment led the teen to become suicidal.

Based on the suit’s allegations, Twitter had an opportunity to do the responsible thing by deleting the tweets. Sadly, that didn’t happen.

Twitter’s failure to respond

Following Doe’s initial complaint to Twitter, a support agent reached out and requested a copy of his ID to prove the images and videos in the tweets were of him. But even after the teen quickly complied, there was no response from the agent for a week.

Around that same time, Doe’s mother filed multiple complaints with Twitter reporting the same explicit tweets. She had no more luck than her son—no response for a week.

In the meantime, the views and retweets of the explicit tweets continued to increase.

When Twitter finally responded to say they wouldn’t be taking any action because they “didn’t find a violation of [their] policies” in the reviewed content, the tweets had been viewed over 167,000 times and had been retweeted 2,223 times.

Doe’s frustration with Twitter’s response was readily apparent in the suit.

“What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down,” the teen wrote back to Twitter.

Even though Doe’s response to the tech platform included his case number from a local law enforcement agency his parents had complained to earlier, Twitter did nothing to get rid of the CSAM tweets.

Related: “Sextortion” Scams Dramatically Increase Since COVID-19 Quarantines Began

Only after a mutual contact connected the teen’s mother with a Department of Homeland Security agent and the federal agent made a “take-down demand…did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children.”

While Twitter supposedly has “zero-tolerance for any material that features or promotes child sexual exploitation” and does everything they can to “remove content, facilitate investigations, and protect minors from harm…,” the story of Doe suggests the contrary.


Why this matters

Sextortion is an epidemic, especially in our digitally-driven dating world. Young men and women, like Doe, are backed into a corner through no fault of their own and are forced to fight desperately on their own to get out.

And that reality is made worse by the fact that the social media giants that should be helping to keep this digital playground safe seem to significantly fail at doing just that.

Related: Study Reveals Image-Based Abuse Victims Suffer Similar Trauma As Sexual Assault Victims

If you find yourself in a position similar to Doe’s, or know someone who is being extorted for intimate images, we recommend checking out the tips below, based on what victim advocates say:

Report it. Report what’s happened to the authorities in your state or country. They can help you assess your situation and may also encourage you to consider reporting to other agencies like the police. Remember, anyone can be a victim of sextortion, and you are not alone.

Don’t panic. Reach out instead—get support from a trusted friend or family member as well as an expert counseling support service if you are feeling anxious or stressed.

Stop all contact with the perpetrator. Block them and ask your friends to do the same. Consider temporarily deactivating your social media accounts (but don’t delete them as you may lose evidence that way).

Collect evidence. Keep a record of all contact from the perpetrator, particularly any demands or threats and make a note of everything you know about the perpetrator. This could include the Skype name and ID, Facebook URL and Money Transfer Control Number (MTCN).

Secure your accounts. Change the passwords for your social media and online accounts, and review the privacy and security settings of your accounts. Notify the relevant social media platform. Notify Skype, YouTube, or whichever app or social media service was used. You can find helpful links about reporting abuse to social media platforms on each particular website.

You can also utilize online resources from organizations like Thorn and the National Center for Missing & Exploited Children, both of which work to defend children from sexual abuse.

And always, always remember that you are never alone.

To report an incident involving the possession, distribution, receipt, or production of CSAM, file a report on the National Center for Missing & Exploited Children (NCMEC)’s website at www.cybertipline.com, or call 1-800-843-5678.