
Reddit knowingly left up child sexual abuse content, lawsuit alleges

The suit seeks class-action status.


Andrew Wyrich


Published Apr 26, 2021   Updated Apr 27, 2021, 10:33 am CDT

A woman has filed a lawsuit against Reddit alleging the social media site did not take enough action to stop her ex-boyfriend from repeatedly posting pornographic material of her as a 16-year-old.


The lawsuit, which was filed in the United States District Court for the Central District of California, alleges that Reddit has taken “virtually no action” to remove child pornography.


In the suit, “Jane Doe” claims that she repeatedly asked Reddit to take down “illegal videos and images depicting Plaintiff in a sex act as a minor.” The videos, the suit claims, were uploaded to Reddit by an abusive ex-boyfriend.

When she became aware of the videos on Reddit, she “immediately reported them to the moderators of the individual subreddits” and made clear that she was the woman in the videos, the lawsuit notes. After bringing the videos to Reddit’s attention, the lawsuit claims, it would sometimes take “several days” for them to be taken down.

But the videos would be reposted by her ex-boyfriend, “often to the exact same subreddit,” forcing her to go through the flagging process all over again. The suit also claims Reddit knows illegal child pornography is posted on the platform.

The suit seeks to become a class-action lawsuit on behalf of anyone who had photos or videos of themselves posted to Reddit while they were under 18 years old.

It also asserts that Reddit violated SESTA-FOSTA, a highly controversial law that amends Section 230 of the Communications Decency Act, because the social media site ran ads in the subreddits where the videos of her were posted. SESTA-FOSTA removes Section 230 liability protections from websites for sex-trafficking material. The law has been widely criticized for its chilling effect on sex workers’ rights.


In a statement to Gizmodo, Reddit said child sexual abuse material “has no place” on the platform and that it uses “both automated tools and human intelligence” to stop that kind of material from appearing on it.
