Reddit knowingly left up child sexual abuse content, lawsuit alleges

The suit seeks class-action status.

Andrew Wyrich

The Reddit app seen in the corner of a mobile phone screen.

A woman has filed a lawsuit against Reddit alleging that the social media site did not do enough to stop her ex-boyfriend from repeatedly posting pornographic material of her taken when she was 16 years old.


The lawsuit, which was filed in the United States District Court for the Central District of California, alleges that Reddit has taken “virtually no action” to remove child pornography.

In the suit, “Jane Doe” claims that she repeatedly asked Reddit to take down “illegal videos and images depicting Plaintiff in a sex act as a minor.” The videos, the suit claims, had been uploaded to Reddit by an abusive ex-boyfriend.

When she became aware of the videos on Reddit, she “immediately reported them to the moderators of the individual subreddits” and made clear that she was the woman in the videos, the lawsuit notes. Even after she brought the videos to Reddit’s attention, the lawsuit claims, it would sometimes take “several days” for them to be taken down.


However, the videos would be reposted by her ex-boyfriend, “often to the exact same subreddit,” forcing her to go through the reporting process all over again. The suit also claims Reddit knows illegal child pornography is posted on the platform.

The suit seeks class-action status on behalf of anyone who has had photos or videos of them posted on Reddit while they were under 18 years old.

It also asserts that Reddit violated SESTA-FOSTA, a highly controversial law that amends Section 230 of the Communications Decency Act, because the social media site ran ads in the subreddits where the videos of her were posted. SESTA-FOSTA removes Section 230 liability protections from websites with regard to sex-trafficking material. The law has been widely criticized for its chilling effect on sex workers’ rights.

In a statement to Gizmodo, Reddit said child sexual abuse material “has no place” on the platform and that it uses “both automated tools and human intelligence” to stop that kind of material from appearing on it.

