Why Reddit’s creepy photos of women are still available on Imgur

Leaving r/creepshots’ secretly snapped photos online invites criticism, but taking them down could set a bad precedent.


Lauren Rae Orsini


Popular image-sharing site Imgur has removed a directory featuring surreptitious snapshots of women, but the photos themselves remain online.

The images come from Reddit’s latest controversial group, r/creepshots, which recently gained widespread notoriety after one member was fired from his substitute teaching job for posting upskirt photos of his underage students.

Since that incident, there has been much discussion about whether the creepshots creeps’ activities are legal, particularly when they don’t know whether the women they’re photographing are of age.

Now r/creepshots moderators are telling contributors, “No photos taken on/in/around school settings or of ‘school girls’ unless you can confirm that they are not minors.”

The controversy apparently has not affected Imgur’s stance on hosting r/creepshots images.

On Friday, Imgur told the Daily Dot, “the r/creepshots subsection has been removed.” But the decision to remove the r/creepshots subsection from Imgur did not include deleting the photos themselves. Days later, all hosting for r/creepshots photos is still active, and the subreddit continues to function as before.

Today, an Imgur spokesperson described the company’s previous statement as a “miscommunication” and added the following clarification:

“It was brought to our attention that minors were being sexualized on r/creepshots. Our response has been to close the r/creepshots subsection on Imgur, thereby closing image browsing for that area and disallowing new images to be added.

“We do not block anyone from using Imgur and will not do so in the future.”

When it comes to illegal images uploaded to Imgur—whether of underage schoolgirls or unmistakable child pornography—Imgur won’t take action until those images have been reported.

“Images are associated with the uploader, not by where they have been shared, so we require a removal request and/or a DMCA takedown request for image removal,” Sarah Schaaf, an Imgur employee, told the Daily Dot.

A takedown request under the Digital Millennium Copyright Act covers only violations of copyright law, and an Imgur removal request takes roughly seven days to process. Under Imgur’s policy on reporting images, you must list each individual image that you want removed. Even a request targeting an entire gallery of questionable images must name each photo individually, and each is processed on a case-by-case basis.

Although Imgur has acknowledged that the r/creepshots photos might be problematic by removing its own creepshots directory, proactively taking down images before they’re reported could open the floodgates of legal liability. If Imgur takes responsibility for policing one subreddit for illegal images, it may be stuck policing them all.

However, Imgur has made exceptions in particularly egregious cases. In February, Imgur removed all suggestive photos of minors that had been uploaded to Reddit’s r/jailbait—months after Reddit banned the subreddit itself.

“We do not allow or facilitate the sharing of child pornography,” Schaaf told the Daily Dot at the time.

Photo via Hansol/Flickr

The Daily Dot