Why Reddit’s creepy photos of women are still available on Imgur
Leaving r/creepshots’ secretly-snapped photos online invites criticism, but taking them down could set a bad precedent.
Popular image-sharing site Imgur has removed a directory featuring surreptitious snapshots of women, but the photos themselves remain online.
The images come from Reddit’s latest controversial group, r/creepshots, which recently gained widespread notoriety after one member was fired from his substitute teaching job for posting upskirt photos of his underage students.
Since that incident, there has been much discussion about whether the creepshots creeps’ activities are legal, particularly when they don’t know whether the women they’re photographing are of age.
Now r/creepshots moderators are telling contributors, “No photos taken on/in/around school settings or of ‘school girls’ unless you can confirm that they are not minors.”
The controversy apparently has not affected Imgur’s stance on hosting r/creepshots images.
On Friday, Imgur told the Daily Dot, “the r/creepshots subsection has been removed.” But the decision to remove the r/creepshots subsection from Imgur did not include deleting the photos themselves. Days later, all hosting for r/creepshots photos is still active, and the subreddit continues to function as before.
Today, an Imgur spokesperson described the company’s previous statement as a “miscommunication” and added the following clarification:
“It was brought to our attention that minors were being sexualized on r/creepshots. Our response has been to close the r/creepshots subsection on Imgur, thereby closing image browsing for that area and disallowing new images to be added.
“We do not block anyone from using Imgur and will not do so in the future.”
When it comes to illegal images uploaded to Imgur—whether they be of underage school girls or unmistakable child porn—Imgur won’t take action until those images have been reported.
“Images are associated with the uploader, not by where they have been shared, so we require a removal request and/or a DMCA takedown request for image removal,” Sarah Schaaf, an Imgur employee, told the Daily Dot.
A takedown request under the Digital Millennium Copyright Act covers only violations of copyright law, and an Imgur removal request takes roughly seven days to process. Under Imgur’s policy on reporting images, each image must be listed individually, even when the request concerns an entire gallery, and each photo is processed on a case-by-case basis.
Although Imgur has acknowledged that the r/creepshots photos might be problematic by removing its own creepshots directory, proactively taking down images before they’re reported could open the floodgates of legal liability. If Imgur takes responsibility for policing one subreddit for illegal images, it may be stuck policing them all.
However, Imgur has made exceptions in particularly egregious cases. In February, Imgur removed all suggestive photos of minors that had been uploaded to Reddit’s r/jailbait—months after Reddit banned the subreddit itself.
“We do not allow or facilitate the sharing of child pornography,” Schaaf told the Daily Dot at the time.
Photo via Hansol/Flickr
Lauren Rae Orsini is a web culture reporter who specializes in anime and the business of fandom. Her work has been published by Forbes and Business Insider.