Months later, Imgur takes action on child porn

After a critical report by the Daily Dot, the image-hosting site took action to remove racy images of teenage girls.

Kevin Morris

Posted on Feb 13, 2012   Updated on Jun 2, 2021, 9:28 pm CDT

Image-hosting site Imgur has followed in Reddit’s footsteps, deleting hundreds of photographs that many alleged were child pornography.

The move came hours after the Daily Dot published a story critical of the hosting site and its role in the social news site’s ongoing controversy over links to photos of underaged girls.

“We have removed the subsection r/jailbait and all other subsections with images of child pornography,” Imgur staff member Sarah Schaaf wrote in an email to the Daily Dot. “This was a simple decision, as we, of course, care deeply about the safety and protection of minors.”

That deep sense of caring seems to be a relatively recent discovery for Imgur. In October, Imgur founder Alan Schaaf told the Daily Dot that he didn’t care to discuss Reddit’s controversial teen-pics section, r/jailbait. He mentioned the section by name, suggesting he was aware both of it and of the links posted there to images of underage girls in suggestive poses uploaded by Imgur users.

In our story, we pointed out that Reddit, a link-sharing and discussion site, did not host the images in question. That distinction fell to Imgur. We wrote that Imgur “refused” to delete the photos.

Sarah Schaaf wrote that she thought our choice of language was “defamatory.”

“At no point did any employee at Imgur ‘refuse’ to delete the images,” she wrote.

Rather than play word games, as Imgur would have us do, we’ve clarified the language in our original story to reflect Alan Schaaf’s months of knowledge and inaction.

When asked if the site would continue removing jailbait-style images in the future, Sarah Schaaf replied: “Yes, all pornographic images are removed when they are reported.”

She also said that Imgur’s r/jailbait sections were automatically created when Reddit users uploaded images for posting to those subreddits, or Reddit sections, and that they were only manually deleted today, after Imgur became aware the subreddits had been banned.

In fact, r/jailbait was banned last August, a decision that made international news, and again, for good, in October.

Imgur would have us believe, in other words, that Alan Schaaf, who created the site specifically to cater to fellow redditors, was completely unaware of this fact.

As for Sarah Schaaf’s claim that Imgur was unable to take swifter action because it operates on Pacific Time, we’d note that plenty of companies seem able to operate on the weekends. Like Reddit, which made its decision on Sunday. Or, say, Imgur, whose Twitter account was active through the weekend and early Monday morning.

We’ve copied the email from Imgur below.

“We have removed the subsection r/jailbait and all other subsections with images of child pornography. This was a simple decision, as we, of course, care deeply about the safety and protection of minors. We operate on normal business hours PST and are not directly affiliated with Reddit, so the news of these subsections and Reddit’s subsequent banning came to us this early afternoon, at which point all of the subsections were immediately removed.

“I have since read your article ‘The real villain of Reddit’s child-porn scandal: Imgur.’ At no point did any employee at Imgur ‘refuse’ to delete the images. This is a defamatory statement with no basis, and I would ask that you retract it in the name of true and fair journalism.

“Our Reddit-specific subsections are automatically created from the data that we receive from Reddit in order to create a simple gallery where one can easily browse the images. It was not a choice to create or support these subsections specifically, although the choice to remove them was clear, easy and completed as soon as we were alerted to the issue.”

Photo by matrianklw
