Horse with YouTube logo on forehead

smerikal/Flickr (CC-BY-SA) Remix by Jason Reed

YouTube’s content moderation system missed graphic images of bestiality

The images are clickbait, but they've lived on YouTube for months.


Audra Schroeder


Published Apr 25, 2018   Updated May 21, 2021, 5:32 pm CDT

You can add bestiality to the growing list of issues YouTube is facing.


A new report from BuzzFeed found that YouTube hosted graphic thumbnail images depicting bestiality until this week, and that these images were fairly easy to find. A search for “girl and her horse” brings up more than 12 million results. Though many of the images BuzzFeed cited no longer show up (YouTube reportedly started deleting them after being contacted by BuzzFeed on Monday), clicking on one of the top results, “Wow! Fantastic! A Beautiful Girl and Her Horse and the Horse Park,” still surfaces questionable “Up next” videos involving women with horses and dogs. Many of those videos have millions of views, though they do not appear to feature actual bestiality.


BuzzFeed points out that one account, SC Daily, features graphic thumbnail images of bestiality alongside content aimed at kids. Disturbing children’s content is one major issue YouTube has been trying to address in the last year. Advocacy groups recently accused the platform of collecting data on viewers under 13.

A senior YouTube employee told BuzzFeed that many of the videos resemble clips from a Cambodian content farm that was removed from YouTube in late 2017. The employee said the accounts were probably using graphic thumbnail images to get more clicks, in hopes of later monetizing the videos. However, YouTube’s thumbnail monitoring system apparently didn’t catch these images, since they don’t have the same components as pornography. People gaming YouTube with misleading clickbait titles and thumbnails is another issue facing the platform: YouTuber touchdalight recently faced backlash for using rape and incest as clickbait.

In a recent New York Times article, published the same day as BuzzFeed’s report, YouTube claimed it removed more than 8 million videos in 2017, and that the majority of those videos were flagged by its AI. A spokesperson told BuzzFeed: “We’re working quickly to do more than ever to tackle abuse on our platform, and that includes developing better tools for detecting inappropriate and misleading metadata and thumbnails so we can take fast action against them.”


H/T BuzzFeed 

First Published: Apr 25, 2018, 10:55 am CDT