
YouTube’s algorithm is reportedly helping pedophiles find home videos of kids

YouTube is changing its livestreaming policy in response.


Mikael Thalen


YouTube’s algorithm is reportedly facilitating pedophiles by recommending videos of young children to them. YouTube announced Monday that it’s changing its livestreaming policy amid news of the reports.

Someone who watches sexually themed content on the platform could slowly be presented with videos of younger and younger women before eventually landing on videos of children, according to a team of researchers at Harvard University’s Berkman Klein Center for Internet and Society.

A Brazilian woman was informed that an innocent home video of her 10-year-old daughter and a friend playing in a backyard pool suddenly garnered more than 400,000 views after being recommended to those who had watched “videos of prepubescent, partially clothed children,” according to one of the researchers’ examples.

“It’s YouTube’s algorithm that connects these channels,” research co-author Jonas Kaiser wrote. “That’s the scary thing.”

After being alerted to the issue, the company banned minors aged 13 and younger from livestreaming without an adult present.

YouTube also removed several videos and appeared to alter its algorithm, according to the New York Times.

A YouTube spokesperson directed the Daily Dot to its blog post addressing the discovery. The post details the company’s new livestreaming policy and other work to “protect minors and families.”

In March, YouTube disabled comments on videos of children after it was learned that pedophiles were sharing time stamps pointing to sections of videos that showed nudity.

“The vast majority of videos featuring minors on YouTube, including those referenced in recent news reports, do not violate our policies and are innocently posted—a family creator providing educational tips, or a parent sharing a proud moment,” the company wrote.

The company also reduced recommendations of “borderline content,” including videos featuring minors in risky situations.

The researchers argue that YouTube refuses to disable its recommendation system altogether on videos of children because doing so would hurt its independent creators.

The issue is not the first related to minors for the website.

The discovery of a “softcore pedophile ring” on the platform by a YouTuber in February led Disney to stop purchasing ads on the site.

A spokesperson told the Daily Dot at the time that the company had hired child safety experts dedicated to catching those who wish to harm children. The company also said it terminates thousands of underage accounts each week to help combat the problem.


H/T Jezebel