
Illustration by Jason Reed

Did YouTube’s recommended videos swing the election for Trump?

A former Google engineer weighs in.


Josh Katzowitz

Layer 8

Published Feb 2, 2018   Updated May 22, 2021, 2:16 am CDT

Did the algorithm that shapes YouTube’s recommended videos give Donald Trump the extra push to become president? According to a former Google engineer, and the software he built to study the question, it’s entirely plausible.

Guillaume Chaslot—who worked for months with YouTube engineers on the system that recommends videos that immediately follow the content a user is currently watching—said the algorithm is skewed to make watchers spend more time on the site. No matter what.

“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” Chaslot told the Guardian. “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy … Watch time was the priority. Everything else was considered a distraction.”

YouTube countered that narrative to the Guardian, saying that the company no longer factors in only watch time. Instead, Google said it began in 2016 giving more weight to a user’s “satisfaction” and looking at how many “likes” a video has earned. It also said it’s tried to halt videos that contain “inflammatory religious or supremacist” content.

Google fired Chaslot in 2013, apparently over performance issues, but he said he had upset the company by pushing to change how the algorithm worked: to halt fake news videos and to make a person’s “up next” videos more diverse. After his stint working at YouTube, Chaslot wrote software that mimics a YouTube user who watches one video and then continuously follows the recommended videos that pop up when the previous video has finished. From the Guardian:
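The behavior attributed to Chaslot’s software—watch one video, then keep following the recommendations—can be sketched roughly as follows. This is a minimal simulation, not his actual code: the recommendation graph here is a hypothetical stand-in, since YouTube’s real recommendations come from a private, ever-changing model.

```python
# Toy recommendation graph standing in for YouTube's "up next" list.
# Keys are video IDs; values are the videos recommended after them.
# (Hypothetical data for illustration only.)
RECOMMENDATIONS = {
    "seed": ["a", "b"],
    "a": ["c", "b"],
    "b": ["c"],
    "c": ["a"],
}

def follow_recommendations(seed, steps, pick=lambda recs: recs[0]):
    """Mimic a user who watches one video, then repeatedly follows a
    recommended video -- the pattern Chaslot's software automates."""
    path = [seed]
    current = seed
    for _ in range(steps):
        recs = RECOMMENDATIONS.get(current, [])
        if not recs:  # dead end: no further recommendations
            break
        current = pick(recs)  # choose which recommendation to "click"
        path.append(current)
    return path

print(follow_recommendations("seed", 4))  # ['seed', 'a', 'c', 'a', 'c']
```

Recording which videos such a walk lands on, across many seed searches, is what lets the resulting database be tallied for bias, as the Guardian did below.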

Over the last 18 months, Chaslot has used the program to explore bias in the YouTube content promoted around the French, British, and German elections, global warming, and mass shootings, and has published his findings on his website. Each study finds something different, but the research suggests YouTube systematically amplifies videos that are divisive, sensational, and conspiratorial.

Chaslot also discovered that, during the 2016 election, YouTube amplified anti-Hillary Clinton videos and pro-Trump messages.

“It was strange,” Chaslot said. “Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction.”

Chaslot said there were dozens of videos featuring anti-Clinton conspiracy theories—everything from claims that Clinton had suffered a mental breakdown or had Parkinson’s disease, to secret sexual affairs, to accusations that Bill Clinton had raped a 13-year-old.

According to the Guardian, “There were too many videos in the database for us to watch them all, so we focused on 1,000 of the top-recommended videos. We sifted through them one by one to determine whether the content was likely to have benefited Trump or Clinton. Just over a third of the videos were either unrelated to the election or contained content that was broadly neutral or even-handed. Of the remaining 643 videos, 551 were videos favoring Trump, while only 92 favored the Clinton campaign.”

Said YouTube: “We have a great deal of respect for the Guardian as a news outlet and institution. We strongly disagree, however, with the methodology, data and, most importantly, the conclusions made in their research … Our search and recommendation systems reflect what people search for, the number of videos available, and the videos people choose to watch on YouTube. That’s not a bias towards any particular candidate; that is a reflection of viewer interest.”

In Chaslot’s database, those election videos were watched more than 3 billion times before the 2016 election. Combine that with Twitter’s admission that 1.4 million users engaged with Russian propaganda and Facebook’s continued battle with fake news, and it’s apparent that Trump benefited from tainted algorithms.

Read the Guardian’s entire report here.

*First Published: Feb 2, 2018, 1:17 pm CST