Did the algorithm that shapes YouTube’s recommended videos give Donald Trump the extra push to become president? According to a former Google engineer and his software that intensely studied the issue, it’s entirely plausible.
Guillaume Chaslot—who worked for months with YouTube engineers on the system that recommends videos that immediately follow the content a user is currently watching—said the algorithm is skewed to make watchers spend more time on the site. No matter what.
“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” Chaslot told the Guardian. “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy … Watch time was the priority. Everything else was considered a distraction.”
YouTube countered that narrative to the Guardian, saying that the company no longer factors in only watch time. Instead, Google said that in 2016 it began giving more weight to a user's "satisfaction" and looking at how many "likes" a video has earned. It also said it has tried to halt videos that contain "inflammatory religious or supremacist" content.
Google fired Chaslot in 2013, reportedly over performance issues, but he said he had upset the company because he wanted to change how the algorithm worked: to halt fake news videos and to make a person's "up next" videos more diverse. After his stint working at YouTube, Chaslot wrote software that mimics a YouTube user who watches one video and then continuously follows the recommended videos that pop up when the previous video has finished. From the Guardian:
Over the last 18 months, Chaslot has used the program to explore bias in YouTube content promoted during the French, British and German elections, global warming and mass shootings, and published his findings on his website, Algotransparency.com. Each study finds something different, but the research suggests YouTube systematically amplifies videos that are divisive, sensational and conspiratorial.
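The crawler the Guardian describes can be understood as a repeated walk through YouTube's "up next" chain, tallying which videos keep resurfacing. Below is a minimal Python sketch of that idea against a toy recommendation graph; the graph, function names, and parameters are illustrative assumptions, not Chaslot's actual code or data.

```python
# Hypothetical sketch of a recommendation-chain crawler: start from a
# seed video, repeatedly follow one of the "up next" recommendations,
# and tally how often each video is reached across many sessions.
import random
from collections import Counter

# Toy "up next" graph: video ID -> list of recommended next videos.
# (Assumed structure for illustration; real data would come from scraping.)
RECOMMENDATIONS = {
    "seed": ["a", "b"],
    "a": ["c", "b"],
    "b": ["c", "a"],
    "c": ["c", "a"],  # "c" keeps recommending itself
}

def crawl(seed, steps, rng):
    """Follow recommendations from `seed`; return the visited path."""
    path = [seed]
    current = seed
    for _ in range(steps):
        current = rng.choice(RECOMMENDATIONS[current])
        path.append(current)
    return path

def amplification_counts(seed, runs=1000, steps=10):
    """Tally how often each video appears across simulated sessions."""
    rng = random.Random(0)  # fixed seed for reproducibility
    counts = Counter()
    for _ in range(runs):
        counts.update(crawl(seed, steps, rng))
    return counts

counts = amplification_counts("seed")
# Videos the graph structurally favors bubble to the top of the tally.
print(counts.most_common(3))
```

The point of the sketch is that amplification falls out of the graph's structure: a video that many chains funnel into ("c" here) dominates the tally regardless of where the walk started, which mirrors Chaslot's finding that both Trump and Clinton searches drifted the same direction.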
Chaslot also discovered that, during the 2016 election, YouTube amplified anti-Hillary Clinton videos and pro-Trump messages.
“It was strange,” Chaslot said. “Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction.”
Chaslot said there were dozens of videos featuring anti-Clinton conspiracy theories—everything from a Clinton mental breakdown to her having Parkinson's disease, from secret sexual affairs to accusations that Bill Clinton had raped a 13-year-old.
According to the Guardian: "There were too many videos in the database for us to watch them all, so we focused on 1,000 of the top-recommended videos. We sifted through them one by one to determine whether the content was likely to have benefited Trump or Clinton. Just over a third of the videos were either unrelated to the election or contained content that was broadly neutral or even-handed. Of the remaining 643 videos, 551 were videos favoring Trump, while only 92 favored the Clinton campaign."
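The Guardian's tally implies a lopsided ratio that's worth making explicit. A few lines of arithmetic, using only the numbers quoted above:

```python
# The Guardian's hand-coded tally of partisan videos (figures from the article).
partisan = 643
pro_trump = 551
pro_clinton = 92

assert pro_trump + pro_clinton == partisan  # the counts are internally consistent
print(f"pro-Trump share of partisan videos: {pro_trump / partisan:.1%}")  # ~85.7%
print(f"pro-Trump to pro-Clinton ratio: {pro_trump / pro_clinton:.1f} to 1")  # ~6.0 to 1
```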
Said YouTube: “We have a great deal of respect for the Guardian as a news outlet and institution. We strongly disagree, however, with the methodology, data and, most importantly, the conclusions made in their research … Our search and recommendation systems reflect what people search for, the number of videos available, and the videos people choose to watch on YouTube. That’s not a bias towards any particular candidate; that is a reflection of viewer interest.”
In Chaslot's database, those election videos were watched more than 3 billion times before the 2016 election. Combine that with Twitter's admission that 1.4 million users engaged with Russian propaganda and Facebook's continued battle with fake news, and it's apparent that Trump benefited from tainted algorithms.
Read the Guardian’s entire report here.
Josh Katzowitz is the Weekend Editor for the Daily Dot and covers the world of YouTube. His work has appeared in the New York Times, Wall Street Journal, Washington Post, and Los Angeles Times. He’s also a longtime sports writer, covering the NFL for CBSSports.com and boxing for Forbes. His work has been noted twice in the Best American Sports Writing book series.