Facebook and YouTube’s algorithms push a lot of anti-vaccine content.
A simple Facebook search for “vaccination” shows that many of the top autofill suggestions are for anti-vaccination, vaccine re-education, or the Vaccine Information Network. Groups like the Vaccine Information Network spread false information about the risks of vaccinating children and have thousands of members. The Vaccination Re-education Discussion Forum, a closed group, has over 140,000 members.
The situation on YouTube is similar: a search for the word “vaccine” produces autofill suggestions like “vaccines are toxic,” “vaccination the silent killer,” and “vaccine injury.” As the Guardian reports, even when users watch a video with sound medical information, the platform’s algorithm suggests misinformation as the next videos to watch.
Both companies are trying to deal with misinformation that can have harmful consequences in the real world. YouTube announced last month its plans to reduce recommendations for conspiracy theory and misinformation videos like those “promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” according to the company’s blog. The videos would still be available, just recommended less often.
Facebook told the Guardian that the company is looking at ways to deal with vaccination misinformation on the platform, but said that anti-vaccine misinformation does not violate its community guidelines. Anti-vaccine groups also buy advertising on Facebook, which makes the misinformation even more visible.
YouTube told the Guardian that some anti-vaccine videos would be considered harmful content under the new approach, but did not specify which ones.
The news comes amid a surge of measles outbreaks in Washington state and New York. The World Health Organization has named the anti-vaccination movement one of the top global health threats this year.
H/T The Guardian
Ellen Ioanes is the FOIA reporter at the Daily Dot, where she covers U.S. politics. She is a graduate of Columbia Journalism School, and her work has appeared in the Guardian, the Center for Public Integrity, HuffPost India, and more.