A simple Facebook search for “vaccination” shows that many of the top autofill suggestions point to anti-vaccination content, including groups like “Vaccine Re-education” and the Vaccine Information Network. Groups like the Vaccine Information Network spread false information about the risks of vaccinating children and have thousands of members. The Vaccination Re-education Discussion Forum, a closed group, has over 140,000 members.
YouTube shows a similar pattern: a search for the word “vaccine” returns autofill suggestions like “vaccines are toxic,” “vaccination the silent killer,” and “vaccine injury.” As the Guardian reports, even when users watch a video with sound medical information, the platform’s recommendation algorithm suggests misinformation for the next videos.
Both companies are trying to deal with misinformation that can have harmful consequences in the real world. YouTube announced last month its plans to reduce recommendations for conspiracy theory and misinformation videos like those “promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” according to the company’s blog. The videos would still be available, just recommended less often.
Facebook told the Guardian that the company is looking at ways to deal with vaccination misinformation on the platform, but said that anti-vaccine misinformation does not violate its community guidelines. And because anti-vaccine groups buy advertising on Facebook, the misinformation gains even more visibility.
YouTube told the Guardian that some anti-vaccine videos would be considered harmful content under the new approach, but did not specify which ones.
The news comes amidst a surge of measles outbreaks in Washington state and New York. The World Health Organization has named the anti-vaccination movement one of the top global health threats this year.
H/T The Guardian