
Facebook asks users to flag fake news

Facebook is set on an automated solution.


Jay Hathaway


Facebook has finally admitted it has a problem with fake news—and some responsibility to stop the spread of misinformation and propaganda on its platform. 


To that end, the social media giant is rolling out a new feature that shifts the work of flagging false stories to you, the Facebook user. 


Surveys have started popping up below stories on Facebook, asking users to evaluate how misleading an article’s language is or whether it leaves out “key details.” The feature doesn’t appear to be widespread yet, but The Verge reports it has already been spotted on stories from Rolling Stone, the Philadelphia Inquirer, and the U.K. comedy site Chortle.

https://twitter.com/_tomaf/status/804781844105461760

Asking users to identify false stories fits with Facebook’s general approach to solving its problems through algorithms, not through hands-on human curation. Earlier this year, Facebook axed the team of contractors who curated its trending topics feed, opting for an automated list instead. False stories started appearing in the trending topics feature immediately afterward. 


Now the company seems to be leaning on the wisdom of crowds to solve the fake news problem. Some are already questioning the move, considering those are the same crowds who’ve been duped by the fake stories in the first place. 


John Herrman, veteran social media commentator and David Carr Fellow at the New York Times, tweeted that the new surveys are part and parcel of Facebook’s belief that it’s not a media company but a platform and a self-regulating marketplace:

https://twitter.com/jwherrman/status/805913127334346752


https://twitter.com/jwherrman/status/805914312367894532

This is all happening in the same week that Facebook’s Elliot Schrage said the presidential election made the company realize it has “a role in assessing the validity of content people share.” 

But that role won’t involve Facebook employees personally vetting stories. Instead, Schrage said, Facebook will attempt to change user behavior with a “think-before-you-share” program and better tools for flagging fake news.

Facebook did not immediately respond to our request for comment.


Whether that will be enough, given that the demand for fake or nakedly partisan news seems insatiable, remains to be seen.

 