Facebook is set on an automated solution.
To that end, the social media giant is rolling out a new feature that shifts the work of flagging false stories to you, the Facebook user.
Surveys have started popping up below stories on Facebook, asking users to evaluate how misleading an article’s language is or whether it leaves out “key details.” The feature doesn’t seem to be widespread yet, but The Verge reports it’s already been spotted on stories from Rolling Stone, the Philadelphia Inquirer, and a U.K. comedy site, Chortle.
Asking users to identify false stories fits with Facebook’s general approach to solving its problems through algorithms, not through hands-on human curation. Earlier this year, Facebook axed the team of contractors who curated its trending topics feed, opting for an automated list instead. False stories started appearing in the trending topics feature immediately afterward.
Now the company seems to be leaning on the wisdom of crowds to solve the fake news problem. Some are already questioning the move, given that these are the same crowds who were duped by the fake stories in the first place.
John Herrman, veteran social media commentator and David Carr Fellow at the New York Times, tweeted that the new surveys are part and parcel of Facebook’s belief that it’s not a media company but a platform, a self-regulating marketplace:
> in a platform purist’s view, social media is just a high-frequency marketplace. its problems are just supercharged ebay problems
>
> — John Herrman (@jwherrman) December 5, 2016
This is all happening in the same week that Facebook’s Elliot Schrage said the presidential election made the company realize it has “a role in assessing the validity of content people share.”
But not a role where Facebook employees take personal responsibility for vetting stories. Rather, Schrage said, Facebook will attempt to change user behavior with a “think-before-you-share” program and better tools to flag fake news.
Facebook did not immediately respond to our request for comment.
Whether that will be enough, at a time when the demand for fake or nakedly partisan news seems insatiable, remains to be seen.
Jay Hathaway is a former senior writer who specialized in internet memes and weird online culture. He previously served as the Daily Dot’s news editor, was a staff writer at Gawker, and edited the classic websites Urlesque and Download Squad. His work has also appeared on nymag.com, suicidegirls.com, and the Morning News.