To that end, the social media giant is rolling out a new feature that shifts the work of flagging false stories to you, the Facebook user.
Surveys have started popping up below stories on Facebook, asking users to evaluate how misleading an article’s language is or whether it leaves out “key details.” The feature doesn’t appear to be widespread yet, but The Verge reports it has already been spotted on stories from Rolling Stone, the Philadelphia Inquirer, and a U.K. comedy site, Chortle.
Asking users to identify false stories fits with Facebook’s general approach of solving its problems through algorithms rather than hands-on human curation. Earlier this year, Facebook axed the team of contractors who curated its trending topics feed, opting for an automated list instead. False stories began appearing in the trending topics feature almost immediately afterward.
Now the company seems to be leaning on the wisdom of crowds to solve the fake news problem. Some are already questioning the move, given that these are the same crowds who were duped by the fake stories in the first place.
John Herrman, veteran social media commentator and David Carr Fellow at the New York Times, tweeted that the new surveys are part and parcel of Facebook’s belief that it’s not a media company but a platform and a self-regulating marketplace:
in a platform purist’s view, social media is just a high-frequency marketplace. its problems are just supercharged ebay problems
— John Herrman (@jwherrman) December 5, 2016
This is all happening in the same week that Facebook’s Elliot Schrage said the presidential election made the company realize it has “a role in assessing the validity of content people share.”
But it won’t be a role in which Facebook employees take personal responsibility for vetting stories. Rather, Schrage said, Facebook will attempt to change user behavior with a “think-before-you-share” program and better tools for flagging fake news.
Facebook did not immediately respond to our request for comment.
Whether that will be enough, when the demand for fake or nakedly partisan news is seemingly insatiable, remains to be seen.
Jay Hathaway is a former senior writer who specialized in internet memes and weird online culture. He previously served as the Daily Dot’s news editor, was a staff writer at Gawker, and edited the classic websites Urlesque and Download Squad. His work has also appeared on nymag.com, suicidegirls.com, and the Morning News.