Illustration by Max Fleishman

Why Facebook should kill Trending News once and for all

Facebook can't replace editorial curation with algorithms—and it should stop trying.


Gillian Branstetter

Tech

Posted on Sep 1, 2016   Updated on May 26, 2021, 3:12 am CDT

It’s generally not a great sign of competency to cover up an embarrassing mistake with an even more embarrassing one. But just a few months after critics accused Facebook’s Trending News editorial team of bias, the social network cut its editorial staff and let its algorithm run wild, with questionable results. Over the last week, the Trending section has lifted conspiracy theories about Megyn Kelly to the front of the site and let them sit for two hours, showered attention on a video of a man sodomizing a chicken sandwich, and pushed a cute dog video as an advertisement for the upcoming video game Watch Dogs 2.

A sample trending topic from this week

Facebook

That would all be silly and in good fun if Facebook did not have so much control over where and how people get their news. A misleading or outright false article reaching the front of Trending effectively gives it more publicity than if it were published on CNN, the New York Times, or Fox News. Whether Trending is curated by a professional editorial staff or left to the cold, logical hands of an algorithm, Facebook has proven itself unwilling and unqualified to meet its responsibility to the average news consumer with any sense of editorial authority.

That’s why Facebook should abandon the Trending section altogether.

When Facebook first introduced Trending, it might have been easy to assume it was managed through sheer mathematical rigor and analysis of Facebook users, much like Twitter’s list of trending topics upon which Trending is largely based. But as a Gizmodo report from last May makes clear, Trending was largely managed by a team of very human editors and reporters selecting links from those most frequently shared across the platform.

As the report notes, however, the disdain Facebook held for media types came across in the treatment of these curators, who described “grueling work conditions, humiliating treatment, and a secretive, imperious culture in which they were treated as disposable outsiders.” These same editors probably didn’t help themselves when they revealed they had routinely acted against conservative news sources, but Facebook nonetheless fired them last week in a bid to cleanse itself of the issue.

That contempt for editors is proving ironic, however, because it’s exactly the steady, controlled hand of a good editor that Trending needs most. Although the company claims humans are still very much a part of the process in choosing which stories get elevated to the list, the promotion of the Megyn Kelly story, which claimed the Fox News anchor is a double agent for the Clinton campaign, suggests those workers are either asleep at the switch or denied the level of control they would need to be useful.

Facebook’s Trending is mostly an attempt to show you what’s happening in your world—not the world. 

All of which leads you to question Facebook’s commitment to managing Trending at all. It doesn’t seem like a particularly difficult mission: selecting the most popular stories from credible sources and listing them daily. Google News does it every day and manages to avoid hosting nutjob conspiracy theories. Facebook’s difficulty in doing so highlights either its complete lack of journalistic ethics or its indifference toward managing Trending with the seriousness it deserves.

Even if Facebook wanted out of managing a team of editors, it should certainly put less faith in whatever secretive algorithm is managing the list now. It’s a common saying in the study of artificial intelligence that it’s easier to design software that can run a restaurant than a robot that can bus a table. Jobs that require relatively little mental processing can be difficult for a machine lacking the coordination and physical memory of a human. Although editing is not quite a physical task, it still requires a level of complex understanding and contextual awareness that machines have rarely demonstrated.

By now, this is an old debate in media circles: the struggle between computer aggregation and human curation. But its very public occurrence within Facebook is given extra weight by the site’s near-complete control over the news industry. According to a Pew poll published earlier this year, a third of Americans get their news from Facebook, and half of American millennials called it “the most important” way they get their news. It’s a level of control many in the media are uncomfortable with, given the company’s immense wealth and their own dependency upon it for page views, a dependency that now surpasses their reliance on Google.

That’s arguably fine if Facebook is just a tool of promotion used by the media, a megaphone for websites to attract the attention of its 1.7 billion users. But when Facebook exerts more control over what kind of content it chooses to promote to that massive audience, that relationship can grow shaky. See, for example, the reactions of media companies to Facebook’s announcement that it would reengineer the algorithm managing users’ News Feeds to focus less on news and articles and more on posts from family and friends.

The potential impact of such a change, however innocuous it may seem to a layperson, reflects Facebook’s ability to shape how people consume their news. As the Verge noted, the mishap with the Megyn Kelly “story” is serious not simply because the story is false but also because Facebook promoted the propaganda of a hyper-partisan website in the middle of one of the most contentious and fact-free elections in modern memory. Americans are already fatigued from sorting through the false claims put out by candidates and their surrogates; they shouldn’t also have to wonder whether such lunacy has been legitimized by a social network.

If Facebook doesn’t want that responsibility, then it shouldn’t manage news aggregation at all, whether through humans or algorithms. It’s failed the competency test, and you have to wonder what the purpose of Trending is supposed to be. Twitter’s own trending topics list adds to the communal feel of the site. Because topics of debate and conversation move so fast on Twitter, it’s helpful to have an organized guide to keep new users up to speed (something Twitter officially recognized with the introduction of Moments).

Facebook, on the other hand, is a far more insular experience than Twitter. The use case for the site is not following the news so much as following your friends and family. It’s certainly a place of (often obnoxious) political debate, but it’s mostly an attempt to show you what’s happening in your world—not the world. Its refocus of the News Feed shows how much Facebook understands that core aspect of its users’ experience. But further attempts to justify the existence of Trending seem likely to be as misguided and humiliating as they’ve been thus far.
