Illustration by Max Fleishman (Licensed)
If this headline appeared as a Facebook post, Facebook might not show it to you. The company is doubling down once again on punishing clickbait—the kind of content that over-promises and under-delivers, or comes at you with headlines that require you to click to know the full story.
Facebook says its users want to see fewer clickbait headlines. Facebook defines these as headlines that "intentionally leave out crucial information, or mislead people, forcing people to click to find out the answer."
Facebook was once the breeding ground for clickbait. Sites like Upworthy built massive followings and grew exponentially because of the way they teased readers with clicky headlines like "Bullies Called Him Pork Chop. He Took That Pain With Him And Then Cooked It Into This."
The early days of clickbait capitalized on the curiosity gap, our insatiable desire to pursue knowledge we don't already have. The clickbait theory is backed by science—the reward center in our brains is activated when we read and click on clickbait. Our desire to satiate curiosity is similar to our need to satisfy hunger or sexual desire.
And yet, people hate it. Or they think they hate it. Because clickbait is often not as satisfying as what it promises, and we feel guilty after clicking on a story we know will probably not bring much value to our lives. Once you finish a story, you might think to yourself, "No, this isn't all that shocking."
Facebook began cracking down on clickbait, the very same content it nurtured and grew, two years ago. Reaction was swift. Publishers felt the burn of Facebook's turnaround as traffic fell on sites that relied on the curiosity gap to drive an audience. Since then, multiple updates have continued to push publishers toward more straightforward news and information.
The actual definition of clickbait is somewhat murky, because it's impossible to contain all relevant information in a headline, so you have to click on something to fully understand a story. And while some clickbait genuinely under-delivers, often "clickbait" can be synonymous with "I didn't like this article; therefore, it was clickbait."
Facebook declines to call itself a media company, but so much of what the company does directly impacts the information people consume online. According to Pew Research, 63 percent of users get news on the site. Any tweaks to Facebook's algorithm directly impact what people see in their news feeds, including news from Pages or links shared by friends.
Recently, Facebook came under fire for its Trending section, and critics claimed Facebook was unfairly (and potentially prejudicially) promoting certain stories for more people to see. The relationship between Facebook and publishers is imbalanced—publishers rely on the massive audience on Facebook, while Facebook puts its own desires first, and, at the touch of an algorithm, could send traffic plummeting.
To combat clickbait, Facebook built something of a spam filter for your news feed. Facebook explains the system "looks at the set of clickbait headlines to determine what phrases are commonly used in clickbait headlines that are not used in other headlines." Facebook knows which posts are clickbait and where they're coming from, so those posts will be punished by showing up less often in your feed. Facebook says that if a Page stops posting clickbait, its content will stop being pushed down.
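Facebook hasn't published the details of its system, but the description above—finding phrases common in clickbait headlines that rarely appear in other headlines—can be sketched in a few lines of Python. This is an illustrative toy, not Facebook's implementation; the example headlines and the two-word phrase choice are assumptions made up for the demo.

```python
# Toy sketch of phrase-based clickbait detection (NOT Facebook's actual system):
# find two-word phrases that recur in known clickbait headlines but never
# appear in ordinary headlines, then flag new headlines containing them.
from collections import Counter

# Hypothetical labeled examples standing in for a real training set.
CLICKBAIT = [
    "you won't believe what happened next",
    "what happened next will shock you",
    "this one trick will change your life",
]
NORMAL = [
    "city council approves new budget",
    "local team wins championship game",
    "storm expected to hit coast tuesday",
]

def bigrams(headline):
    """Split a headline into consecutive two-word phrases."""
    words = headline.lower().split()
    return [" ".join(pair) for pair in zip(words, words[1:])]

def clickbait_phrases(clickbait, normal):
    """Phrases seen repeatedly in clickbait but never in normal headlines."""
    bait_counts = Counter(p for h in clickbait for p in bigrams(h))
    normal_phrases = {p for h in normal for p in bigrams(h)}
    return {p for p, n in bait_counts.items() if n >= 2 and p not in normal_phrases}

PHRASES = clickbait_phrases(CLICKBAIT, NORMAL)

def looks_like_clickbait(headline):
    """Flag a headline if it contains any known clickbait phrase."""
    return any(p in PHRASES for p in bigrams(headline))
```

With this tiny corpus, phrases like "what happened" and "happened next" get flagged, so a headline such as "guess what happened next" would be caught while "mayor announces road repairs" would pass. A production system would use far more data and a statistical classifier rather than a hard cutoff, but the underlying idea—scoring headlines by their telltale phrases—is the same.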
With the stepped-up clickbait crackdown, Facebook also said Pages that post clickbait should expect their distribution to decrease. Is anyone actually shocked?