Facebook’s new algorithm will help pick the next president

It’s Zuck’s world, we’re just voting in it.

Aaron Sankin

Facebook, CEO Mark Zuckerberg would have you believe, is like a chair. It’s simply a platform for people to park their butts for a while, reconnect with old friends, get into impassioned arguments with loose acquaintances, and watch watermelons explode.

Chairs don’t have political opinions. They don’t favor one presidential candidate over another. They’re fundamentally neutral spaces upon which human beings can spout off about any topic they wish.

That’s why, when former Facebook contractors accused the social network of suppressing conservative media in its trending topics, Zuckerberg personally met with a cadre of right-leaning media figures in an attempt to assuage any fears about bias. Facebook officials steadfastly denied any accusations of intentionally slanting the news leftward—but still, perception matters. Political conservatives make up about half of the United States population; if they stop trusting the platform’s neutrality and consequently stop sharing “Hillary for Prison” memes, it could be disastrous for the company’s bottom line.

“We’ve built Facebook to be a platform for all ideas,” Zuckerberg wrote in a Facebook post immediately following the summit. “Our community’s success depends on everyone feeling comfortable sharing anything they want. It doesn’t make sense for our mission or our business to suppress political content or prevent anyone from seeing what matters most to them.”

Whether Facebook employees, who reportedly asked Zuckerberg if they should use the power of their platform to stop presumptive Republican nominee Donald Trump from becoming president, are actually baking #ImWithHer right into the site’s DNA is difficult to know for sure. For now, the possibility of employees tweaking the platform to favor one candidate over another remains theoretical, since proving that type of bias from the outside would be nearly impossible.

What is measurable is the recently announced (but likely long in the works) change to the algorithm Facebook uses to determine what pops up in users’ News Feeds, and it could make all the difference this November.

Last week, the company said it would prioritize content shared by users’ friends over articles posted directly to Facebook by news sites and other publishers. As the New York Times notes, the change will result in “significantly less traffic to the hundreds of news media sites that have come to rely on Facebook.”

Research conducted by Facebook itself shows that the frequency with which its users see news articles has a significant effect on their likelihood to vote. This change, made out of a desire to counter a dip in sharing by Facebook users, will affect turnout in elections across the globe and, therefore, hand victories to some candidates and losses to others. If the fight between Trump and his Democratic rival, former Secretary of State Hillary Clinton, comes down to the wire, business decisions made by Facebook have the potential to decide the outcome. Not because Facebook’s brass, let’s say, differs with Trump on immigration, but because decisions made with an eye toward quarterly revenue goals have the inadvertent effect of encouraging certain demographic groups, which tend to favor a certain candidate, to vote without encouraging the other side to the same degree.

In 2012, Facebook conducted an experiment that involved treating news stories shared by friends the same way it treats announcements of a friend’s firstborn. Facebook has a feature whereby, if its algorithm determines that a post concerns a major life event, such as a wedding engagement, a new job, or a birth, that post instantly shoots to the top of friends’ feeds. As Facebook executive Adam Mosseri wrote in a blog post detailing the recent change, “family and friends come first.”

The stories shown to users in the 2012 experiment were ones they could have seen anyway, since the site has always given preference to stories recommended by friends; what was different was their prominence. Facebook’s researchers wanted to see if exposing a user to more hard news increased that user’s proclivity to vote.

After the election, Facebook sent a survey to the affected users, and about 12,000 responded. The respondents reported an increased likelihood of having voted in November. The spike in turnout was greater for infrequent Facebook users than for people who used the site every day.

Unlike a 2010 experiment showing that its deployment of an “I Voted” button increased turnout, Facebook never published the results of its 2012 inquiry in a scientific journal. However, the results were detailed in a presentation by company data scientist Lada Adamic that was posted to YouTube. When Personal Democracy Media co-founder Micah Sifry, who detailed his investigations into Facebook’s effect on voting in a 2014 article for Mother Jones, contacted Facebook about the study, the company took down the clip. Luckily, Sifry had made his own recording of the video, which he posted to YouTube.

If Facebook ends up showing fewer news articles to its users, it could decrease turnout, but it wouldn’t necessarily swing the election in one direction or the other. However, Facebook’s user base of millions of Americans isn’t exactly representative of the U.S. population as a whole.

According to a 2015 study of American internet users conducted by the Pew Research Center, 77 percent of women used the social network compared to 66 percent of men. Facebook use was also strongly correlated with age: younger internet users (age 18-29) were nearly twice as likely to use Facebook as the oldest age group (65+).

Women tend to skew toward the Democratic Party, especially after the GOP nominated Trump, who has a 70 percent disapproval rating among female voters. Younger people have a similar tendency to favor Democratic candidates. A recent survey by the Case Foundation found slightly more millennials self-identified as conservative than liberal; however, even among conservative millennials, Clinton was outperforming Trump by 14 points.

Put simply: When Facebook shows more hard news articles to all of its American users, it helps Democrats because Democratic-leaning voters make up a larger percentage of its American user base than do Republican-leaning voters. When the company decreases the amount of news users see, the result is the opposite.

Facebook has an enormous amount of power over the online news ecosystem, which, in turn, has an enormous amount of power over the political process. Right now, there’s no way to regulate any of this. Facebook could intentionally show its users only listicles about Clinton’s accomplishments as secretary of state and memes depicting Trump with a Hitler mustache, and it would be entirely legal. Even in the midst of the “trending topics” scandal, polls showed a meager 11 percent of Americans were comfortable with the government imposing regulations regarding content on social networking sites like Facebook.

There’s a good chance we’ve already been living in this reality for a while. “Facebook does tend to experiment with changes before announcing them, so it’s possible that you’ve already seen the worst of the impact,” wrote Jim Anderson, the CEO of the social media optimization platform SocialFlow, in an email. “This is particularly true given the decline in Facebook reach we’ve already seen this year.”

Since the trending topics blow-up, Facebook officials have repeatedly insisted upon the platform’s neutrality, maintaining that the social network isn’t favoring one type of content over any other. “We don’t favor specific kinds of sources—or ideas,” read Facebook’s statement about the recent change. “Our aim is to deliver the types of stories we’ve gotten feedback that an individual person most wants to see. We do this not only because we believe it’s the right thing but also because it’s good for our business.”

However, there’s no such thing as a truly neutral platform. Anything built by human beings is structured to favor some things over others, and Facebook is no different. Even minor shifts in the kinds of content a platform surfaces can have seismic effects.

“It is certainly hard to make predictions about how it might affect the outcome given that there are so many factors at play—especially in an election as high-profile as this one—but, given Facebook’s powerful influence, it is certainly something to consider,” noted Carrie Brown, the director of the Social Journalism Program at the CUNY Graduate School of Journalism. “It is not exactly a surprise that Facebook would change its algorithm (it has always been volatile) or that it would favor your social connections over news. In many ways, this means that publishers need to work harder to understand their audience and find ways to serve them in ways that make them want to like and share news.”

The site’s single-handed ability to tilt this year’s presidential election toward one candidate or the other will ultimately depend on whether the race comes down to the wire. If stats guru Nate Silver’s judgment on the general election is better than it was for the Republican primary, Clinton will beat Trump by such a landslide that nothing Facebook could ever do would fundamentally alter the result. But there are thousands of other races around the country that will certainly come down to a narrow margin. In those instances, Facebook’s latent effect on turnout could easily decide the result.

It’s likely that Facebook’s influence on the November election will result more from strategies employed to ensure the company meets its revenue goals than from anything having to do with which candidate any company executive personally wants to see in the Oval Office.

In one way, that’s comforting. In another, it’s much more unsettling.
