Photo via Gage Skidmore/Flickr (CC-BY-SA)
The Facebook founder didn't mention the 2016 Republican presidential front-runner by name during his keynote address at the annual F8 conference on Tuesday, but there was no doubt that Zuckerberg was using his platform to hammer Trump's rhetoric on policies ranging from immigration to foreign trade.
“As I look around the world, I’m starting to see people and nations turning inward, against the idea of a connected world and a global community,” Zuckerberg said. “I hear fearful voices calling for building walls and distancing people they label as ‘others.’ I hear them calling for blocking free expression, for slowing immigration, for reducing trade, and in some cases even for cutting access to the Internet.”
Zuckerberg delivered his veiled public criticism of Trump just weeks after, as Gizmodo reported Friday, Facebook employees asked Zuckerberg, “What responsibility does Facebook have to help prevent President Trump in 2017?”
Were Zuckerberg to decide that the social media giant has that responsibility, it raises a key question for the American people: Could Facebook actually prevent Trump from winning the White House?
The answer is almost certainly yes—and there may be no way to stop it.
Zuckerberg vs. Trump
Trump and Zuckerberg have been butting heads from a distance since the real-estate heir launched his presidential campaign on an anti-immigrant platform last summer.
Trump's proposal to take an extremely hard line on China could spark a trade war, which would undoubtedly hamper Facebook's efforts to get the Chinese government to allow its citizens to use the social networking platform. In addition to Trump's hostility to undocumented immigration, which has seen him calling for the construction of a wall on the U.S.-Mexico border and praising a widely derided mass deportation effort from the 1950s called Operation Wetback, Trump has also been publicly hostile to certain forms of legal immigration that Zuckerberg wants to expand.
Both personally and through his immigration-reform nonprofit, Fwd.us, Zuckerberg has been a major proponent of the H-1B visa program, which allows U.S. companies to temporarily employ highly skilled foreign workers who can fill specific needs for a business. H-1B visas are popular among Silicon Valley tech firms looking for skilled workers at a moment when the market for certain types of engineering talent has become truly global, and perhaps also hoping to catch a break on labor costs.
Trump, however, has charged that the program disadvantages American workers to the benefit of their foreign competitors, and he has proposed changing it to make it harder for companies to bring in workers from overseas. In a policy paper posted to Trump's website last year, the candidate took a dig directly at Zuckerberg, charging, “Mark Zuckerberg’s personal senator, Marco Rubio, has a bill to triple H-1Bs that would decimate women and minorities.”
In response to Zuckerberg's F8 comments, a Trump campaign spokesperson equated undocumented immigration to crime, telling CNBC, “I'll take Mark Zuckerberg seriously when he gives up all of his private security—move out of his posh neighborhood and come live in a modest neighborhood near a border town. Then I'm sure his attitude would change.”
While social media platforms like Facebook have been instrumental to Trump's political success, there's clearly no love lost between the candidate and the social networking wunderkind. Zuckerberg clearly sees a Trump presidency as dangerous for the future of the country. If Zuckerberg wanted to throw his full weight behind the #NeverTrump movement, he has a tool far more powerful in influencing the outcome of the election than his billions of dollars.
Zuckerberg has Facebook.
Tipping the scale
If Trump emerges from what's shaping up to be an unprecedentedly chaotic Republican convention with the nomination, he's already going to be facing a stiff headwind: veteran political analyst Larry Sabato sees Hillary Clinton crushing Trump by a margin of 156 electoral votes. But elections are unpredictable. If that gap closes, powerful platforms like Facebook have the ability to move the dial in a number of important ways.
Before getting into the specifics, it's crucial to point out that there's no indication that Facebook has ever deliberately targeted a candidate, nor has it expressed any intention to do so. Facebook's power lies largely in the network effect gained from its ubiquity, the way it's used by a broad cross-section of the public. If people thought the social network went out of its way to alter the course of an election, it would enrage the supporters of the candidate it moved against and likely push them to use the website less frequently or even abandon it altogether.
“Voting is a core value of democracy and we believe that encouraging civic participation is an important contribution we can make to the community. We’re proud of our work on this,” a Facebook spokesperson told the Daily Dot. “While we encourage any and all candidates, groups, and voters to use our platform to engage on the elections, we as a company are neutral and have not used our products in a way that attempts to influence how people vote.”
As a platform, Facebook has an enormous ability to influence public perception in a way subtle enough that it may be impossible to detect. Regardless of whether or not Facebook would actively discriminate against Trump or any other candidate—and there's a strong incentive for the company not to do so—it's abundantly clear that it could.
Facebook first systematically looked at its ability to influence voting behavior on Election Day in 2010. As detailed in a study published two years later, researchers at Facebook showed some users a button at the top of their news feeds allowing them to tell their friends that “I Voted” and encouraging them to do their democratic duty if they hadn't yet done so, while other users weren't shown the same message. The researchers compared the data to the actual voter rolls and discovered the feature had a significant effect in boosting turnout—not just for people who were shown the button; there was a ripple effect among their friends as well.
“Our results suggest that the Facebook social message increased turnout directly by about 60,000 voters and indirectly through social contagion by another 280,000 voters, for a total of 340,000 additional votes,” the researchers wrote.
Compared to the 2010 electorate as a whole, this number is not particularly large: it represents about 0.14 percent of the 236 million Americans eligible to vote in that year's midterm election. Still, presidential elections have been decided by slimmer margins; the 2000 Bush v. Gore contest, for example. Additionally, the “I Voted” effect was the result of a single item being shown on a single day. How would it affect voting behavior if Facebook consistently altered what appeared in people's news feeds over time? Two years later, the company aimed to find out.
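The back-of-the-envelope arithmetic behind that 0.14 percent figure is simple to verify, using the researchers' own numbers:

```python
# Check the figures cited above from Facebook's 2010 "I Voted" study.
direct_votes = 60_000        # voters nudged directly by the button
indirect_votes = 280_000     # additional voters reached via social contagion
total_votes = direct_votes + indirect_votes

eligible_voters = 236_000_000  # approximate 2010 U.S. eligible electorate

share = total_votes / eligible_voters
print(f"{total_votes:,} extra votes is about {share:.2%} of the electorate")
# prints: 340,000 extra votes is about 0.14% of the electorate
```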
In 2012, Facebook's researchers wanted to determine whether increased exposure to hard news about politics affected users' propensity to vote. They tweaked the news feeds of approximately 2 million users so that if any of their friends had shared a news story, that story would be boosted to the top of the feed. These were stories the targeted users had a chance of seeing anyway: if one of your Facebook friends likes or shares a piece of content, there's some probability it will show up in your feed. The experiment all but guaranteed that exposure for a certain type of content.
Facebook then polled those users and received 12,000 responses. The respondents reported an increased likelihood to follow politics and were more likely to report voting in the November election. Interestingly, the effect was more pronounced for infrequent Facebook users than it was for people who logged in every day like clockwork.
Facebook data scientist Lada Adamic detailed the 2012 experiment in a public presentation. But when Personal Democracy Media co-founder Micah Sifry contacted Facebook about the study while doing research for a 2014 article he published in Mother Jones, the clip was quickly taken down from YouTube.
Sifry later uploaded a copy of the video to YouTube.
“First of all, Facebook has pretty much routinized the use of its voter megaphone tool and has been deploying it in countries around the world when there's a democratic election. They've really offered no additional transparency about how the tool works,” Sifry told the Daily Dot, nodding to Facebook's famously opaque algorithm. “I consider it to be an ongoing problem that we basically have to trust the engineers at Facebook to use this [ability to put a] finger on the scale in a completely neutral way.”
Engineers, remember, who may very well feel that the company should place that finger.
Sifry points back to 2007, when President Barack Obama, then a U.S. senator, was still in the early stages of his hunt for the keys to the Oval Office. When Facebook launched Platform, a toolkit that allowed third parties to develop applications for the site, the Obama team was the first political campaign to gain access. At the time, this move raised questions about why Obama got early access while other campaigns, like those of Sen. John McCain (R-Ariz.) or then-New York Sen. Hillary Clinton, didn't. Should that exclusive access be viewed as an in-kind contribution from Facebook to the Obama campaign? Or was it just one promising, young, tech-savvy organization offering another the opportunity to beta test its product?
The core of the issue is that, 12 years after its birth in a Harvard dorm room, Facebook has come to play such a crucial role in how people around the world communicate that virtually everything it does threatens to have an effect on politics. Through its ubiquity, Facebook is often viewed as a utility akin to the telephone company, rather than just another website adrift on a seemingly endless and indifferent Internet.
What they don't know, they can't regulate
The power to theoretically decide the outcome of an election isn't limited to Facebook. Similar concerns have been raised about Google's power to swing election results. A 2015 study published in the Proceedings of the National Academy of Sciences found that manipulating the order in which links appear on a search engine's results page could have a dramatic effect on undecided voters' perceptions of political candidates.
This problem naturally invites the question of whether regulation is needed to police how these web giants can affect elections. When contacted by the Daily Dot, representatives from the Federal Election Commission, the Federal Trade Commission, and the Federal Communications Commission all indicated that this issue wasn't covered under their specific jurisdictions.
Paul Ryan, deputy executive director at the election watchdog group the Campaign Legal Center, said that if Facebook or Google were directly coordinating with a candidate to manipulate what their users saw for the campaign's benefit, then it would be considered an in-kind contribution, effectively a donation. However, that rule only applies if there's direct coordination.
“If it’s doing these things independently of candidates, and they stop short from expressly advocating a candidate’s election or defeat (e.g., Google stops short of including a message like 'Vote for Trump' at the top of its list search results for a term like 'presidential election'),” Ryan said, “then federal campaign finance laws wouldn’t apply.”
“At any rate, I’m quite certain that the Federal Election Commission has never considered in any formal way (rule-making, advisory opinions, enforcement actions) how federal campaign finance laws would or would not apply to such activities,” he continued. “So if Facebook or Google or another Internet business were to manipulate their public interface for the benefit of a candidate, the company would be sailing in uncharted legal waters.”
Any attempt to regulate how online platforms affect voter turnout would be extremely tricky. Because of the demographics of its U.S. user base, which skews young, female, and urban, even if Facebook boosted turnout equally across the board, it would advantage Democrats over Republicans, since those groups tend to lean Democratic rather than Republican. Facebook encouraging more people to vote is an unequivocal good, but even when the company applies civic pressure to the public uniformly, there's likely to be a partisan advantage. In that context, it's nearly impossible to impose regulation without prohibiting the social network from making any moves to boost voter turnout.
Facebook rarely publishes the results of its experiments, but the firm, like every other Internet company worth its salt, is constantly running them: tweaks to the design of its platform to see how they affect user behavior. The Internet makes this so-called A-B testing relatively easy, and requiring companies to check with government regulators before each test, just to ensure it's not boosting one side over the other, would be unreasonably onerous.
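At its core, A-B testing of this kind means randomly splitting users into two groups, showing each a different version of the product, and comparing outcome rates. A minimal sketch, using entirely hypothetical numbers and a standard two-proportion z-test rather than anything Facebook has disclosed about its internal tooling, might look like this:

```python
import math
import random

def ab_test(p_control, p_treatment, n=100_000, seed=42):
    """Simulate a simple A/B test: each user in each arm independently
    'converts' (e.g., clicks an 'I Voted' button) with that arm's true
    probability; return observed rates and a two-proportion z-statistic."""
    rng = random.Random(seed)
    control = sum(rng.random() < p_control for _ in range(n))
    treatment = sum(rng.random() < p_treatment for _ in range(n))
    p1, p2 = control / n, treatment / n
    pooled = (control + treatment) / (2 * n)          # pooled conversion rate
    se = math.sqrt(2 * pooled * (1 - pooled) / n)     # standard error, equal n
    z = (p2 - p1) / se                                # z-statistic for the lift
    return p1, p2, z

# Hypothetical: a design tweak nudges turnout from 40.0% to 40.4%.
p1, p2, z = ab_test(p_control=0.400, p_treatment=0.404)
print(f"control {p1:.3f}, treatment {p2:.3f}, z = {z:.1f}")
```

With large samples, even a fraction-of-a-percent lift like this one becomes statistically detectable, which is precisely why small, invisible design changes can matter at the scale of a national election.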
Of course, enforcing any rules depends on determining that this sort of manipulation is even occurring in the first place. Large teams of people working in tandem across the country on Election Day might be able to tell if only Democrats saw the “I Voted” button, but no one outside of Facebook's administrators would be able to tell if Republicans were slightly more likely than Democrats to see the type of hard news stories that would activate them to vote.
So, what if Facebook's algorithms determined the type of news stories that increased civic participation also triggered more sharing, but only for Democrats and not Republicans? In that case, would even Facebook know what effect it was having on democracy?
Maybe. But, then again, maybe not.