
Illustration by Max Fleishman

On Facebook’s role in the election, Mark Zuckerberg says the country should ‘go work even harder’

At least he still feels 'hopeful.'


Josh Katzowitz


Posted on Nov 10, 2016   Updated on May 25, 2021, 2:51 pm CDT

If you thought Mark Zuckerberg was going to apologize for Facebook influencing the 2016 election and handing a victory to Donald Trump, you would be mistaken. Despite a new theory that fake news going viral on Facebook swayed would-be voters, Zuckerberg has refused to accept blame on his company's behalf.

In a Facebook message he posted on Wednesday, Zuckerberg said progress does not move in a straight line and that the country should “go work even harder.” He officially tagged the status update as “feeling hopeful.”

[Placeholder for embed.]

Yet others have had no problems blaming Facebook’s news-gathering operation for helping change people’s minds and possibly turning Hillary Clinton voters into those who pulled the lever for Trump.

The Pew Research Center wrote this week that 20 percent of social media users say “they’ve modified their stance on a social or political issue because of material they saw on social media and 17 percent say social media has helped to change their views about a specific political candidate.”

Pew also reported that Democrats (and, in particular, liberal Democrats) were more likely to say they had changed their views because of something they saw on social media.

And that could have been because many fake stories deriding Democratic nominee Hillary Clinton had gone viral on Facebook.

Facebook VP of Product Management Adam Mosseri, in fact, told TechCrunch that the company needed to do a better job of filtering out the fake stories:

“We take misinformation on Facebook very seriously,” Mosseri said in a statement. “We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation. In Newsfeed, we use various signals based on community feedback to determine which posts are likely to contain inaccurate information and reduce their distribution. In Trending, we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.”

During the summer, Facebook changed the way its Newsfeed was curated, leading the Daily Dot’s Aaron Sankin to write a story in July with the headline, “Facebook’s new algorithm will help pick the next president.” 

After he and his colleagues were fired so the company could rely entirely on an algorithm, a former Facebook part-timer who curated the news told Gizmodo that the team had been suppressing conservative-leaning websites from appearing in Facebook’s Trending section. Zuckerberg then had to meet with leading conservatives to assure them that Facebook was trying to be objective and that he was trying to give everyone a voice.

But that certainly was not the end of the Newsfeed and Trending problems. TechCrunch asked Facebook a number of questions—including whether Facebook had a “specific response to Buzzfeed’s investigation of websites in Macedonia being used to generate large numbers of fake news stories that were placed into the Newsfeed?”—but that one-paragraph statement was all Mosseri had to say.

Here’s hoping it’s not the end of Facebook answering questions about its role in providing (and making) the news moving forward.

*First Published: Nov 10, 2016, 7:56 pm CST