
How Facebook manipulated users’ News Feeds—and their emotions

Mark, you have a lot of 'splainin' to do. 

 

EJ Dickson

Tech

Posted on Jun 29, 2014   Updated on May 31, 2021, 1:20 am CDT

With its long history of invading its users' privacy, Facebook is not exactly known as a paragon of morally upright behavior. But many say the social network went too far with its latest stunt: A new paper reveals that Facebook intentionally manipulated the News Feeds of nearly 700,000 users to assess how changes to the website affected their emotions.

Published in the Proceedings of the National Academy of Sciences, the paper describes a massive experiment Facebook conducted on "emotional contagion," testing whether reducing the number of positive posts in your News Feed would make you less happy.

To do this, Facebook tweaked its algorithm so that some users saw mostly positive posts in their News Feeds, some saw mostly negative posts, and some saw neutral posts. The researchers then watched to see whether the emotional content of those feeds had any effect on what users subsequently posted.

The result? Yes, it totally does: The researchers, who came from Facebook, Cornell, and the University of California-San Francisco, found that users who saw positive posts in their News Feeds were more likely to post positive content themselves, while those who saw predominantly negative posts were more likely to produce negative content.

In terms of assessing how our friends’ moods on social media affect our own emotional well-being, the study was invaluable. But was it ethical? Eh, probably not so much.

Because Facebook did not obtain express consent from its victims—sorry, subjects—many bioethicists and law experts are questioning whether the study breached social scientific ethical standards. “If you are exposing people to something that causes changes in psychological status, that’s experimentation,” James Grimmelmann, a professor of technology and law at the University of Maryland, told Slate. “This is the kind of thing that would require informed consent.”

For its part, Facebook insists that the study was perfectly legal, and that users supplied implicit consent by agreeing to Facebook's Data Use Policy, which grants Facebook access to your data "for internal operations, including troubleshooting, data analysis, testing, research, and service improvement." It's that "research" clause in the website's fine print that Facebook is relying on to get away with the study.

But the study has some troubling implications for how far the social network is willing to go to collect data from its users. It's clear from this paper that Facebook is not content with just toying with our ads; it wants to toy with our emotions, as well.


H/T Forbes | Photo by Kris Krug/Flickr (CC BY-SA 2.0)
