How Facebook manipulated users’ News Feeds—and their emotions
Mark, you have a lot of ‘splainin’ to do.
With its long history of invading its users’ privacy, Facebook is not exactly known as a paragon of morally upright behavior. But many say the social network went too far with its latest stunt: A new paper reveals that Facebook intentionally manipulated at least 700,000 users’ News Feeds, in an effort to assess how changes to the website affected their emotions.
Published in the Proceedings of the National Academy of Sciences, the paper reveals that Facebook conducted a massive experiment to determine the “emotional contagion” effect, by testing whether reducing the number of positive posts you saw on your News Feed would make you less happy.
To do this, Facebook tweaked its algorithm so that some users saw primarily positive posts, some saw primarily negative posts, and some saw neutral posts in their News Feeds. The researchers then watched to see whether the emotional content of those feeds had any effect on what users subsequently posted.
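The study classified posts by counting emotion words against a dictionary (the paper used the LIWC word-count system). As a rough illustration only, here is a toy sketch of that approach; the word lists and function names below are invented for this example and are not Facebook's actual code or the real LIWC dictionaries:

```python
# Toy word-count sentiment scoring, loosely in the spirit of the study's
# LIWC-style approach. These tiny word lists are invented for illustration.
POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "angry"}

def sentiment(post: str) -> str:
    """Label a post positive, negative, or neutral by counting emotion words."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts: list[str], suppress: str) -> list[str]:
    """Return a feed with posts of the given sentiment withheld,
    mimicking the experiment's reduced-positivity (or -negativity) condition."""
    return [p for p in posts if sentiment(p) != suppress]

feed = ["What a wonderful day!", "I hate traffic.", "Meeting at noon."]
reduced_positive_feed = filter_feed(feed, "positive")
```

In the reduced-positivity condition, `filter_feed(feed, "positive")` would withhold the first post and show only the other two; the experiment then compared the sentiment of what those users wrote afterward.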
The result? Yes, it totally does: The researchers, from Facebook, Cornell, and the University of California, San Francisco, found that users who saw predominantly positive posts in their News Feeds were more likely to post positive content themselves, while those who saw predominantly negative posts were more likely to produce negative content.
In terms of assessing how our friends’ moods on social media affect our own emotional well-being, the study was invaluable. But was it ethical? Eh, probably not so much.
Because Facebook did not obtain express consent from its victims—sorry, subjects—many bioethicists and law experts are questioning whether the study breached social scientific ethical standards. “If you are exposing people to something that causes changes in psychological status, that’s experimentation,” James Grimmelmann, a professor of technology and law at the University of Maryland, told Slate. “This is the kind of thing that would require informed consent.”
For its part, Facebook insists that the study was perfectly legal, and that users supplied implicit consent by agreeing to Facebook’s Data Use Policy, which grants Facebook access to your data “for internal operations, including troubleshooting, data analysis, testing, research, and service improvement.” It’s that “research” clause in the site’s fine print that Facebook is currently relying on to justify the study.
But the study has some pretty troubling implications for how far the social network is willing to go to collect data from its users. It’s pretty clear from this paper that Facebook is not content with just toying with our ads—it wants to toy with our emotions, as well.
H/T Forbes | Photo by Kris Krug/Flickr (CC BY-SA 2.0)
EJ Dickson is a writer and editor who primarily covers sex, dating, and relationships, with a special focus on the intersection of intimacy and technology. She served as the Daily Dot’s IRL editor from January 2014 to July 2015. Her work has since appeared in the New York Times, Rolling Stone, Mic, Bustle, Romper, and Men’s Health.