Facebook added key section to ToS 4 months after controversial experiment
In the ongoing fallout from Facebook’s controversial “emotional contagion” study, the social network and its defenders have relied heavily on claims that a section buried deep inside Facebook’s terms of service gave the company permission to utilize user data for research purposes.
It now seems, however, that Facebook didn’t even go that far before deciding to let researchers tamper with the emotions of nearly 700,000 users by manipulating their News Feeds to either show mostly positive or mostly negative updates from friends.
Forbes reports that ToS language granting Facebook permission to use user information for “internal operations” that included “research” was not added until May 2012, four months after the experiment was conducted.
In a screenshot taken by Forbes, the relevant change is clear. In May of 2012, Facebook added the line about how it might use data “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
Screengrab via Forbes
But even if the clause had been part of Facebook’s privacy terms prior to the experiment, it’s still not clear that it would give the company proper cover for its actions. Legal experts say there is a big difference between Facebook saying it plans to use user data for research and actually manipulating its users.
“If you are exposing people to something that causes changes in psychological status, that’s experimentation,” James Grimmelmann, a professor of technology and the law at the University of Maryland, told Slate. “This is the kind of thing that would require informed consent.”
Informed consent would have involved telling users about an experiment beforehand with the option to opt-out, then disclosing the specifics of the research as quickly as possible once the experiment had concluded.
In a post published earlier this week, Facebook researcher Adam Kramer defended the experiment and insisted that the study impacted a “relatively” small subset of Facebook users.
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
The backlash against Facebook has been intense in the days since news of the experiment broke over the weekend. Many say this is the “final straw” for a company that has played fast and loose with user privacy in the past.
Photo by kris krüg/Flickr (CC BY-SA 2.0)
Tim Sampson is a reporter who focuses on the technology, business, and politics beats. He’s also an established comedy writer, with work on Comedy Central and in The Onion and ClickHole.