
“Worse, it perpetuates the presumption that research is dangerous.”

When word spread that Facebook had altered our News Feeds in the name of science, it triggered a tidal wave of backlash condemning the social network as unethical. But in spite of popular sentiment, a group of research ethicists has mounted a defense of the controversial social experiment in a column in Nature.

The six co-authors, writing on behalf of 27 similarly minded academics from a variety of disciplines, express concern that the intensely negative popular response could discourage future studies:

We are making this stand because the vitriolic criticism of this study could have a chilling effect on valuable research. Worse, it perpetuates the presumption that research is dangerous.

The column goes on to assert that users don’t realize Facebook is constantly tinkering with its News Feed formula. The News Feed, far from an objective log of user activity, is continually manipulated as the company optimizes the content it displays in order to boost user engagement.

As the column argues:

When the average user logs on, Facebook automatically chooses 300 status updates from a possible 1,500 to display in his or her feed. Such manipulation, which often determines how likely people are to view emotionally charged content, aims to optimize user engagement and activity and is how Facebook is able to offer a free service but still make a profit. But how does this affect users’ moods?

There’s certainly an argument that Facebook’s study (conducted in partnership with Cornell) wasn’t properly examined by a research ethics group, known as an Institutional Review Board (IRB). And Facebook’s own documentation shows that language covering users’ consent for research purposes wasn’t added until the study was already underway—something that would never fly in a traditional research setting.

Still, any researcher would love to get their hands on Facebook’s incomparably large sample of the population. Given how much time users spend on the social network, a data-backed understanding of Facebook could empower users as much as it creeps them out.

H/T io9 Illustration via Jason Reed

Taylor Hatmaker

Taylor Hatmaker has reported on the tech industry for nearly a decade, covering privacy and government. Most recently, she was the Debug editor of the Daily Dot. Prior to that, she was a staff writer and deputy editor at ReadWrite, a tech and business reporter for Yahoo News, and the senior editor of Tecca. Her editorial interests include censorship, digital activism, LGBTQ issues, and futurist consumer tech.