Joining the chorus of dissenters objecting to Facebook’s recent “emotional contagion” experiment, the editor-in-chief of the academic journal that published the research is now saying he has concerns about having done so.
Inder M. Verma, editor of the Proceedings of the National Academy of Sciences, said he shares concerns about a widely perceived ethical lapse in the way researchers sought to discover whether the moods of Facebook users could be affected by the moods of their friends.
For their experiment, the researchers deliberately manipulated the news feeds of nearly 700,000 Facebook users to highlight either positive or negative postings in order to measure the effect on those users’ moods. The experiment took place over the course of one week in January 2012.
Participants were not informed of their involvement in the experiment beforehand, nor were they given any way to opt out. Facebook users only found out about the experiment when the study was published last month.
“Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper,” Verma wrote in a letter to the Washington Post. “It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.”
Facebook has defended the experiment, arguing that a clause in the site’s terms of service informs users that their data may be used for research purposes. But that clause wasn’t added until four months after the experiment took place, and many argue it doesn’t give Facebook the authority to manipulate users for research purposes in any case.
This statement of concern comes amid mounting backlash against the study. Last week, the Electronic Privacy Information Center (EPIC) filed a formal complaint with the Federal Trade Commission, asking that sanctions be imposed on Facebook for this breach of user trust. EPIC is also asking for the public disclosure of the algorithms Facebook uses to determine what appears in users’ news feeds.
In defending his work, lead researcher Adam Kramer wrote that, “The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.”
However, it has since come out that one of the credited researchers on the project has ties to a Department of Defense initiative researching “civil unrest,” which some have taken as suggesting other motives for the research.
Photo via Morguefile.com (PD)/Remix by Jason Reed