Facebook has apologized for performing psychological experiments on its users without their informed consent—but researchers still aren’t happy with the changes the social networking giant is proposing.
In June 2014, news emerged that Facebook had been conducting research into “emotional contagion” between users of its service by manipulating what appeared in News Feeds and measuring the resulting emotional response. Almost 700,000 Facebook users were involved in the study, entirely without their knowledge, in what critics damned as a serious breach of ethical standards.
One British MP even called for a formal investigation, saying that Facebook was “manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics and other areas.”
At the time, a researcher acknowledged that they “didn’t clearly state [their] motivations” in the paper, but now a blog post written by Facebook CTO Mike Schroepfer admits that “it is now clear that there are things we could have done differently.”
“Non-experimental” research methods should have been considered, writes Schroepfer, and “the research would also have benefited from more extensive review by a wider and more senior group of people.”
“Last,” he adds, “in releasing the study, we failed to communicate why and how we did it.” Schroepfer says new guidelines will be introduced to regulate any future research carried out on Facebook’s users—but PC World reports that researchers believe the company still “has a ways to go yet toward injecting transparency and ethics into its research practices.”
For starters, “ethics” were not referred to once in the blog post, and the guidelines were not laid out in any extensive detail. “Facebook needs to be more transparent about its research methods and ethics,” ethicist Elizabeth Buchanan told PC World, “and we didn’t get that today.”
Another ethics researcher, Irina Raicu, said that “outside involvement is important,” and that Facebook’s new review panel involving “senior subject-area researchers … people from engineering, research, legal, privacy and policy teams” is “not the answer.”
Facebook is also debuting a new Research hub to centralize their studies, available at research.facebook.com. Schroepfer signs off by arguing that research “helps us build a better Facebook,” and that it’s “important to engage with the academic community … [because] Facebook can help us understand more about how the world works.”
“We want to do this research in a way that honors the trust you put in us,” he says, adding that “we will continue to learn and improve as we work toward this goal.” Let’s just hope this “learning” involves actually asking people’s permission before experimenting on them next time.