Photo via Robert Scoble/Flickr (CC-BY)
The company admits it made a mistake.
In what may be the biggest fail of the year, Facebook, the social media network that just released an app for children under 13, asked users if they think it’s OK for a male pedophile to request sexual pictures from a 14-year-old girl.
The shocking question was asked in a survey sent to users about how the company should respond to certain behavior. This is the full question as it appeared in the survey:
“There are a wide range of topics and behaviours that appear on Facebook. In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”
Participants could then correctly respond with “This content should not be allowed on Facebook, and no one should be able to see it,” or choose from other, very wrong options, including “This content should be allowed on Facebook, and I would not mind seeing it” and “I have no preference on topic.”
That’s not all. A follow-up question asked users who should be responsible for this decision: Facebook, external experts, or users. The most obvious answer, the law, was not among the choices.
“This is a stupid and irresponsible survey,” Yvette Cooper, a member of British Parliament and chair of the Home Affairs Select Committee, told the Guardian. “Adult men asking 14-year-olds to send sexual images is not only against the law, it is completely wrong and an appalling abuse and exploitation of children.”
Facebook admitted the inconceivable question was a “mistake” after the Guardian first reported it. The company’s vice president of product, Guy Rosen, took to Twitter to offer an apology of sorts.
“We run surveys to understand how the community thinks about how we set policies,” he said. “But this kind of activity is and will always be completely unacceptable on FB. We regularly work with authorities if identified. It shouldn’t have been part of this survey. That was a mistake.”
Not only is the survey clearly in poor taste, it also raises the question of whether Facebook would consider letting a man beg a 14-year-old for sexually explicit images. The company dispelled those concerns in an official response to the Guardian.
“We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey,” the company wrote. “We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.”
For the past few months, Facebook has been deflecting criticism for its alleged effect on children and young adults. In November, former president of Facebook Sean Parker argued the social network exploits its user base by administering a “dopamine hit” in the form of new features. He said the company he once worked for gained success by “exploiting a vulnerability in human psychology.” Not long after, another former executive, Chamath Palihapitiya, said Facebook is destroying society.
It’s no surprise then that the world’s largest social media service was slammed after announcing Messenger Kids, a new messaging app that circumvents laws put in place to prevent children from accessing social media. More than 110 health experts denounced the company, urging it to abandon the app. Facebook refused, arguing the app helps kids keep in touch with friends and parents who live elsewhere.
The bewildering survey question won’t help its image. Fortunately, Facebook has reportedly scrapped the survey altogether. It will now need to use its own judgment to decide if pedophilia is OK on its platform.