Facebook has clarified its positions on hate speech on its platform, according to documents obtained by Motherboard.
In the wake of the Charlottesville, Virginia, protests in August, Facebook expanded the training manual its moderators use, adding guidance on white supremacists. The new guidelines differentiate between white supremacy, which is not allowed, and white nationalism or separatism, which is allowed on Facebook.
According to the documents, Facebook groups accounts and their content according to strong, medium, and weak “signals.” For example, a KKK leader or other avowed member of a white supremacist group would be a strong signal and would require action from moderators.
Facebook judges content on a number of signals, including whether or not the account or post has called for violence against protected groups. But it won’t take action against accounts for calling for a white ethnostate, stating that white nationalism is an “extreme right movement and ideology, but it doesn’t seem to be always associated with racism (at least not explicitly).”
But Facebook does acknowledge that white supremacist and white nationalist or separatist sentiments can converge, which makes some speech difficult to classify.
“Overlaps with white nationalism/separatism, even orgs and individuals define themselves inconsistently,” the manual says.
Facebook has previously faced criticism for its handling of hate speech. In April, Zuckerberg apologized for the company’s hate speech policies, which activists said led to violence against the Rohingya people of Myanmar. And a 2017 ProPublica report showed how the social network trained its censors to remove offensive content under rules that ultimately allowed hate speech against multiple minority groups.
See Motherboard’s full report for more.