Poor Facebook. Do they even know what they want to censor anymore, or why?
It’s slightly baffling to see a social media company crack down on the exposure of breasts in any form (including breastfeeding photos and suspicious-looking elbows) while continually excusing the dissemination of decapitation videos. But that’s where we’re at right now! Add it to the list of ways in which we protect young people from the horrors of sex while ensuring they have access to staggering displays of violence.
Beheading imagery had circulated on the network until May, when, under pressure from users, Facebook decided to walk back a community standard that held such disturbing content to be permissible as long as people were denouncing, not glorifying, the violence depicted: “We will remove instances of these videos that are reported to us while we evaluate our policy and approach to this type of content,” the company said at the time, which sounded halfway reasonable.
After taking the time to re-evaluate, however, they shifted back to the original, ridiculous guideline—and offered an all-too-familiar defense when they recently declined to take down a video, supposedly taken in Mexico, of a man cutting a woman’s head off. A spokeswoman, per the BBC:
Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events. People are sharing this video on Facebook to condemn it. If the video were being celebrated, or the actions in it encouraged, our approach would be different.
Oh, I see how it works: We can allow Facebook’s entire user base—all theoretically aged a sturdy 13 and older, though it’s estimated that 38% of registered kids fall short of this requirement—to witness and share any potentially traumatic or scarring material if it’s framed in a negative light. I guess that means that you could upload a photo of forty nude women having a pillow fight as long as the caption slut-shamed them and linked to a website about bed safety. Context is everything, you see.
Of course, in the case of the beheading video at issue, people weren’t condemning the murder so much as they were blasting Facebook for allowing such obviously inappropriate material to fester on the site. “This needs to be banned!” one user commented, with another insisting that Facebook should take it down whether or not it was fake. But let’s not get picky! Condemnation is condemnation, right?
Predictably, many were quick to point out the hypocrisy of a policy that, among other things, bans hate speech but stands proudly behind hateful acts. U.K. Prime Minister David Cameron was one such voice of reason:
It’s irresponsible of Facebook to post beheading videos, especially without a warning. They must explain their actions to worried parents.
— David Cameron (@David_Cameron) October 22, 2013
Appearing to change course once again, Facebook then deigned to delete the beheading video that had reignited the debate, announcing yet another policy change that wasn’t much of a change at all.
“We will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience,” the company stated. In other words, the extreme content still isn’t banned, and they still won’t bother to moderate it unless complaints about a specific post start rolling in.
And what’s stopping them from implementing similar, mildly enforced rules about warnings and age restrictions when it comes to content that, for whatever reason, they consider pornographic? Nothing. Facebook simply refuses to pin down the rules of the playground with any logical consistency. They can’t tell you what’s offensive, but they know it when they see it, making only minor capitulations when the public disagrees with their view.
If that sounds like a mighty convenient way to keep your censorship capricious and unaccountable, that’s hardly an accident. More and more, Facebook’s policies appear designed for maximum flexibility on their end and total incoherence on ours.