Over 70 activist groups call on Mark Zuckerberg to reveal Facebook’s censorship policies
‘Greater transparency means Facebook can no longer operate in secrecy.’
In an open letter to Facebook founder Mark Zuckerberg, a coalition of 73 activist groups urged the social network to provide greater transparency about its policies for removing user content at government request.
The letter, which was signed by the leadership of progressive activist organizations like SumOfUs, the Center for Media Justice, and Daily Kos, argues that Facebook, particularly since the launch of its Facebook Live streaming feature, has become a full-fledged media outlet with “an increasingly central role in controlling media that circulates through the public sphere. News is not just getting shared on Facebook: It’s getting broken there.”
“When Facebook unilaterally censors user content that depicts police brutality at the request of the authorities, it sets a dangerous precedent that further hurts and silences marginalized communities, particularly communities of color,” reads the letter. “With the safety check-in feature, profile solidarity filters, and in countless speeches, you and others in your company present Facebook’s value of human life at the center of its public-facing image. However, Facebook’s repeated silencing of marginalized communities that attempt to make their stories and struggles known proves otherwise.”
Some of the recent incidents of Facebook removing potentially sensitive content that sparked the letter include:
- The temporary removal of a Facebook Live video showing the bloody aftermath of police in Falcon Heights, Minnesota, fatally shooting Philando Castile, which the company blamed on a “technical glitch.”
- The blocking of the Instagram account of Korryn Gaines, a 23-year-old Baltimore woman who used the popular Facebook-owned photo-sharing service to post videos of her stand-off with police in real time, before being shot dead by law enforcement officials.
- The blocking of a Pulitzer Prize-winning photograph depicting a nine-year old Vietnamese girl who had been burned by napalm.
- Reports of the censorship of livestreams showing the protests against the construction of the Dakota Access Pipeline in Standing Rock, North Dakota.
- The suspension of the Facebook accounts of several prominent Palestinian journalists.
- Facebook granting data access to tools, such as Geofeedia and Snaptrends, that are marketed to law enforcement agencies as a way to surveil activists and protesters.
The letter called on Facebook to make its guidelines for censoring content publicly accessible. Specifically, the groups want Facebook to reveal the technical and policy details of the company’s internal system for handling censorship requests on individual pieces of content from law enforcement, intelligence agencies, and other government entities. It also urged Facebook to create a public appeal platform for users to contest content removals, institute a blanket policy of only turning over user data to governments when required to do so by the force of law, and undergo an external audit of its content- and data-sharing policies.
“Transparency is a first step. Greater transparency means Facebook can no longer operate in secrecy,” SumOfUs Campaigner Reem Sulieman explained in an email. Sulieman notes that, while Facebook releases regular reports on many of its interactions with governments around the world, much of the company’s decision-making process for handling those interactions remains, for many activists, frustratingly vague. “Facebook increasingly decides what the public sees and has become a news source for many. So if Facebook is making decisions about what news reaches the public, then it needs to be transparent about and be accountable for how those decisions are getting made.”
A Facebook spokesperson told the Daily Dot that the company has received the letter and is in the process of reviewing it. Sulieman added that some of the signatories to the letter have had conversations with Facebook officials regarding the issue, but those talks have yet to translate to concrete, public-facing actions on the part of the company.
In a recent blog post, Facebook officials noted that setting standards for what content is removed in different countries around the world is a complex process. “Whether an image is newsworthy or historically significant is highly subjective. Images of nudity or violence that are acceptable in one part of the world may be offensive—or even illegal—in another,” the post read. “Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community that is both safe and open to expression.”
Facebook vowed to “begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” the blog post continued. “We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”
Aaron Sankin is a former Senior Staff Writer at the Daily Dot who covered the intersection of politics, technology, online privacy, Twitter bots, and the role of dank memes in popular culture. He lives in Seattle, Washington. He joined the Center for Investigative Reporting in 2016.