Over 70 activist groups call on Mark Zuckerberg to reveal Facebook’s censorship policies
‘Greater transparency means Facebook can no longer operate in secrecy.’
In an open letter to Facebook founder Mark Zuckerberg, a coalition of 73 activist groups urged the social network to provide greater transparency about its policies for removing user content upon government request.
The letter, which was signed by the leadership of progressive activist organizations like SumOfUs, the Center for Media Justice, and Daily Kos, argues that Facebook, particularly since the launch of its Facebook Live streaming feature, has become a full-fledged media outlet with “an increasingly central role in controlling media that circulates through the public sphere. News is not just getting shared on Facebook: It’s getting broken there.”
“When Facebook unilaterally censors user content that depicts police brutality at the request of the authorities, it sets a dangerous precedent that further hurts and silences marginalized communities, particularly communities of color,” reads the letter. “With the safety check-in feature, profile solidarity filters, and in countless speeches, you and others in your company present Facebook’s value of human life at the center of its public-facing image. However, Facebook’s repeated silencing of marginalized communities that attempt to make their stories and struggles known proves otherwise.”
Some of the recent incidents of Facebook removing potentially sensitive content that sparked the letter include:
- The temporary removal of a Facebook Live video showing the bloody aftermath of police in Falcon Heights, Minnesota, fatally shooting Philando Castile, which the company blamed on a “technical glitch.”
- The blocking of the Instagram account of Korryn Gaines, a 23-year-old Baltimore woman who used the popular Facebook-owned photo-sharing service to post videos of her stand-off with police in real time, before being shot dead by law enforcement officials.
- The blocking of a Pulitzer Prize-winning photograph depicting a nine-year-old Vietnamese girl who had been burned by napalm.
- Reports of the censorship of livestreams showing the protests against the construction of the Dakota Access Pipeline in Standing Rock, North Dakota.
- The suspension of the Facebook accounts of several prominent Palestinian journalists.
- Facebook granting data access to tools, such as Geofeedia and Snaptrends, which are marketed to law enforcement agencies as a way to surveil activists and protesters.
The letter called on Facebook to make its guidelines for censoring content publicly accessible. Specifically, the groups want Facebook to reveal the technical and policy details of the company’s internal system for handling censorship requests on individual pieces of content from law enforcement, intelligence agencies, and other government entities. It also urged Facebook to create a public appeal platform for users to contest content removals, institute a blanket policy of turning over user data to governments only when required to do so by force of law, and undergo an external audit of its content- and data-sharing policies.
“Transparency is a first step. Greater transparency means Facebook can no longer operate in secrecy,” SumOfUs campaigner Reem Sulieman explained in an email. Sulieman notes that, while Facebook does release regular reports on many of its interactions with governments around the world, much of the company’s decision-making process about how it handles those interactions remains, for many activists, frustratingly vague. “Facebook increasingly decides what the public sees and has become a news source for many. So if Facebook is making decisions about what news reaches the public, then it needs to be transparent about and be accountable for how those decisions are getting made.”
A Facebook spokesperson told the Daily Dot that the company has received the letter and is in the process of reviewing it. Sulieman added that some of the signatories to the letter have had conversations with Facebook officials regarding the issue, but those talks have yet to translate to concrete, public-facing actions on the part of the company.
In a recent blog post, Facebook officials noted that setting standards for what content is removed in different countries around the world is a complex process. “Whether an image is newsworthy or historically significant is highly subjective. Images of nudity or violence that are acceptable in one part of the world may be offensive—or even illegal—in another,” the post read. “Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community that is both safe and open to expression.”
Facebook vowed to “begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” the blog post continued. “We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”
Aaron Sankin is a former Senior Staff Writer at the Daily Dot who covered the intersection of politics, technology, online privacy, Twitter bots, and the role of dank memes in popular culture. He lives in Seattle, Washington. He joined the Center for Investigative Reporting in 2016.