Over 70 activist groups call on Mark Zuckerberg to reveal Facebook’s censorship policies
‘Greater transparency means Facebook can no longer operate in secrecy.’
In an open letter to Facebook founder Mark Zuckerberg, a coalition of 73 activist groups urged the social network to provide greater transparency about its policies for removing user content upon government request.
The letter, which was signed by the leadership of progressive activist organizations like SumOfUs, the Center for Media Justice, and Daily Kos, argues that Facebook, particularly since the launch of its Facebook Live streaming feature, has become a full-fledged media outlet with “an increasingly central role in controlling media that circulates through the public sphere. News is not just getting shared on Facebook: It’s getting broken there.”
“When Facebook unilaterally censors user content that depicts police brutality at the request of the authorities, it sets a dangerous precedent that further hurts and silences marginalized communities, particularly communities of color,” reads the letter. “With the safety check-in feature, profile solidarity filters, and in countless speeches, you and others in your company present Facebook’s value of human life at the center of its public-facing image. However, Facebook’s repeated silencing of marginalized communities that attempt to make their stories and struggles known proves otherwise.”
Some of the recent incidents of Facebook removing potentially sensitive content that sparked the letter include:
- The temporary removal of a Facebook Live video showing the bloody aftermath of police in Falcon Heights, Minnesota, fatally shooting Philando Castile, which the company blamed on a “technical glitch.”
- The blocking of the Instagram account of Korryn Gaines, a 23-year-old Baltimore woman who used the popular Facebook-owned photo-sharing service to post videos of her stand-off with police in real time, before being shot dead by law enforcement officials.
- The blocking of a Pulitzer Prize-winning photograph depicting a nine-year-old Vietnamese girl who had been burned by napalm.
- Reports of the censorship of livestreams showing the protests against the construction of the Dakota Access Pipeline in Standing Rock, North Dakota.
- The suspension of the Facebook accounts of several prominent Palestinian journalists.
- Facebook granting data access to tools, such as Geofeedia and Snaptrends, which are marketed to law enforcement agencies as a way to surveil activists and protesters.
The letter called on Facebook to make its guidelines for censoring content publicly accessible. Specifically, the groups want Facebook to reveal the technical and policy details of the company’s internal system for handling censorship requests on individual pieces of content from law enforcement, intelligence agencies, and other government entities. It also urged Facebook to create a public appeal platform for users to contest content removals, institute a blanket policy of only turning over user data to governments when required to do so by the force of law, and undergo an external audit of its content- and data-sharing policies.
“Transparency is a first step. Greater transparency means Facebook can no longer operate in secrecy,” SumOfUs Campaigner Reem Sulieman explained in an email. Sulieman notes that, while Facebook releases regular reports on many of its interactions with governments around the world, much of the company’s decision-making process about how it handles those interactions remains, for many activists, frustratingly vague. “Facebook increasingly decides what the public sees and has become a news source for many. So if Facebook is making decisions about what news reaches the public, then it needs to be transparent about and accountable for how those decisions are getting made.”
A Facebook spokesperson told the Daily Dot that the company has received the letter and is in the process of reviewing it. Sulieman added that some of the signatories to the letter have had conversations with Facebook officials regarding the issue, but those talks have yet to translate to concrete, public-facing actions on the part of the company.
In a recent blog post, Facebook officials noted that setting standards for what content is removed in different countries around the world is a complex process. “Whether an image is newsworthy or historically significant is highly subjective. Images of nudity or violence that are acceptable in one part of the world may be offensive—or even illegal—in another,” the post read. “Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community that is both safe and open to expression.”
Facebook vowed to “begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” the blog post continued. “We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”
Aaron Sankin is a former Senior Staff Writer at the Daily Dot who covered the intersection of politics, technology, online privacy, Twitter bots, and the role of dank memes in popular culture. He lives in Seattle, Washington. He joined the Center for Investigative Reporting in 2016.