In an open letter to Facebook founder Mark Zuckerberg, a coalition of 73 activist groups urged the social network to provide greater transparency about its policies for removing user content at the request of governments.
The letter, which was signed by the leadership of progressive activist organizations like SumOfUs, the Center for Media Justice, and Daily Kos, argues that Facebook, particularly since the launch of its Facebook Live streaming feature, has become a full-fledged media outlet with “an increasingly central role in controlling media that circulates through the public sphere. News is not just getting shared on Facebook: It’s getting broken there.”
“When Facebook unilaterally censors user content that depicts police brutality at the request of the authorities, it sets a dangerous precedent that further hurts and silences marginalized communities, particularly communities of color,” reads the letter. “With the safety check-in feature, profile solidarity filters, and in countless speeches, you and others in your company present Facebook’s value of human life at the center of its public-facing image. However, Facebook’s repeated silencing of marginalized communities that attempt to make their stories and struggles known proves otherwise.”
Some of the recent incidents of Facebook removing potentially sensitive content that sparked the letter include:
- The temporary removal of a Facebook Live video showing the bloody aftermath of police in Falcon Heights, Minnesota, fatally shooting Philando Castile, which the company blamed on a “technical glitch.”
- The blocking of the Instagram account of Korryn Gaines, a 23-year-old Baltimore woman who used the popular Facebook-owned photo-sharing service to post videos of her stand-off with police in real time, before being shot dead by law enforcement officials.
- The blocking of a Pulitzer Prize-winning photograph depicting a nine-year-old Vietnamese girl who had been burned by napalm.
- Reports of the censorship of livestreams showing the protests against the construction of the Dakota Access Pipeline in Standing Rock, North Dakota.
- The suspension of the Facebook accounts of several prominent Palestinian journalists.
- Facebook granting data access to tools, such as Geofeedia and Snaptrends, which are marketed to law enforcement agencies as a way to surveil activists and protesters.
The letter called on Facebook to make publicly accessible its guidelines for censoring content. Specifically, the groups want Facebook to reveal the technical and policy details of the company’s internal system for handling censorship requests on individual pieces of content from law enforcement, intelligence agencies, and other government entities. It also urged Facebook to create a public appeal platform for users to contest content removals, institute a blanket policy of only turning over user data to governments when required to do so by force of law, and undergo an external audit of its content- and data-sharing policies.
“Transparency is a first step. Greater transparency means Facebook can no longer operate in secrecy,” SumOfUs Campaigner Reem Sulieman explained in an email. Sulieman notes that, while Facebook does release regular reports on many of its interactions with governments around the world, much of the company’s decision-making process about how it handles those interactions remains, for many activists, frustratingly vague. “Facebook increasingly decides what the public sees and has become a news source for many. So if Facebook is making decisions about what news reaches the public, then it needs to be transparent about and be accountable for how those decisions are getting made.”
A Facebook spokesperson told the Daily Dot that the company has received the letter and is in the process of reviewing it. Sulieman added that some of the letter’s signatories have had conversations with Facebook officials about the issue, but those talks have yet to translate into concrete, public-facing actions on the part of the company.
In a recent blog post, Facebook officials noted that setting standards for what content is removed in different countries around the world is a complex process. “Whether an image is newsworthy or historically significant is highly subjective. Images of nudity or violence that are acceptable in one part of the world may be offensive—or even illegal—in another,” the post read. “Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community that is both safe and open to expression.”
Facebook vowed to “begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” the blog post continued. “We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”
Aaron Sankin is a former Senior Staff Writer at the Daily Dot who covered the intersection of politics, technology, online privacy, Twitter bots, and the role of dank memes in popular culture. He lives in Seattle, Washington. He joined the Center for Investigative Reporting in 2016.