Twitter will now remove images of deceased loved ones by request
Twitter employee Nu Wexler announced the site’s new policy for removing images and video of deceased people late yesterday, following the recent news that Robin Williams’ daughter was leaving Twitter after harassment related to her father’s death.
In a new addendum to a Help Center article about deleting deceased people’s accounts, Twitter explains its policy for the “removal of certain imagery” related to these individuals.
“In order to respect the wishes of loved ones, Twitter will remove imagery of deceased individuals in certain circumstances,” the addendum reads. “Immediate family members and other authorized individuals may request the removal of images or video of deceased individuals, from when critical injury occurs to the moments before or after death, by sending an e-mail to [email protected].”
Removing the images that Twitter trolls used to harass Williams’ daughter seems like a fairly open-and-shut case of Twitter upholding its standards for acceptable conduct. What has touched off some controversy, however, is the latest deployment of the new policy, which has seen Twitter deleting photos, videos, and even entire Twitter accounts to erase multimedia related to the death of photojournalist James Foley.
News of Foley’s death first broke when the terrorist group Islamic State in Iraq and Syria, or ISIS, released a video of his beheading. Almost immediately, Twitter deleted the account of Radio Sawa Washington correspondent Zaid Benjamin, who was the first to post the video. (Benjamin’s account was restored shortly thereafter.)
The Washington Post reports that Twitter began removing content related to Foley’s murder after being contacted by the State Department, which asked the company and other sites to “take appropriate action consistent with their stated usage policies.”
Twitter’s policy states: “When reviewing such media removal requests, Twitter considers public interest factors such as the newsworthiness of the content and may not be able to honor every request.”
In the case of the Foley beheading, Twitter is erasing the media wherever it can find it. Twitter CEO Dick Costolo wrote, “We have been and are actively suspending accounts as we discover them related to this graphic imagery. Thank you.”
We have been and are actively suspending accounts as we discover them related to this graphic imagery. Thank you https://t.co/jaYQBKVbBF
— dick costolo (@dickc) August 20, 2014
By contrast, the company has not been removing videos of ISIS beheading Syrians as those videos surface online.
Facebook has a similar policy for removing the accounts of deceased people, but it has not published any policy updates for removing media related to deaths or dead individuals. A Facebook spokesperson pointed the Daily Dot to the company’s Community Standards page, which has a section for graphic content:
In many instances, when people share this type of content, it is to condemn it. However, graphic images shared for sadistic effect or to celebrate or glorify violence have no place on our site.
This suggests that Facebook will permit the video to be shared by those commenting on it as a news story but not by ISIS sympathizers.
“For graphic videos,” the page warns, “people should warn their audience about the nature of the content in the video so that their audience can make an informed choice about whether to watch it.”
Image via Andreas Eldh/Flickr (CC BY 2.0)
Eric Geller is a politics reporter who focuses on cybersecurity, surveillance, encryption, and privacy. A former staff writer at the Daily Dot, Geller joined Politico in June 2016, where he's focused on policymaking at the White House, the Justice Department, the State Department, and the Commerce Department.