GitHub bans copies of ‘DeepNude’ app that undresses women in photos
But the copies continue to spread online.
GitHub is banning copycat versions of DeepNude, an app that uses neural networks to remove women’s clothing from photos.
DeepNude’s creator pulled the app offline after conceding that “the probability that people will misuse it is too high.”
“We don’t want to make money this way,” the creator wrote in a statement on Twitter. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones to sell it.”
Despite the creator’s last-minute attempt to keep the app from spreading, countless users had already downloaded the tool and begun creating their own versions of it. The code for those new versions was uploaded to GitHub and downloaded further.
GitHub responded by removing the new apps from its site, pointing to its Community Guidelines.
“Don’t post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited,” the guidelines read. “We recognize that sexuality is a part of life and non-pornographic sexual content may be a part of your project or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualization of minors.”
While GitHub’s action sends a strong signal about its stance on the issue, the proverbial genie is already out of the bottle: the apps have only continued to spread through other online channels.
DeepNude, which was programmed to work only on photos of women, raises serious questions about how society handles such programs.
Lawmakers have attempted to tackle similar issues such as deepfakes, which have been used to place female celebrities and everyday women into pornographic videos. At the beginning of July, Virginia updated a 2014 law banning the distribution of revenge porn to also include deepfake videos.
Mikael Thalen is a tech and security reporter based in Seattle, covering social media, data breaches, hackers, and more.