A program that uses a machine learning algorithm to remove women’s clothing from photos was pulled offline after its creator received widespread attention and backlash.
The DeepNude app, first reported on by Motherboard Wednesday, digitally places realistic-looking breasts and female genitals on pictures of clothed women with the click of a button. The app’s anonymous creator, who goes by the alias “Alberto,” reportedly trained an algorithm on 10,000 photos of nude women to teach his program how to make subjects appear naked.
Photos produced by DeepNude included a large watermark unless users paid $50, in which case the watermark was removed.
“I’m not a voyeur, I’m a technology enthusiast,” the creator told Motherboard. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That’s why I created DeepNude.”
When questioned on the ethics of his app, Alberto argued: “If I don’t do it, someone else will do it in a year.”
But the developer’s tune changed shortly after the website for his app began crashing due to overwhelming traffic. The app, which does not work on photos of men, also received criticism for objectifying women.
Alberto wrote on the DeepNude Twitter account early Thursday that he will not continue working on the app.
“Despite the safety measures adopted (watermarks), if 500,000 people use it the probability that people will misuse it is too high,” the statement said.
Although the app was taken down, users who already downloaded the software can still use it and spread the images it generates online. And now that the concept has surfaced, copycat apps are likely to follow.
Mikael Thalen is a tech and security reporter based in Seattle, covering social media, data breaches, hackers, and more.