techy2610/Flickr (Public Domain)
Virginia’s 2014 law banning the distribution of revenge porn has been updated to include deepfakes, or videos in which software and machine learning can create synthetic media of any person.
TechCrunch reports that the law, effective Monday, makes it a Class 1 misdemeanor to share nude videos or photos of someone without their consent, even if the person depicted is not actually in the original media. Photoshopped images and other manipulated media are also covered by the law. Anyone who shares this content with the intent to “coerce, harass or intimidate” can face a fine of up to $2,500 and up to one year in jail, according to TechSpot.
The updated bill was passed by the Virginia General Assembly in March, CNET reports, and signed by Gov. Ralph Northam. It goes into effect shortly after the Linux and Windows app DeepNude was pulled by its creator in June, four days after it went live, due to backlash. The app used machine learning to produce fake nude images of women from photos of them fully clothed.
Privacy and defamation are the largest concerns prompting new legislation that covers deepfakes. The faces of famous women and the bodies of porn stars have been combined to create manipulated pornography that was shared on Reddit. Recently, an altered video of Nancy Pelosi showed her “slurring her words and appearing intoxicated.”
DeepNude was neither the first nor the last program of its kind. The desktop program FakeApp brought deepfakes to the masses and remains available for download. Forty-six states currently have laws barring the distribution of revenge porn, but only a handful have set out to legislate deepfakes.
Brooke Sjoberg is an editorial intern for the Daily Dot studying journalism at the University of Texas at Austin. She is also the Daily Texan's Life and Arts Editor and an editorial intern for Texas Connect magazine.