Facebook’s AI could prevent you from uploading private photos, including nudes
As Facebook gets smarter, it will tell you not to upload pictures of your kids.
Now, Facebook is going beyond facial recognition on its app and website by weaving its machine learning into your camera roll: it will analyze faces before they're even uploaded to the social network. Of course, this only happens if you opt in to Photo Magic, the Messenger feature that prompts you to send friends the camera-roll photos you might otherwise forget about. But by doing so, you're voluntarily trading a bit of privacy for convenience.
Just as much as Facebook wants you to share photos of your friends' faces, it also wants to help you avoid sharing photos with certain audiences. Jay Parikh, Facebook's vice president of engineering, spoke at an event in London last week and described the new ways Facebook will be able to encourage or discourage uploading certain pictures, Business Insider reported.
The company is building software that can alert people before they upload a photo they might not want everyone to see. This includes nude photos, or photos of your kids. Facebook’s AI research team has been working on this drunken-selfie prevention for a while. As Parikh describes it:
If I were to upload a photo of my kids playing at the park and I accidentally had it shared with the public, this system could say hey wait a minute, this photo is of your kids. Normally you post this to just your family members. Are you sure you want to do this?
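The flow Parikh describes has three recognizable steps: identify who is in the photo, compare the chosen audience against the audience the user usually picks for photos of those people, and warn on a mismatch. Facebook has not published how its system works, so the sketch below is purely illustrative; every function name, the tag-based stand-in for face recognition, and the audience model are assumptions, not Facebook's actual design.

```python
# Illustrative sketch only -- Facebook's real system is not public.
# Face recognition is simulated by pre-supplied tags; the "history"
# mapping of person -> usual audience is an invented stand-in.

def recognized_people(photo_tags):
    """Stand-in for a face-recognition step: returns the set of
    people a model believes appear in the photo."""
    return set(photo_tags)

def usual_audience(history, people):
    """Return the single audience the user has historically chosen
    for photos of these people, or None if it is mixed/unknown."""
    audiences = {history[p] for p in people if p in history}
    return audiences.pop() if len(audiences) == 1 else None

def upload_warning(photo_tags, chosen_audience, history):
    """Return a warning string if the chosen audience is broader
    than the user's usual choice for these people, else None."""
    people = recognized_people(photo_tags)
    usual = usual_audience(history, people)
    if usual and usual != "public" and chosen_audience == "public":
        return (f"This photo appears to show {', '.join(sorted(people))}. "
                f"You normally share photos like this with '{usual}'. "
                "Are you sure you want to post it publicly?")
    return None

# A parent who usually shares photos of the kids with family only:
history = {"kid_a": "family", "kid_b": "family"}
print(upload_warning({"kid_a", "kid_b"}, "public", history))   # warning text
print(upload_warning({"kid_a"}, "family", history))            # None
```

The key privacy point the article goes on to make is visible even in this toy version: the warning is only possible because the system already knows whose faces are in the photo.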
Sure, alerting users to potential privacy flubs might prevent embarrassing posts. It's easy to forget you've changed the audience on a status update or photo to "public," and especially when it comes to kids, privacy should be as locked down as parents are comfortable with.
But when you think about it this way, it becomes a bit creepier: the only way Facebook can prevent you from sharing photos of your children with a public audience is by recognizing your child’s face. Before your kid has any opportunity to decide for herself whether or not she wants to be added to a massive database of images that can be identified using algorithms, she’s already been scooped up in the service.
Anyone who uses the Internet has experienced this trade-off already. We give companies our personal information in exchange for using a service for free, and all of the convenience that comes with it. However, as technology becomes more advanced—facial recognition in camera rolls, tools that keep a sext from circulating online, personal assistants trained to get you whatever you need whenever you need it—it's important to weigh the benefits and decide just how much you value convenience over privacy.
Facebook is investing heavily in artificial intelligence research. The company recently expanded its Facebook AI Research (FAIR) organization to Paris, and is working on teaching computers to behave like humans. Facebook's AI has become so advanced that it can recognize people even when they try to hide their faces with a hand or a piece of clothing.
It’s unlikely that the over one billion people who log in to Facebook on a daily basis think much about the potential for AI to change the way we interact with computers, or invade our privacy. But leading scientists and technologists are thinking about it, and they want to make sure AI isn’t used for nefarious purposes. Earlier this year, the Future of Life Institute called for a ban on creating offensive autonomous weapons; signatories of the letter included Elon Musk and Stephen Hawking, two prominent figures who have warned of the potential downsides of AI.
We’re still a long way off from computers thinking like people. Parikh said development of the upload warning and prevention tools is still “really early on,” and the features haven’t made it into products yet. However, as this technology continues to develop, and as Facebook open-sources it for anyone to use, it’s worth thinking about whether or not you’re comfortable with your face, or your children’s faces, being recognized by computers.
If not, you can always opt yourself out. For now, at least.
H/T Business Insider | Illustration by Jason Reed
Selena Larson is a technology reporter based in San Francisco who writes about the intersection of technology and culture. Her work explores new technologies and the way they impact industries, human behavior, and security and privacy. Since leaving the Daily Dot, she's reported for CNN Money and done technical writing for cybersecurity firm Dragos.