Stephen Hawking and Elon Musk are worried about weapons that think for themselves
They want to prevent our annihilation.
A leading artificial-intelligence group is calling for a ban on the development of offensive autonomous weapons, and in a sign of how many people are worried about this threat, the open letter has attracted hundreds of signatures from the brightest minds in artificial intelligence and robotics.
Among the signatories on the letter from the Future of Life Institute are renowned physicist Stephen Hawking, Tesla founder Elon Musk, and Apple co-founder Steve Wozniak. The letter is set to be presented at the International Joint Conferences on Artificial Intelligence in Buenos Aires on Tuesday.
The letter, which also attracted signatures from Noam Chomsky and a host of other high-profile thinkers and researchers, urges world powers to be cautious in using artificial intelligence to create weapons.
The letter draws a bright line between weapons that require human guidance and autonomous weapons that can operate entirely on their own.
Cruise missiles and remotely piloted drones are acceptable, the signatories said, because humans still make the targeting decisions. However, “armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria” would be a step too far, as that would remove humans from the decision-making chain.
According to the letter, automatic targeting systems are expected to be feasible within the next few years. The race to develop the technology will lead to autonomous weapons becoming the “Kalashnikovs of tomorrow”—cheap, easy to acquire, and easy to mass produce.
The letter’s signatories urged immediate action to ban these systems in order to avoid a “military AI arms race.”
Autonomous weapons development, the letter argues, would result in technology that is “ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.”
Researchers and scientists involved in the development of artificial intelligence also made their case by comparing autonomous weapons to chemical and biological ones.
“Just as most chemists and biologists have no interest in building chemical or biological weapons,” they said, “most AI researchers have no interest in building AI weapons—and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits.”
Hawking and Musk have been particularly vocal in their opposition to weaponized AI. Hawking has warned that the development of AI could lead to the end of humanity, and Musk, who has compared AI advancements to “summoning the demon,” has donated $10 million to research on AI safety.
AJ Dellinger is a seasoned technology writer whose work has appeared in Digital Trends, International Business Times, and Newsweek. In 2018, he joined Gizmodo as the nights and weekend editor.