Photo via Usa-Pyon/Shutterstock (Licensed)
Regulations could help avoid a ‘third revolution in warfare.’
The open letter, penned by 116 founders of AI companies from 26 countries, warns of the potentially devastating effects of an arms race for weapons that can think for themselves.
“Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,” the letter says. “These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”
The letter was presented at the International Joint Conference on Artificial Intelligence in Melbourne on the same day the U.N.'s newly established Group of Governmental Experts on Lethal Autonomous Weapon Systems was scheduled to meet for the first time. The letter explains that the meeting was canceled "due to a small number of states failing to pay their financial contributions to the UN," and presses the intergovernmental organization to redouble its efforts at its rescheduled meeting in November.
The statement ends on an ominous note: "We do not have long to act. Once this Pandora's box is opened, it will be hard to close."
SpaceX CEO Elon Musk is among those who signed the letter, along with Mustafa Suleyman, co-founder of Google's DeepMind. Musk's involvement should come as no surprise. He has become the unofficial figurehead of the push to regulate the development of artificial intelligence, calling it humankind's "biggest existential threat" in a 2014 interview. In 2015, he founded the nonprofit organization OpenAI with Y Combinator President Sam Altman to encourage the creation of friendly AI and to neutralize the threat of malicious actors.
The Tesla CEO recently got into a war of words with Mark Zuckerberg over AI, claiming the Facebook founder had a “limited understanding” of the technology after Zuck called out AI “naysayers” like Musk in a live video stream.
You can read the full note at the Future of Life Institute page.
H/T Business Insider
Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.