Are self-driving cars really safer? In the eight months that Google’s 50 self-driving cars have been on the road, there have been four accidents, reports the AP. But according to the sources quoted in the story, half of these accidents were really caused by human error—not the car’s robotics at all.
Chris Urmson, the director of Google’s self-driving car program, fired back against the report in a Medium post on Monday, saying that in the six years since the project began in California and Nevada, there have been only “11 minor accidents (light damage, no injuries).” Urmson stipulates that “not once was the self-driving car the cause of the accident.” Still, most details remain unclear because accident reports are protected under California state law.
According to Urmson’s post, the self-driving cars have been rear-ended by other drivers seven times and also have been hit by another vehicle rolling through a stop sign at least once.
The human errors do back up some controversial comments Tesla CEO Elon Musk made in March. Speaking at a conference, Musk told the audience that humans driving cars will eventually become illegal.
“It’s too dangerous,” Musk said. “You can’t have a person driving a two-ton death machine.”
Jalopnik, Gawker’s auto-focused blog, simultaneously applauded and chided Google for attempting to answer some of the questions the AP report raised. The site’s writer Damon Lavrinc noted that although the post cleared up some confusion, calling it “a solid step,” more transparency is necessary if the company truly wants to push self-driving cars into the mainstream.
Photo via Bradley P Johnson/Flickr (CC BY 2.0)
Myles Tanzer is a former contributor to the Daily Dot with an emphasis on technology and viral news. He is currently the Fader's news editor, having previously written and edited for Vogue, BuzzFeed, and Gawker.