Two anonymous sources speaking to The Information said the autonomous software classified Herzberg as a "false positive" and instructed the car to continue driving. The self-driving Volvo XC90 struck Herzberg, 49, killing her. Uber had a safety driver in the vehicle, as required by law, but police footage shows the employee looking down moments before the crash.
Uber programs the software powering its self-driving cars to flag false positives: objects the sensors detect but the system judges harmless, such as a plastic bag floating across the road or a speed bump. Uber executives say the system was tuned to react less to these types of objects. It appears the tuning went too far.
Designing a system that determines how a car reacts to what it sees on the road requires a fine balance between comfort and safety. If the system is programmed to err on the side of caution, it might brake for every piece of trash that blows across the road, potentially causing more accidents. But if the tolerances are relaxed to ignore more objects, the car may fail to stop even when it detects a person or obstacle that poses a real danger.
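To make the tradeoff concrete, here is a minimal, hypothetical sketch of threshold tuning in a detection pipeline. This is not Uber's actual software; the `Detection` class, `should_brake` function, and all scores are invented for illustration. The point is simply that one tuned number decides whether a borderline detection triggers braking or gets dismissed as a false positive.

```python
# Hypothetical illustration of the comfort/safety threshold tradeoff.
# None of this reflects Uber's real system; names and scores are invented.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str
    hazard_score: float  # 0.0 (clearly harmless) to 1.0 (certain hazard)


def should_brake(detection: Detection, threshold: float) -> bool:
    """Brake only when the detection's hazard score clears the tuned threshold."""
    return detection.hazard_score >= threshold


plastic_bag = Detection("plastic bag", 0.15)
pedestrian = Detection("pedestrian", 0.60)

# Cautious tuning (low threshold): the car brakes for everything,
# including harmless debris -- uncomfortable, possibly unsafe in traffic.
print(should_brake(plastic_bag, 0.10), should_brake(pedestrian, 0.10))

# Over-relaxed tuning (high threshold): the pedestrian detection is
# dismissed as a false positive along with the bag.
print(should_brake(plastic_bag, 0.70), should_brake(pedestrian, 0.70))
```

With the low threshold both detections trigger braking; with the high threshold neither does, which is the failure mode the sources describe: a real pedestrian classified as something the car should ignore.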
Uber halted its self-driving testing program on public roads after the accident. Other big names, including Toyota, followed. The ride-hailing giant is currently conducting a “top-to-bottom” safety review of its vehicles and brought in the former chair of the National Transportation Safety Board, Christopher Hart, to advise the program.
“Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon,” the company said in a statement.
The fatal crash raised questions about the safety practices Uber has in place for its self-driving cars. Most other companies testing autonomous vehicles use two safety drivers; Uber uses only one. A Reuters report also found that Uber equips its Volvo cars with fewer LIDAR sensors than the Ford Fusion models it previously deployed, a change the report says leaves the vehicles with "more blind spots than its own earlier generation."
While the race to launch the first self-driving vehicle continues to heat up, the participants, including Waymo, Ford, and GM, rely on one another to uphold the technology's image. Otherwise, the public may be reluctant to adopt it once it is finally ready for prime time.
Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.