A Tesla Model S reportedly operating in “Autopilot” mode slammed into the back of a fire truck that was attending to an accident on a freeway in California. The National Transportation Safety Board (NTSB) said on Tuesday that it has sent two investigators to probe the crash and determine who was at fault. The agency will focus on “driver actions and how the vehicle performed” when assessing the case.
The board said it was still gathering information about the incident, including whether the “Autopilot” feature was engaged when the accident took place and whether the driver was ignoring any warnings. The car struck the fire truck around 8:30am Monday on Interstate 405 in Culver City, California. A union representing the firefighters claimed in a tweet that the driver said his car was in Autopilot mode. The union included photos showing significant damage to both vehicles. The fire truck was taken out of service for repairs. Fortunately, no one was injured.
“Amazingly there were no injuries!” the union tweeted. “Please stay alert while driving!”
“While working a freeway accident this morning, Engine 42 was struck by a #Tesla traveling at 65 mph. The driver reports the vehicle was on autopilot. Amazingly there were no injuries! Please stay alert while driving! #abc7eyewitness #ktla #CulverCity #distracteddriving” (Culver City Firefighters, @CC_Firefighters, January 22, 2018)
Tesla responded by reiterating that its poorly named “Autopilot” feature is not capable of full-autonomous driving.
“Autopilot is intended for use only with a fully attentive driver,” a Tesla spokesperson told Jalopnik.
The California Highway Patrol and Culver City Fire Department told the San Jose Mercury News that the EV had rear-ended the truck, which was blocking the emergency/carpool lane at the scene of a previous accident. They could not confirm whether the car was in Autopilot mode.
Tesla’s Autopilot is a driver-assist system that includes lane assist and adaptive cruise control. Given its name and how Tesla advertises its vehicles as having “full self-driving hardware,” it’s not hard to see why people believe their electric cars are fully autonomous. The German government even urged Tesla to rename its Autopilot system and asked owners to learn the actual capabilities of their vehicles. To its credit, Tesla has included safety features that warn drivers to keep their hands on the wheel and pay attention to the road.
In 2016, the NTSB investigated the first fatal crash involving a “self-driving” car. It released a 500-page report last year that revealed former Navy SEAL Joshua Brown, 40, kept his hands off the wheel of his Tesla for extended periods of time despite warnings from the automated system. The Autopilot system in his EV failed to recognize the white side of a truck against a bright sky and crashed into it at 74 miles per hour.
Tesla gave a similar response at the time of that accident: “[Autopilot] is an assist feature that requires you to keep your hands on the steering wheel at all times… you need to maintain control and responsibility for your vehicle while using it.”
The Tesla incident comes days after an accident surfaced involving a Chevy Bolt, GM’s answer to the Tesla Model 3. GM is being sued by a motorcyclist, who claims the electric vehicle was operating in autonomous mode (what GM calls Cruise Automation technology) when it “suddenly veered back into [the motorcyclist’s] lane, striking [him] and knocking him to the ground.”
However, a report from the California Department of Motor Vehicles (DMV) blames the motorcyclist.
“The motorcyclist was determined to be at fault for attempting to overtake and pass another vehicle on the right under conditions that did not permit that movement in safety,” the DMV wrote.