The US traffic safety authority has opened a new investigation into Tesla’s “Autopilot” driver assistance system, examining whether an “Autopilot” update from December 2023 is sufficient to allay its safety concerns. In a multi-year investigation, the NHTSA (National Highway Traffic Safety Administration) concluded that “Autopilot” made it too easy for drivers to leave control entirely to the system, even though they are required to keep an eye on the traffic situation at all times.
The NHTSA analyzed a total of 956 accidents from January 2018 to August 2023, 29 of which resulted in fatalities. Many of the accidents would have been avoidable had the drivers been attentive, the authority emphasized in its report. In 59 of the 109 collisions with enough data for such an analysis, the obstacle was visible at least five seconds before the impact. As an example, the NHTSA cited a March 2023 accident in which a minor who had just gotten off a school bus was hit and seriously injured by a Model Y.
Gaps in Tesla’s collection of vehicle data
With the over-the-air update, carried out as an official recall campaign, Tesla introduced additional alerts for drivers, among other things. The electric car manufacturer points out that “Autopilot” does not make a Tesla a self-driving car and that the person behind the wheel must be ready to take control at any time. The US accident investigation agency NTSB had warned that drivers were relying too heavily on the technology.
The NHTSA also noted in its report that gaps in Tesla’s collection of vehicle data make it difficult to determine the actual number of “Autopilot” accidents. For the most part, the carmaker receives crash data only when airbags or seatbelt pretensioners are triggered.
According to general accident statistics from 2021, that happens in only 18 percent of all collisions reported to the police. Data transmission to Tesla also requires that a mobile network be available and that the antenna still work after the accident. In many cases, however, electric cars burn out after accidents because their batteries catch fire.
Criticism of the term “Autopilot”
The NHTSA also criticized the system’s name: the term “Autopilot” could lead drivers to overestimate the software’s capabilities and rely on it too much. US drivers can currently use a more advanced version of “Autopilot”, called “Full Self-Driving” (FSD), in a test version.
However, even FSD does not officially make the car an autonomous vehicle and requires constant human attention. Tesla recently appended “(Supervised)” to the name. This week, company boss Elon Musk once again promised self-driving Tesla cars; he plans to present a robotaxi at the beginning of August.
The standard “Autopilot” system can maintain speed and distance to the vehicle ahead as well as keep the car in its lane. The FSD version is also supposed to handle traffic lights, stop signs, and right-of-way rules at intersections, among other things. According to the report, US Senators Edward Markey and Richard Blumenthal called on the NHTSA to restrict the use of Autopilot to the roads for which the system was designed.