An ongoing federal investigation into how Tesla Full Self-Driving (Supervised) performs in low-visibility conditions was upgraded this week to a level that could prompt a recall affecting as many as 3.2 million vehicles.
The National Highway Traffic Safety Administration’s Office of Defects Investigation said in a filing Wednesday that it was upgrading the inquiry, opened in 2024, to an “engineering analysis,” which will examine the complaints and the product in greater depth. That is the step just before the U.S. agency recommends a recall or other action if a defect needs to be remedied.
The investigation covers Teslas equipped with Full Self-Driving (Supervised) and will determine whether the cameras in the advanced driver assistance system can detect degraded roadway conditions and alert the driver in sufficient time. Nine crashes have been associated with this specific case, including a fatality in 2023 that prompted the initial investigation.
NHTSA says it will test the system under similar road conditions and evaluate the updates Tesla has issued to date. In its statement, the agency said it is looking at models using the entirely camera-based Tesla Vision system, which began rolling out in mid-2021 and includes a degradation detection feature. Tesla began developing an update to that feature the day after it reported the fatal 2023 crash to NHTSA.
Affected vehicles include the Model 3, Model S, Model X, Model Y, and Cybertruck equipped with supervised FSD.
The upgraded investigation is the latest hiccup this year for supervised FSD. A judge earlier this month upheld a $243 million verdict against Tesla over a fatal 2019 crash; the automaker was held partially liable after a Model S on Autopilot struck a parked vehicle at 50 mph, killing one person and injuring another, both of whom were standing outside the stationary SUV. A Cybertruck owner in Texas recently filed suit against the automaker after the vehicle crashed into a highway barrier last year while using FSD (Supervised). The suit claims the company was negligent in hiring and retaining Elon Musk as CEO while allowing him to be involved in product and design decisions, to the point of allegedly “[overriding] the concerns of engineers at Tesla.”
Tesla also ran into trouble with the California Department of Motor Vehicles over its use of the terms “Autopilot” and “Full Self-Driving.” The state agency determined the names amounted to false advertising and threatened to bar the automaker from selling cars through its California sales outlets; Tesla subsequently added “Supervised” to FSD and stopped using the Autopilot name altogether. Last month, Tesla sued the California DMV.
Source: Gizmodo