Autonomous blind flight
The current investigation was prompted by a fatal accident in 2023; a recall of 3.2 million vehicles is possible.
When visibility is poor, drivers are supposed to intervene, but Tesla’s Full Self-Driving system often fails to warn them in time.
The US traffic safety authority NHTSA comes to a worrying conclusion in a recent report: Tesla’s self-driving software (Full Self-Driving Supervised, FSD) apparently does not correctly recognize when the vehicle’s cameras no longer perceive the surroundings well enough to react to dangerous situations.