US investigates Tesla Autopilot recall

The National Highway Traffic Safety Administration also released an analysis of crashes involving the system that found at least 29 fatal crashes over five and a half years.

The main federal auto safety agency said Friday it is investigating Tesla's recall of its Autopilot driver assistance system because regulators fear the company has not done enough to ensure that drivers remain attentive when using the technology.

The National Highway Traffic Safety Administration said in documents published on its website that it was investigating Tesla's December recall of two million vehicles, which covered nearly every car the company has made in the United States since 2012. The safety agency said it was concerned about accidents that occurred after the recall and the results of preliminary tests of recalled vehicles.

The agency also released an analysis revealing that there were at least 29 fatal accidents involving Autopilot and a more advanced system that Tesla calls Full Self-Driving between January 2018 and August 2023. In 13 of these fatal accidents, the fronts of Teslas hit objects or people in their path.

The recall investigation and the new accident data add to the list of headaches for Tesla, the leading electric automaker in the United States. The company's sales in the first three months of the year fell more than 8% from a year earlier, the first such decline since the early days of the coronavirus pandemic.

Tesla announced in December that it would recall its Autopilot software after an investigation by the auto safety agency found the automaker had not implemented enough protective measures to ensure that the system, which can accelerate, brake and otherwise control cars, was used safely by drivers, who are expected to be ready at any moment to take back control of their cars from the automated system.

In its analysis of Tesla accident data, the safety agency found that when the company's cameras, sensors and software failed to spot obstacles in the car's path and drivers did not compensate for that failure quickly enough, the consequences were often catastrophic.
