CEO of SpaceX and Tesla and owner of Twitter, Elon Musk attends the Viva Technology conference dedicated to innovation and startups at the Porte de Versailles exhibition center on June 16, 2023 in Paris, France.

Chesnot | Getty Images

Tesla must submit extensive new records to the National Highway Traffic Safety Administration as part of its Autopilot safety investigation — or face heavy fines.

If Tesla fails to provide the federal agency with information about its advanced driver assistance systems, which are marketed as Autopilot, Full Self-Driving, and FSD Beta options in the U.S., the company faces “civil penalties of up to $26,315 per violation per day,” with a maximum of $131,564,183 for a continuing series of daily violations, according to NHTSA.

NHTSA launched an Autopilot safety investigation in 2021 after it identified a series of accidents in which Tesla Autopilot vehicles collided with stationary first responders and road work vehicles.

So far, none of Tesla’s driver-assistance systems are autonomous, and the company’s cars can’t function as an automated vehicle like those powered by Cruise or Waymo. Instead, Tesla cars require a driver behind the wheel, ready to steer or brake at any time. Autopilot and FSD only control braking, steering, and acceleration in limited circumstances.

Among other details, the federal vehicle safety regulator wants information about the versions of Tesla’s software, hardware, and other components that were installed in every vehicle sold, leased, or in use in the United States from model years 2014 to 2023, as well as the date on which any vehicle was admitted into Tesla’s “Full Self-Driving beta” program.

The company’s FSD Beta consists of driver-assistance features that have been tested internally but are not fully debugged. Tesla uses its customers as safety testers for software and vehicles via the FSD Beta program, rather than relying on professional safety drivers, as is the industry standard.

Tesla previously conducted voluntary recalls of its cars due to issues with Autopilot and FSD Beta and promised to provide over-the-air software updates that would address the issues.

A notice on the NHTSA website in February 2023 said Tesla’s FSD Beta driver-assistance system could “allow a vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

According to data tracked by NHTSA, there have been 21 known collisions resulting in deaths involving Tesla cars equipped with the company’s driver-assistance systems, more than any other automaker offering a similar system.

In a separate letter on Thursday, NHTSA also denied a petition from auto safety researcher Ronald Belt, who had asked the agency to reopen an earlier investigation to determine the causes behind the “sudden unintended acceleration” events reported to NHTSA.

In sudden unintended acceleration events, a driver may be either stopped or driving at a normal speed when their vehicle surges forward unexpectedly, which can result in a collision.

Tesla’s vice president of vehicle engineering, Lars Moravy, did not immediately respond to a request for comment.

Read NHTSA’s full letter to Tesla requesting extensive new records.
