Secret Tesla configuration disables ‘nag’ for Autopilot, FSD


The Tesla Model Y is seen in a Tesla parking lot on May 31, 2023 in Austin, Texas.

Brandon Bell | Getty Images

A security researcher who goes by the handle “GreentheOnly” found a secret configuration in Tesla cars that the company can enable to allow a driver to use Tesla’s advanced driver assistance systems, marketed as Autopilot and Full Self-Driving, without keeping their hands on the steering wheel for extended periods.

When this mode is enabled in a Tesla, it eliminates what car owners refer to as the “nag.” The researcher said he nicknamed the feature “Elon Mode,” but that is not the company’s internal designation for it.

Tesla does not offer a self-driving car today. CEO Elon Musk has promised to deliver one since at least 2016, and said at the time that a Tesla would be able to complete a demonstration drive across the U.S. without human intervention by the end of 2017.

Instead, Tesla’s driver assistance systems require the human driver to remain alert and ready to brake or steer at a moment’s notice.

Typically, when a Tesla driver is using Autopilot or FSD (or their variations), a visual symbol blinks on the car’s touchscreen to prompt the driver to apply resistance to the steering wheel at frequent intervals. If the driver does not grip the steering wheel, the nag escalates to a beeping sound. If the driver still does not apply torque to the steering wheel at that point, the vehicle can temporarily disable the use of Autopilot for up to several weeks.

Musk said in a tweet last December that he would remove the “nag” for at least some Tesla owners in January. That plan never came to fruition. In April 2023, Musk said in a tweet, “We’re gradually reducing it, commensurate with improved safety,” referring to the nag.

The security researcher who exposed “Elon Mode,” whose identity is known to both Tesla and CNBC, asked to be identified only by his pseudonym, citing privacy concerns.

He has been testing features of Tesla cars for years, owns a Tesla Model X, has regularly reported bugs to the company, and has earned tens of thousands of dollars from successful Tesla bug bounty claims, as previously reported.

“Unless you work for Tesla, or have access to relevant databases at the company,” the white hat hacker said in an interview by direct message Tuesday, “there’s no way to know how many cars have ‘Elon Mode’ today.”

In February, Tesla issued a voluntary recall in the United States covering 362,758 of its vehicles, warning that its Full Self-Driving Beta system could cause crashes. (This was the second such recall.) Tesla delivered an over-the-air software update to address the issues.

The FSD Beta system at that time could cause crashes, the safety recall report said, by allowing affected vehicles to “behave unsafely around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

GreentheOnly said he expects future recalls related to FSD Beta issues, including the degree to which the system automatically stops for “traffic control devices” such as traffic lights and stop signs.

According to the latest available data from the National Highway Traffic Safety Administration, Tesla has reported 19 crashes to the agency that resulted in at least one fatality and in which the company’s driver assistance systems were in use within 30 seconds of the collision.

In total, Tesla has reported 21 fatal accidents to NHTSA involving cars equipped with its driver assistance systems.

Tesla did not immediately respond to a request for comment.


