Even though Tesla boss Elon Musk has been certain for years that fully autonomous driving is just around the corner, the technology is not quite there yet. And even though a Tesla running the current version of the "Full Self-Driving" software can complete long journeys largely without assistance, the driver must remain alert, touch the steering wheel at regular intervals and keep their eyes on the road. If these rules are violated, the car issues audible warnings. Among Tesla drivers, this is called "nagging" – and it is widely hated.
A Tesla hacker who calls himself "Green" claims to have discovered a hidden mode that ends the nagging. He calls it "Elon Mode", but has no proof that this is its real name. "The Verge" describes "Green" as someone with a history of finding secret features and innovations in Tesla vehicles before the manufacturer has officially talked about them.
The software expert recorded a lengthy video of "Elon Mode" in use. It shows the car sparing him warnings and prompts over an extended distance while covering the highway on its own.
"Elon Mode" is the complete opposite of what Tesla normally demands of its drivers. The manufacturer is legally obliged to monitor attentiveness and to ensure that people do not fall asleep at the wheel – as has happened several times in the past. In the standard vehicle setup, Tesla even goes as far as disabling the $15,000 option if drivers ignore the warnings too often.
However, the monitoring system is apparently not fully mature – it issues warnings, for example, when the person behind the wheel is wearing a peaked cap – so many drivers find it intrusive and annoying.
It is no secret that Tesla is working to scale back driver alerts. As early as December last year, Musk announced that he wanted to switch off the attention prompts. It is therefore quite possible that such a mode is being tested internally – and is simply not yet available for all vehicles.
The hacker's Twitter posts suggest that he discovered the mode on what appears to be a company vehicle.
"Full Self-Driving" in "Elon Mode" does not seem ready for the general public anyway. "Green" reports that his car changed lanes unnecessarily often, at times failed to overtake despite a clear lane, and was completely confused by construction-zone signage.
Such behavior is also described in numerous reports that repeatedly highlight the accident risk posed by an autonomous Tesla in certain situations. The "Washington Post" recently published a detailed article showing that the number of accidents connected to the assistance systems is considerable. A journalist from the "New York Times Magazine" confirmed this, having experienced dangerous situations as a passenger in several Teslas running the software – situations that were clearly caused by malfunctions.
In addition, there are ongoing investigations by the US traffic safety authority NHTSA, which is currently examining accidents said to be related to Autopilot. When Musk first announced in December that he wanted to switch off the monitoring mechanisms, the investigators immediately contacted Tesla, as "Reuters" reported.
On the upside, "Green" explains that his method of unlocking "Elon Mode" is very complicated. It should therefore be almost impossible for anyone to activate this experimental part of the software by accident. Unless, that is, you know someone at the manufacturer and ask nicely, he added when asked where the option can be found.