US billionaire and self-confessed Tesla critic Dan O’Dowd spends considerable money and effort on damaging the reputation of Tesla’s vehicles and, above all, their software. He is not doing this entirely unselfishly: O’Dowd also develops software for vehicles, and his attacks on Elon Musk and Tesla are meant to show that his own product is better. Keep that in mind for any test involving his Dawn Project. Nevertheless, what the YouTube channel “AI Addict” now shows does not reflect well on Tesla or on the $15,000 “Full Self-Driving” software package, FSD for short.

FSD is supposed to take over almost all driving tasks, while the driver stays attentive, and to get the car from A to B largely on its own, parking included. According to the video, around 400,000 people are taking part in the software’s beta phase. You read that right: it is officially unfinished test software that is being tested by drivers on public roads. And that is really not going smoothly. A recent report by the New York Times Magazine showed how many misjudgments and problems the system produces even at the simplest intersections (read more here). And the US traffic safety agency NHTSA has been investigating for months because of numerous accidents that are apparently connected to the system, for example a pile-up in a city tunnel (find out more here).

Officially, every buyer of the software has five points. If the person behind the wheel breaks the rules, points are deducted; whoever reaches zero is banned from the beta. Inattentive behavior at the wheel can also lead to deductions, because Tesla is obliged to monitor driver attention. The video shows how strictly the manufacturer actually enforces this. Tester John Bernal, who according to “CNBC” once worked at Tesla and was fired at the beginning of 2022 because of his videos, managed to outwit the system several times without consequence.
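To make the described rules concrete, here is a minimal sketch in Python of how such a five-point strike counter could work in principle. The class, the events and the behavior at zero points are illustrative assumptions based only on the article, not Tesla’s actual implementation.

```python
# Illustrative sketch of the five-point strike system described above.
# Names and events are assumptions for clarity, not Tesla's code.

class BetaAccess:
    def __init__(self, points: int = 5):
        self.points = points      # every beta participant starts with five points
        self.banned = False

    def record_violation(self, reason: str) -> None:
        """Deduct one point for rule-breaking or inattentive behavior."""
        if self.banned:
            return
        self.points -= 1
        print(f"Strike for {reason}: {self.points} point(s) left")
        if self.points <= 0:
            self.banned = True    # at zero points the driver is removed from the beta
            print("Access to the FSD beta revoked")


driver = BetaAccess()
driver.record_violation("hands off the steering wheel")
driver.record_violation("ignored attention warning")
```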

Sometimes a bear sits in Bernal’s driver’s seat, sometimes a unicorn, sometimes a balloon shaped like a champagne bottle. In the end, nothing at all. Since Tesla monitors not only the seat but also the weight and the contact with the steering wheel, two weights were used as well. The result: the camera responsible for monitoring the interior is satisfied in every case, and with the driving assistant active the car follows the planned route.
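The weakness the video exploits can be thought of as a handful of independent plausibility checks that a stand-in can satisfy one by one. The following sketch assumes the monitoring boils down to roughly these three signals with made-up thresholds; that is a simplification for illustration and is not confirmed by Tesla.

```python
# Simplified model of the attention checks shown in the video: cabin camera,
# seat weight and steering-wheel contact. Thresholds are invented; the point
# is that each check can be passed by a prop instead of a human driver.

from dataclasses import dataclass

@dataclass
class CabinState:
    camera_sees_occupant: bool   # a bear, unicorn or balloon also registers as "occupant"
    seat_weight_kg: float        # a weight placed on the seat passes this check
    wheel_torque_nm: float       # a weight hung on the wheel mimics hand contact

def driver_considered_attentive(state: CabinState) -> bool:
    return (
        state.camera_sees_occupant
        and state.seat_weight_kg > 25.0
        and state.wheel_torque_nm > 0.5
    )

# The spoofed setup from the video: no human on board, yet every check passes.
print(driver_considered_attentive(CabinState(True, 40.0, 1.0)))   # True
```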

Bernal uses the test as an opportunity to check the system in situations in which even the most rudimentary software would have to react. On a closed road, he repeatedly lets the car drive straight ahead at a constant speed; to rule out manipulation, he does this with several vehicles. The test: a person uses a rope to pull a figure resembling a child across the street, later joined by a figure that looks like a dog. The task: do not run them over. The result: again and again the Tesla flattens the figures, even though warning tones can be heard inside the vehicle. Once the car even drives off afterwards, as if fleeing the scene. According to the video, there were no point deductions, neither for the fake drivers nor for the collisions. Finally, Bernal sets up an American-style roadblock and wants to know whether the vehicle sees the signs, interprets them correctly and stops. The opposite is the case: the Tesla pushes straight through and takes the barrier tape with it.

The video demonstrates once again that Tesla’s software and hardware do not appear to meet the high requirements of autonomous driving. It does not help that Musk repeatedly decides against components that other manufacturers take for granted. While Mercedes and others rely on infrared sensors for interior monitoring, Tesla works with conventional cameras, just as it does for obstacle detection. That already causes recognition problems when parking, and apparently also in traffic and inside the cabin.