The name Tesla Autopilot suggests that the system can drive a vehicle safely and without assistance, and the more advanced “FSD package” (“Full Self-Driving”) even more so; in Germany it is marketed as offering the “full potential for autonomous driving”. Yet countless practical examples show that a Tesla should never be driven inattentively, and the law requires the driver’s attention anyway.

In January, test drivers for the “New York Times Magazine” uncovered glaring deficiencies, and the US National Highway Traffic Safety Administration (NHTSA) is also investigating the software’s problems in several cases.

After several fatal accidents, the bereaved understandably filed complaints. And as “Reuters” reports, a preliminary decision has now been reached in one case: Judge Reid Scott in the US state of Florida found “sufficient evidence” that Tesla boss Elon Musk and other managers knew that the vehicles’ systems had defects that could pose a danger in certain situations.

The lawsuit was filed by the widow of a man whose 2019 Tesla Model 3 drove under the trailer of a maneuvering heavy truck for reasons that were initially unclear. The roof was sheared off and the driver was fatally injured.

This is a big problem for Tesla and Musk, because the plaintiff can now go to court over the fatal accident and assert claims for damages against Tesla for intentional misconduct and gross negligence.

What’s more: Bryant Walker Smith, a law professor at the University of South Carolina, told Reuters that the ruling opens the door to a public trial in which the judge appears inclined to admit a great deal of testimony and other evidence that could prove highly uncomfortable for Tesla and its CEO.

Judge Scott apparently saw parallels with another crash back in 2016. In that fatal accident, too, a truck trailer was not correctly detected.

Tesla’s Autopilot is repeatedly reported to make driving errors, some of which could even be reproduced in the New York Times Magazine report. Bright lights in particular, whether on emergency vehicles or at traffic signals, caused problems for the system.

“It would be plausible to conclude that Tesla was aware of the problem through its CEO and engineers,” the judge wrote, according to Reuters.

Tesla, for its part, has often advertised the systems, especially Autopilot and FSD, with the promise that the vehicles can drive themselves. One such promotional video, for example, opens with a note stating that the person in the driver’s seat is there only for legal reasons.

If Tesla loses this case, the decision could trigger a chain reaction. Only in October, the “Handelsblatt” reported on a lawsuit by two US pension funds that invested in the company in 2019 and now hold the alleged untruths responsible for losses in the share price.

Since these proceedings are a class action that other investors can join, the amount of possible compensation cannot yet be predicted, but a ruling against the company could hit Tesla hard.

Because of personal statements by Elon Musk, who has repeatedly touted the safety of his vehicles, he himself is also a target of this lawsuit.