Mr. Fintl, things are getting tough for Elon Musk at the moment. He has received a lot of criticism for his conduct on Twitter, and on the subject of autonomous driving, Musk has just performed a backward roll at Tesla. What does that mean?
First off, Musk has done a lot for the industry. Without the catalytic effect of the Tesla models, we would most likely not be where we are today in terms of electromobility and software-defined vehicles.
At least when it came to autonomous driving, however, Elon Musk always promised more than he delivered.
In 2015 he presented his Autopilot for the first time. The first demonstrations caused enthusiasm and, in the perception of many customers, made the established competition look decidedly uninnovative. The name alone suggested: the vehicle not only keeps its lane and its distance to the vehicle ahead, it also understands the traffic environment and reacts accordingly. After trying it out for the first time, many users concluded: fully autonomous operation is just around the corner.
That was very effective messaging. In a safety-relevant area, however, it is also highly problematic. What was Tesla's system really capable of?
Back then, in 2015, Musk still had Mobileye from Israel as a technology partner. The systems available at that time used camera and radar sensors to implement driver assistance functions. Tesla was always boastful in its system descriptions and generous about the limits of driver monitoring. This gave customers the impression that, under optimal conditions, the Autopilot system was already very capable.
The whole thing worked under certain conditions. Where were the limits?
That was and is the great danger. As a driver, you quickly let yourself be lulled into a false sense of security by the supposed performance of such systems. When the weather is nice, the road is reasonably straight and the lane markings are freshly painted, even the first Tesla Autopilot drove as if on rails. In fact, it was a balancing act without a safety net: drivers could activate the system and take their eyes off the road for long periods without the computer monitoring their attention. The result was not only reckless YouTube videos, but also a number of fatal accidents.
Was Mobileye's system overwhelmed by the demands of autonomous driving?
Mobileye's core technology is the camera-based automation of driving functions. Within the system limits, and this is the crux of the matter, its solutions worked reliably then and still do. Lane keeping, distance keeping and emergency braking were the original purpose at the time. Tesla, however, went beyond those limits, and not only in its marketing; that is why the Israelis ended their cooperation with Tesla.
Understandable; that was too risky for them. But the exit didn't slow Musk down?
To the surprise of many, he said at the time: "Then we'll just do it ourselves. That's software, that's our core competence." The interplay of "electric" and "autonomous" is still the core of the Tesla brand today. Within a year, his own system was doing roughly what Mobileye's had been capable of, with a few caveats. Over the years, Elon Musk made it his mantra that a self-driving car essentially needs nothing more than camera-based environment perception.
The analogy being the human driver?
Yes, of course: people also use their two eyes to find their way around in traffic. Add the brain and the mind, which translate into optical sensors and software that develop an understanding of the traffic environment. So Tesla's position became: with cameras, computing power and clever algorithms, we can drive at least as well as a human being.
That was a bit optimistic. In bad conditions such as fog, snowfall and rain, human eyes are stretched to the limit. And cameras?
Everyone in the industry will confirm: highly automated driving in the real world is harder than you think. For a long time, camera sensors could not compete with the human eye. The first Tesla cameras had only modest resolution, were overwhelmed by high contrasts and could not see in full color. Nevertheless, Musk stood there, feet planted wide, and declared that with this hardware Teslas would drive autonomously.
But there were also radar sensors.
Yes, until recently Tesla installed a front radar in all models. Millions of these systems are in use in the automotive sector; they work very reliably and form the basis of assistance functions for many manufacturers, think of adaptive cruise control or emergency brake assist. However, previous radar systems struggled with limited resolution and selectivity: a simple radar cannot reliably detect whether a cyclist is riding next to a large truck. Additional data, such as from cameras or lidar systems, is necessary here.
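To make the selectivity problem concrete, here is a back-of-the-envelope sketch in Python. The 10-degree azimuth resolution is an assumed, purely illustrative figure for a simple series radar, not a measured value from any specific vehicle:

```python
import math

# Back-of-the-envelope sketch: why a coarse radar cannot separate a
# cyclist from an adjacent truck. The azimuth resolution is an assumed,
# illustrative figure for a simple automotive radar.
azimuth_resolution_deg = 10.0
distance_m = 50.0  # distance to the truck/cyclist pair

# Cross-range resolution: the smallest lateral separation the radar can
# still resolve at that distance (width of one resolution cell).
cross_range_m = distance_m * math.radians(azimuth_resolution_deg)
print(f"Resolvable lateral separation at {distance_m:.0f} m: {cross_range_m:.1f} m")
# ~8.7 m, far wider than the 1-2 m gap between truck and cyclist,
# so both echoes merge into a single detection.
```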
The radar was then also thrown out at Tesla. How come?
The turnaround came a few months ago: Musk wanted to rely solely on his "Tesla Vision" and began to phase out the radar. Likewise, the ultrasonic sensors, useful at low speeds and in parking situations, were dropped. This saved a significant amount of money; the cost advantage in series production is considerable. The message was trumpeted: in the future, everything will work with cameras.
Camera, camera, camera, that's what Tesla said. And Musk began his campaign of abuse against laser sensors, the lidar technology. Why so aggressive?
"Anyone relying on lidar is doomed," Musk said at the time, branding the technology a mistake from his point of view. For Tesla, the technical advantages of laser sensors over cameras in difficult lighting conditions did not outweigh their disadvantages, such as high cost and weaknesses in object classification. A lidar, for example, finds it very difficult to distinguish a harmless plastic bag blowing across the road from a dangerous object such as a car tire lying on the carriageway. Camera sensors therefore remain necessary in any case to implement automated driving safely.
And Tesla wanted to save that? The other manufacturers did not shy away from the costs.
The "conventional" manufacturers, but also tech players like Waymo, have always taken the same position with their test vehicles: we need a mix of sensors, camera, lidar and radar, capturing the environment around the car at 360 degrees. This redundancy of systems and information enables the software to make the right decisions more reliably.
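To illustrate the redundancy argument, a minimal Python sketch follows. The class, the confidence values and the combination rule are invented for this example; real fusion stacks use probabilistic tracking, not this naive formula:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar" or "lidar" (illustrative labels)
    confidence: float  # 0..1, the sensor's own belief in the object

def fused_confidence(detections: list[Detection]) -> float:
    """Combine independent sensor opinions: the probability that at least
    one of them is right, assuming independent error sources."""
    p_all_wrong = 1.0
    for d in detections:
        p_all_wrong *= 1.0 - d.confidence
    return 1.0 - p_all_wrong

# A pedestrian seen only weakly by each individual sensor...
obj = [Detection("camera", 0.6), Detection("radar", 0.5), Detection("lidar", 0.7)]
print(f"fused: {fused_confidence(obj):.2f}")             # 0.94, act on it
# ...while the camera alone would stay below a sensible action threshold.
print(f"camera only: {fused_confidence(obj[:1]):.2f}")   # 0.60
```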
Tesla was a little more carefree: less information, and that primarily from the front.
In fact, the Tesla Vision system relies on cameras that capture the surroundings almost completely. Unfortunately, this solution has a few blind spots, for example at the front close to the vehicle. And even in the best case, the vehicles see at most 100 meters to the rear, significantly less than a classic rear radar. On German autobahns this can become critical when a significantly faster vehicle approaches from behind in the left lane while the Tesla is planning an overtaking manoeuvre.
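A quick worked example, with assumed but realistic figures, shows how tight the time budget gets in that scenario:

```python
# Worked example (assumed figures): the reaction time a 100 m rear
# detection range leaves before a faster car closes the gap. On a German
# autobahn, a closing speed of 100 km/h is entirely realistic.
detection_range_m = 100.0
closing_speed_kmh = 100.0  # e.g. Tesla at 120 km/h, approaching car at 220 km/h

closing_speed_ms = closing_speed_kmh / 3.6
time_budget_s = detection_range_m / closing_speed_ms
print(f"Time until the gap is closed: {time_budget_s:.1f} s")  # 3.6 s
# Those 3.6 s must cover detection, the lane-change decision and the
# manoeuvre itself; a rear radar with longer range leaves far more margin.
```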
But now for the turnaround. Musk threw out the radar, yet there are indications that one will be installed again. That sounds like a zigzag course. To be fair, however, one has to say that this is a different system. How do these radars differ?
The system likely being planned is actually a first in the automotive field: a so-called "4D imaging radar". The key innovation is that a large number of small radar antennas are integrated in one system. This achieves a much finer resolution and thus results similar to those of a lidar. These novel systems are now ready for series production and can, for example, resolve a human being including the limbs. Likewise, the selectivity is high enough to identify pedestrians or cyclists next to or between vehicles. A pioneering technology. In addition to a few promising start-ups in this area, established players such as Continental also have products here.
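How strongly the antenna count drives resolution can be estimated with a standard rule of thumb: a half-wavelength-spaced virtual array of N channels resolves roughly 2/N radians in azimuth. The channel counts below are assumed, illustrative values, not the specification of any actual product:

```python
import math

def azimuth_resolution_deg(n_tx: int, n_rx: int) -> float:
    """Rule of thumb: n_tx transmitters and n_rx receivers form a MIMO
    'virtual array' of n_tx * n_rx channels; at half-wavelength spacing
    the achievable azimuth resolution is roughly 2/N radians."""
    n_virtual = n_tx * n_rx
    return math.degrees(2.0 / n_virtual)

print(f"Simple radar    (3 Tx x  4 Rx): {azimuth_resolution_deg(3, 4):.1f} deg")
print(f"Imaging radar  (12 Tx x 16 Rx): {azimuth_resolution_deg(12, 16):.2f} deg")
# ~9.5 deg versus ~0.6 deg: at 50 m that is the difference between one
# merged blob and cleanly separated pedestrians between vehicles.
```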
And what does 4D mean?
These 4D radars not only detect the distance, direction and speed of an object in space, they also reliably provide height information. In conjunction with the higher resolution, road features such as curbs or potholes can be recognized. That is what is actually new. And now it looks as if a 4D imaging radar will go into production at Tesla.
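What those four dimensions look like per detection can be sketched roughly as follows. The field names and thresholds are invented for illustration, not any vendor's interface:

```python
import math
from dataclasses import dataclass

@dataclass
class RadarPoint4D:
    range_m: float             # 1: distance to the reflector
    azimuth_deg: float         # 2: horizontal angle
    elevation_deg: float       # 3: vertical angle, the newly reliable dimension
    radial_velocity_ms: float  # 4: Doppler velocity toward/away from the sensor

def is_relevant_obstacle(p: RadarPoint4D, sensor_height_m: float = 0.5) -> bool:
    """Use the height information to separate overhead structures (bridges,
    signs) and ground clutter (manhole covers) from obstacles at vehicle
    height; the 0.3-4.0 m window is an assumed, illustrative threshold."""
    height_m = sensor_height_m + p.range_m * math.sin(math.radians(p.elevation_deg))
    return 0.3 <= height_m <= 4.0

# A return 80 m ahead, slightly above the sensor axis, closing at 25 m/s:
p = RadarPoint4D(80.0, -1.0, 1.0, -25.0)
print(is_relevant_obstacle(p))  # True: ~1.9 m high, treat as an obstacle
```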
What looks like a backward roll is actually a giant leap forward?
Absolutely. If Tesla actually puts such a 4D radar into series production with the upcoming update of the Model 3 in a few months, it would be quite a coup. With the 4D imaging radar, Tesla can get out of the camera dead end without losing face and would once again be at the forefront of sensor technology. The remaining "construction sites" would then be a rear radar and closing the cameras' small blind spots.
So far, Musk has been frugal. What does this innovation cost?
Even as a 4D imaging variant, radar technology is much cheaper than a lidar system. Even with efficient production, laser sensors still cost hundreds of euros per unit. The new radar systems, by contrast, are unbeatably cheap: they contain no optics; it is "only" standard semiconductor production plus clever software. In addition, object recognition algorithms integrated in the radar can not only take load off the car's central computer, but also form a fallback level, for example for emergency braking functions. Even manufacturers who swear by lidar struggle with the costs. Some, for example, are trying to eliminate the lidars at the corners of the vehicle and get the maximum benefit out of a single forward-facing lidar on the roof above the windshield. Volvo recently presented something along these lines at the launch of its new electric SUV.
So yes: the big leap forward for Tesla?
Adding a 4D imaging radar makes the detection performance of an automated system significantly more robust, which could noticeably improve the customer experience. Tesla drivers are currently suffering from so-called phantom braking: today's camera system and its algorithms still have too high an error rate and trigger unnecessary emergency braking. That is extremely annoying for customers and can lead to dangerous situations.
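A hypothetical sketch of how a second, independent sensor channel can suppress phantom braking. The decision rule and the threshold are invented for illustration and are far simpler than anything in a production vehicle:

```python
def should_emergency_brake(camera_sees_obstacle: bool,
                           radar_sees_obstacle: bool,
                           camera_confidence: float) -> bool:
    """Only a highly confident camera may trigger alone; below that,
    cross-sensor confirmation is required, which filters out the shadows,
    bridges and glare that fool a camera-only system."""
    if camera_confidence >= 0.99:  # assumed, illustrative threshold
        return camera_sees_obstacle
    return camera_sees_obstacle and radar_sees_obstacle

# Shadow on the road: the camera hallucinates, the radar sees nothing.
print(should_emergency_brake(True, False, 0.7))  # False, no phantom braking
# Stalled car ahead: both channels agree.
print(should_emergency_brake(True, True, 0.8))   # True, brake
```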
Would that be a no-go for fully autonomous driving?
Safety is the key point. Approval authorities attach great importance to it and require corresponding proof. Not only must the function itself work; the architecture must also be designed accordingly, with redundancy. Without meeting these requirements, such a system cannot be approved on the world markets. The fully automated taxis on the road in China or the USA today also carry a full set of sensors for safety reasons: in addition to cameras, that means radar and lidar, for complete coverage of the surroundings.
For reasons of reliability, but also to expand the operating range of its assistance and driving functions (a radar can, after all, see through fog), Tesla urgently needs additional sensor channels. This imaging radar would be a step forward; Tesla would once again be setting the pace.
This is also necessary to maintain a leading position.
Absolutely. Looked at soberly, Tesla has clearly lost ground in the field of driving automation. Even the current beta version of the "Full Self-Driving" function requires constant monitoring by the driver and still needs human intervention in some everyday traffic situations. Whether the Tesla FSD system can be approved with its current architecture is the subject of lively expert debate.
When it comes to highly automated driving, such as for robo-taxis, nobody can currently get past Waymo, Cruise or the Chinese players. But if Musk pulls a radar innovation out of his hat, it could be another good story for customers and investors.