Tesla eschews radar and is going down the computer vision route for autonomous driving, but keep an eye on its LiDAR partnership with Luminar.

Tesla is the best-known electric vehicle brand in the world, but it is also one of the leading players in autonomous vehicle development. While rivals in the space such as Google’s Waymo and GM’s Cruise are pushing to develop fully autonomous vehicles that drive themselves in all scenarios, Tesla has taken a different approach.

The Californian company has focused on bringing partial automation to market through functions built into its production cars, primarily by way of its ‘Autopilot’ system: a Level 2 semi-autonomous system designed for highways that can keep the Tesla in its lane, monitor the environment for obstacles, adjust speed accordingly and, on certain models, even perform overtakes.

Some Teslas are even equipped with beta versions of the company’s ‘Full Self-Driving’ feature, which allows the car to navigate local and surface streets, bringing more of the route under partial automation. Despite the name, the feature does not offer ‘full’ self-driving and still requires constant oversight from a human driver.

Regardless of the performance claims made for these systems, all Teslas use the same basic set of sensors to perceive their environment. The suite is built around a number of conventional vision cameras that provide a 360-degree view around the car. With the cameras taking the lead, they are supplemented by a front-facing radar to monitor distant objects when moving at speed, and ultrasonic sensors for extremely close obstacles, such as other vehicles when entering a parking space.

However, in a change to this established strategy, Tesla has now confirmed that it will transition away from radar, choosing instead to rely solely on its vision cameras for self-driving functions. Tesla’s sensor suite was already heavily weighted towards vision, with the company developing highly advanced image-processing systems to ‘read’ what is going on in a scene and plot a safe, legal route through it. Within this suite, radar served as the front-facing long-range sensor, providing a general idea of any obstacles far down the road, along with their approximate distance and closing speed, but without the level of detail offered by the cameras.
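To put that contribution in concrete terms, a front radar measures range from the echo’s round-trip time of flight and closing speed directly from the Doppler shift of the return, quantities a vision system must instead infer from sequences of images. The sketch below illustrates the underlying physics only; the 77GHz carrier is a typical automotive radar band, assumed here for illustration rather than a confirmed Tesla specification.

```python
C = 299_792_458.0  # speed of light in m/s

def radar_range_m(echo_delay_s: float) -> float:
    """Range from round-trip time of flight: R = c * t / 2."""
    return C * echo_delay_s / 2.0

def closing_speed_ms(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Closing speed from the Doppler shift of the return: v = c * f_d / (2 * f0).
    77 GHz is a typical automotive radar band, assumed for illustration."""
    return C * doppler_shift_hz / (2.0 * carrier_hz)

# An echo arriving 1 microsecond after transmission puts the obstacle ~150 m away.
print(f"{radar_range_m(1e-6):.0f} m")        # -> 150 m
# A 5.1 kHz Doppler shift at 77 GHz corresponds to ~10 m/s of closing speed.
print(f"{closing_speed_ms(5.1e3):.1f} m/s")  # -> 9.9 m/s
```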


Tesla clearly now feels that it can offer the same level of sensor suite performance and safety without radar, so it has pulled the sensor from the most recent Model 3 and Model Y cars built in the US. On these models, the monitoring previously provided by radar is handled by two front-facing vision cameras operating as a stereo pair, which yields the depth information needed to estimate how far away obstacles are, in much the same way as a human’s pair of eyes. So far, there are no plans to remove radar from Model 3 and Model Y cars made in China, or from US-built Model S and Model X cars.
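The geometry behind that stereo arrangement is simple: an obstacle appears at slightly different pixel positions in the two images, and its depth falls out of that disparity. The sketch below shows the principle only; the focal length and camera baseline are hypothetical values, not Tesla’s actual calibration.

```python
# Illustrative stereo depth estimation; camera parameters are assumed, not Tesla's.
FOCAL_LENGTH_PX = 1200.0  # focal length expressed in pixels (hypothetical)
BASELINE_M = 0.15         # separation between the two cameras in metres (hypothetical)

def depth_from_disparity(disparity_px: float) -> float:
    """Distance to an object from the pixel shift (disparity) between
    its position in the left and right images: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("object must appear shifted between the two views")
    return (FOCAL_LENGTH_PX * BASELINE_M) / disparity_px

# An obstacle shifted 9 pixels between the two views is roughly 20 m away;
# as disparity shrinks towards zero, the depth estimate grows less precise.
print(f"{depth_from_disparity(9.0):.1f} m")  # -> 20.0 m
```

As the example hints, disparities become very small at long range, so stereo depth estimates tend to get noisier with distance, which is part of the trade-off in giving up a long-range radar.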

Tesla says the move will help accelerate its autonomous vehicle plans. It has chosen to start with its most popular models – the Model 3 and Model Y – meaning the purely vision-based sensor suite will quickly spread across most of the new cars it sells. Tesla says that, with so many cars adopting the new system, it can use the wealth of data they gather to train the system more effectively, hastening its development.

Critics point out that there are risks in this decision. For one, while vision cameras and the visual processing algorithms that support them have advanced greatly in recent years, they are not infallible. In particular, they can be badly affected by low light or high glare, and are known to struggle in poor weather. Radar excels in exactly these conditions, which is why so many other autonomous systems incorporate radar as redundancy should one set of sensors fail to see an obstacle. Tesla clearly feels that its visual processing software can make up for these potential shortcomings.
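The redundancy argument can be illustrated with a minimal sketch (the interface and behaviour below are assumptions for illustration, not any production system): when independent sensors disagree, the vehicle acts on the more conservative reading, so a camera blinded by glare is covered by the radar, and vice versa.

```python
from typing import Optional

def fused_obstacle_distance(camera_m: Optional[float],
                            radar_m: Optional[float]) -> Optional[float]:
    """Conservative fusion of two independent distance estimates:
    act on the nearer (worst-case) reading; a sensor that detects
    nothing reports None and is covered by the other."""
    readings = [r for r in (camera_m, radar_m) if r is not None]
    return min(readings) if readings else None

# Camera blinded by glare, radar still sees a vehicle 40 m ahead.
print(fused_obstacle_distance(None, 40.0))  # -> 40.0
# Both sensors report; brake for the nearer estimate.
print(fused_obstacle_distance(35.0, 40.0))  # -> 35.0
```

Dropping radar removes one arm of that cross-check, which is precisely the trade-off critics are pointing to.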

Tesla is also well known for its past public scepticism towards LiDAR sensors, which operate similarly to radar but use light waves rather than radio waves to sense obstacles. However, observers were recently surprised to see a Tesla Model Y on manufacturer plates being tested in Florida with LiDAR units mounted on the roof. The sighting coincides with Tesla announcing a partnership with LiDAR developer Luminar for product testing. This is unlikely to preview a whole new sensor suite incorporating LiDAR; more probably, the units are being used to gather data to help develop Tesla’s visual sensor suite. Nevertheless, it shows Tesla is willing to shake up its existing AV strategy to maintain its strong position.