2023 marks the first year Level 3 autonomous vehicles (AVs) will be commercially available in all three major automotive markets: the United States, China, and Europe. While sales are likely to start slowly, even our most conservative forecasts predict a relatively steep growth curve beginning in 2026. We expect anywhere from eight million to 17 million advanced AVs on the road globally by 2030, assuming governments begin to approve their use. Today, only a few nations and localities allow Level 3 or above AV technology on the open road, including Japan, Germany, South Korea, and the state of Nevada in the United States.
But AVs continue to be hounded by a troubled accident record and high underlying technology costs. Their next few years will be clouded by a raft of unanswered questions, almost all of which revolve around the competing remote sensing technologies looking to define AVs’ future development. The winner of that rivalry will determine whether advanced AVs — Level 3 through Level 5 — become more standard technology or remain automotive unicorns.
Many newer cars on the road today offer what are called driver-assist features. Vehicles with these driver-support systems are categorized as Level 1 or Level 2. Level 1 features include lane-following assistance and blind spot detection. Level 2 adds advanced driver-assistance systems (ADAS) that can take over braking or steering when needed, such as automatic emergency braking, even though the driver remains in control.
At Level 3 and above, drivers begin to turn over control to the automobile's software. Level 3 is considered conditional driving automation: it requires drivers to be prepared to take over when the vehicle asks them to. At Levels 4 and 5, drivers can disengage entirely.
How advanced remote sensing technologies work
To properly support ADAS functionality, data from cameras, radar, LiDAR, and other sensors such as ultrasound must be integrated to give the vehicle an accurate sense of what's going on around it. LiDAR and radar are the two competing advanced remote sensing technologies: LiDAR stands for light detection and ranging, radar for radio detection and ranging. Both scan the terrain ahead of a car and provide a continuous stream of information back to the vehicle's central processing unit.
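The integration step described above can take many forms. One textbook approach to merging independent distance estimates from different sensors is inverse-variance weighting, which trusts more precise sensors more. A minimal sketch in Python (sensor names and all numeric figures are illustrative, not from the source):

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str        # e.g. "camera", "radar", "lidar", "ultrasound"
    distance_m: float  # estimated distance to the object, meters
    variance: float    # rough measurement uncertainty of this sensor

def fuse(readings: list[Reading]) -> float:
    """Inverse-variance weighted average of independent distance
    estimates: each reading is weighted by 1/variance, so precise
    sensors dominate the fused result."""
    weights = [1.0 / r.variance for r in readings]
    total = sum(w * r.distance_m for w, r in zip(weights, readings))
    return total / sum(weights)

est = fuse([
    Reading("radar", 42.0, 1.0),   # good range accuracy
    Reading("camera", 45.0, 9.0),  # noisier monocular depth estimate
])
print(round(est, 2))  # 42.3 — pulled toward the more precise radar figure
```

Real ADAS stacks use far more sophisticated fusion (Kalman filters and learned models), but the principle of weighting sensors by confidence is the same.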
LiDAR, which uses light to measure distances between objects, is currently considered the more accurate of the two sensing technologies, providing high-resolution, three-dimensional mapping in almost all weather conditions. It is optimal for emergency braking, pedestrian detection, and collision avoidance, important AV safety features where radar has come up short. But LiDAR's technological prowess comes at a price: three to five times the cost of radar.
Despite the cost, LiDAR has attracted more investment dollars. In 2022, investment in LiDAR startups and technology exceeded radar by $3 billion, with about $8 billion invested across radar startups and about $11 billion across LiDAR.
On the other hand, conventional radar has become more mainstream. It is highly compact and lower cost, and it performs well in varied weather conditions and in darkness. It detects speed accurately but lacks the resolution for precise object detection, with an angular resolution limited to around one degree compared with around 0.1 degree for LiDAR. Given the lower resolution, radar is usually used for short- and medium-range applications such as blind spot detection, rear collision alerts, and cross-traffic warnings, as well as at long range for classic adaptive cruise control. And while radar has a 90-degree field of view, LiDAR offers up to a 360-degree field of view depending on the surroundings.
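A quick back-of-the-envelope calculation shows why that tenfold difference in angular resolution matters: the smallest lateral separation a sensor can resolve at a given range is roughly the range times the angular resolution in radians. A short sketch using the figures above (the 100-meter range is an illustrative choice, not from the source):

```python
import math

def lateral_resolution(range_m: float, angular_res_deg: float) -> float:
    """Approximate smallest lateral separation (meters) two objects
    need at a given range before a sensor with this angular
    resolution can tell them apart (arc length = range * angle)."""
    return range_m * math.radians(angular_res_deg)

# ~1 degree for conventional radar vs ~0.1 degree for LiDAR, per the text.
print(f"radar: {lateral_resolution(100, 1.0):.2f} m at 100 m")  # 1.75 m
print(f"LiDAR: {lateral_resolution(100, 0.1):.2f} m at 100 m")  # 0.17 m
```

At 100 meters, a one-degree beam smears objects less than about 1.75 meters apart into one blob, roughly the width of a car, which is why conventional radar struggles to separate close vehicles and pedestrians.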
For these safety reasons, LiDAR is likely to become the dominant solution for passenger cars over the next few years for both long-range and short-range sensing. The challenge for LiDAR is cost. Currently, long-range LiDAR systems cost around $500; mass penetration requires a price point below $300, and below $100 for short-range LiDAR.
Radar catching up with LiDAR
But the latest versions of radar have closed the resolution gap with LiDAR; the two are now essentially on par, thanks to the newest radar's greater aperture. Like LiDAR, the new radar relies on time-of-flight measurement, timing the echoes of its emitted signals. From these it creates point-cloud images of the vehicle's surroundings, which allows it to detect not only the distance, relative speed, and azimuth (at 0.11 degree) of nearby objects but also their height above road level.
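The time-of-flight principle underlying both technologies is simple arithmetic: radar and LiDAR signals travel at the speed of light, so the range to a target is half the round-trip echo time multiplied by c. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s; both radio and light pulses travel at c

def distance_from_tof(round_trip_s: float) -> float:
    """Range to a target from the round-trip time of an emitted
    pulse. The signal travels out and back, so the one-way
    distance is half the total path."""
    return C * round_trip_s / 2.0

# An echo arriving one microsecond after emission puts the target
# roughly 150 m away.
print(distance_from_tof(1e-6))  # ~149.9 m
```

The short flight times involved (a few hundred nanoseconds for nearby objects) are why both sensor types need very precise timing hardware.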
To enable the new features, radar systems now use multiple-input, multiple-output (MIMO) antenna arrays for high-resolution mapping. Traditional radar systems usually contain two to three transmitting antennas and three to four receiving antennas, which leads to a beam providing limited short-range coverage and a field of view too narrow to generate images; the limited angular resolution is insufficient to differentiate among vehicles, pedestrians, or objects that are close together. The MIMO approach increases the underlying channels from as few as nine to anywhere between 128 and 2,000. Given radar's significantly lower costs, even with all the enhanced technology, it's easy to see how the two technologies will increasingly be on more equal footing.
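The channel counts in this paragraph follow from the basic MIMO relationship: the virtual array has as many channels as transmitters times receivers. A tiny sketch (the 12 x 16 configuration is a hypothetical example, not from the source):

```python
def virtual_channels(n_tx: int, n_rx: int) -> int:
    """A MIMO radar with n_tx transmit and n_rx receive antennas
    forms a virtual array of n_tx * n_rx channels, each acting
    like a separate antenna element for angular resolution."""
    return n_tx * n_rx

# A traditional front end per the text: 3 Tx x 3 Rx -> 9 channels.
print(virtual_channels(3, 3))    # 9
# A hypothetical larger MIMO front end already reaches 192 channels.
print(virtual_channels(12, 16))  # 192
```

This multiplicative scaling is why adding a handful of antennas yields an order-of-magnitude jump in channel count, and with it the finer angular resolution needed for imaging.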
As the decade progresses, we expect a combination of cameras and radar to begin replicating LiDAR's current, expensive technological prowess. For instance, the reduced cost of radar may make the radar-camera combination the choice for robotaxi services when they begin to emerge in earnest. Because robotaxis tend to operate in smaller urban environments at lower speeds, their needs play to radar's current strengths: sensing-distance requirements are lower, and these vehicles' longer time-to-market will let them benefit from ongoing radar technology improvements.
Already, five automakers in China are producing Level 3 AVs, while only one model each is coming out of Europe, the US, Japan, and South Korea. Most car manufacturers making Level 2 and 3 AVs use so-called sensor fusion, which combines cameras, radar, and LiDAR, although one has shifted to a camera-only autopilot after previously using ultrasonic sensors and cameras. While cameras are a cost-effective and well-understood technology, a camera-only system lacks 3-D sensing and struggles in adverse weather or poor lighting.
Step by step, we expect radar to challenge LiDAR for market share as its technology improves and performs at a similar level. This assumes radar maintains its lower price points, which so far it has.