Through sensor fusion, each new generation of advanced driver assistance systems (ADAS) combines the complementary strengths of radar and light detection and ranging (LiDAR) sensing. Improvements in these sensing technologies will deliver better performance for future applications. However, they will also require advances in cameras and other system elements, including inertial measurement units (IMUs, with accelerometers, gyroscopes, and sometimes magnetometers), as well as high-performance computing (HPC) platforms running powerful algorithms, with input from the cloud over 5G networks and positioning from the Global Navigation Satellite System (GNSS).
Overview of new radar technologies
Today, radar is an essential sensing element of automotive features such as automatic emergency braking, adaptive cruise control, and blind-spot detection, which are now standard on many vehicles. More advanced systems, including the higher Society of Automotive Engineers (SAE) levels of autonomous driving (AD) up to the highest level, L5, require higher-performance, high-resolution radar sensors.
ADAS technology on existing production vehicles relies on multi-sensor fusion (typically 5+ radars and 8+ cameras) to deliver L2+ hands-free driving on mapped highways. These designs balance convenience with driver accountability through real-time driver monitoring. To handle more complex real-world situations and ultimately L5 operation, today's 3D radar must progress to 4D, and even 4D imaging, radar and/or adopt other, more sophisticated design approaches. Newer 4D imaging radar systems add vertical (elevation) information.
The size of a radar system, including its aperture (the antenna opening that transmits and receives microwave energy), is directly related to its performance: larger apertures yield higher angular resolution. Different designs provide increased apertures in different ways.
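The aperture-resolution relationship can be sketched with the classic diffraction-limited estimate, angular resolution ≈ wavelength / aperture. The numbers below are illustrative, not taken from any specific product:

```python
import math

def angular_resolution_deg(wavelength_m: float, aperture_m: float) -> float:
    """Approximate angular resolution (degrees) of a radar aperture.

    Uses the diffraction-limited estimate theta ~ lambda / D (radians);
    real systems include a beam-shape factor close to 1.
    """
    return math.degrees(wavelength_m / aperture_m)

# 77 GHz automotive radar: wavelength of roughly 3.9 mm
wavelength = 3e8 / 77e9
print(f"10 cm aperture: {angular_resolution_deg(wavelength, 0.10):.2f} deg")
print(f"40 cm aperture: {angular_resolution_deg(wavelength, 0.40):.2f} deg")
```

Quadrupling the aperture cuts the beamwidth by a factor of four, which is why the design approaches below all aim to enlarge the effective aperture without a physically huge antenna.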
Massive multiple-input, multiple-output (MIMO) radar uses many closely spaced transmit (Tx) and receive (Rx) antennas to create a large virtual antenna array with multiple focused beams, addressing the need for improved spatial resolution, better target detection, and more. Another established approach is phased array radar (PAR), in which the baseband signal generator produces a single signal that is copied to every Tx antenna; the many small antennas then steer the beam electronically by changing signal timing (phase).
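The MIMO virtual-array idea can be shown in a few lines: each Tx/Rx pair contributes one virtual element at the sum of the pair's positions, so N Tx and M Rx antennas can synthesize up to N × M virtual elements. The antenna layout below is a hypothetical example, not a real module's geometry:

```python
def virtual_positions(tx, rx):
    """Virtual element positions of a MIMO radar: one element per
    Tx/Rx pair, located at the sum of that pair's positions."""
    return sorted({t + r for t in tx for r in rx})

# Hypothetical layout in units of half-wavelength:
# 3 Tx spaced by 4, 4 Rx spaced by 1
tx = [0, 4, 8]
rx = [0, 1, 2, 3]
print(virtual_positions(tx, rx))  # [0, 1, 2, ..., 11]
```

Seven physical antennas yield a filled 12-element virtual array, which is the aperture-for-free trade that makes massive MIMO attractive.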
Existing medium-range radar (MRR) modules typically have up to 16 virtual antennas (and could scale to 256), consume considerable power, and occupy significant space. One way to avoid these issues is a distributed aperture radar (DAR) design, which coherently combines multiple small radar sensors to form a large virtual aperture.

The impact of radar advances on sensor fusion performance
DAR’s large virtual aperture delivers excellent spatial resolution, with azimuth resolution of 0.5 degrees or finer. In resolution key performance indicator (KPI) testing, DAR has demonstrated clear separation of two corner reflectors at ranges of 320 meters and beyond.
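To put the corner-reflector result in perspective, the small-angle approximation converts azimuth resolution into the lateral spacing two targets need at a given range to land in separate azimuth cells:

```python
import math

def cross_range_m(range_m, azimuth_res_deg):
    """Lateral spacing two targets need at a given range to fall into
    separate azimuth cells (small-angle approximation: s = R * theta)."""
    return range_m * math.radians(azimuth_res_deg)

# 0.5 deg azimuth resolution at 320 m
print(f"{cross_range_m(320, 0.5):.2f} m")  # ~2.79 m
```

Roughly 2.8 m at 320 m is under a highway lane width, so two vehicles in adjacent lanes can be resolved as distinct objects at that range.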
One company’s forward-facing 4D imaging radar provides a 300-meter detection range and excellent angular resolution. The first families in production offer six to seven times better classification of vulnerable road users.
Part 2 of this blog will discuss LiDAR systems and advances in other technologies to achieve higher (up to L5) ADAS capabilities.
References
The Rising Role of Radar in the Future of ADAS and Autonomous Driving – Edge AI and Vision Alliance
A complete guide to ADAS sensors – Focal Point
Boost Your ADAS Performance with Distributed Aperture Radar
Radar Without Limits
A technical guide to distributed aperture radar
Gen 7 Radar Family