Sensor Tips

Sensor Product News, Tips, and learning resources for the Design Engineering Professional.

What are the latest advances in radar and LiDAR technologies for sensor fusion: part 1

December 31, 2025 By Randy Frank

Through sensor fusion, the combined benefits of radar and light detection and ranging (LiDAR) sensing are being implemented in each new generation of advanced driver assistance systems (ADAS). Advances in these sensing technologies will deliver higher performance for future applications. However, they will also require parallel advances in cameras and other system elements, including inertial measurement units (IMUs, with accelerometers, gyroscopes, and sometimes magnetometers), as well as high-performance computing (HPC) platforms running powerful algorithms, with input from the cloud over 5G networks and from the Global Navigation Satellite System (GNSS).

Overview of new radar technologies

Today, radar is an essential sensing technology behind automotive features such as automatic emergency braking, adaptive cruise control, and blind-spot detection, which are standard on many vehicles. More advanced systems, including the higher Society of Automotive Engineers (SAE) levels of autonomous driving (AD) short of the highest level (L5), require higher-performance, high-resolution radar sensors.

ADAS technology on production vehicles today relies on multi-sensor fusion (typically 5+ radars and 8+ cameras) to deliver L2+ hands-free driving on mapped highways. These designs balance convenience with driver accountability through real-time monitoring. For more complex real-world situations, and ultimately L5 operation, today's 3D radar must progress to 4D and even 4D imaging radar, and/or adopt other, more sophisticated design approaches. Newer 4D imaging radar systems add vertical (elevation) information.

The size of a radar system, including its aperture (the antenna opening that transmits and receives microwave energy), is directly related to its performance: larger apertures yield higher angular resolution. Several design approaches increase the effective aperture.
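The aperture-resolution relationship follows the classic diffraction rule of thumb, angular resolution ≈ λ/D. A rough sketch, assuming an idealized 77 GHz automotive radar (real systems deviate due to windowing, calibration, and antenna design):

```python
import math

def azimuth_resolution_deg(aperture_m: float, freq_hz: float = 77e9) -> float:
    """Ideal-case azimuth resolution (degrees) for a radar aperture,
    using the diffraction-limit rule of thumb theta ~ lambda / D."""
    c = 3.0e8                      # speed of light, m/s
    wavelength = c / freq_hz       # ~3.9 mm at 77 GHz
    return math.degrees(wavelength / aperture_m)

# A ~10 cm aperture at 77 GHz resolves roughly 2.2 degrees;
# quadrupling the aperture to 40 cm brings that near 0.56 degrees.
print(azimuth_resolution_deg(0.10))
print(azimuth_resolution_deg(0.40))
```

The aperture sizes here are illustrative, but the scaling shows why every approach below is, in one way or another, a scheme for enlarging the effective aperture.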

Massive multiple-input, multiple-output (MIMO) radar uses many closely packed transmit (Tx) and receive (Rx) antennas to create a large virtual antenna array with multiple focused beams, addressing the need for improved spatial resolution, better target detection, and more. Another established approach is phased-array radar (PAR), in which a baseband signal generator produces a single signal that is copied to each Tx antenna; the many small antennas then steer the beam electronically by shifting the signal timing (phase) at each element.
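The MIMO "virtual array" idea can be sketched in a few lines: each Tx/Rx pair acts like a virtual element at the sum of the two physical positions, so N_tx transmitters and N_rx receivers behave like N_tx × N_rx elements. The element positions below are illustrative, not from any specific product:

```python
# Positions in units of half-wavelength; a common MIMO layout pairs a few
# widely spaced Tx antennas with a small, densely packed Rx cluster.
tx_positions = [0, 4, 8]        # 3 transmit antennas
rx_positions = [0, 1, 2, 3]     # 4 receive antennas

# Each (Tx, Rx) pair contributes a virtual element at tx + rx.
virtual_array = sorted({tx + rx for tx in tx_positions for rx in rx_positions})
print(virtual_array)  # 12 virtual elements filling positions 0..11
```

Seven physical antennas thus emulate a filled 12-element array, which is why MIMO delivers large apertures from compact modules.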

Existing medium-range radar (MRR) modules typically have up to 16 (but could scale to 256) virtual antennas, consume considerable power, and occupy significant space. One way to avoid these issues is a distributed aperture radar (DAR) design, which uses multiple small radar sensors operating coherently to form a single large virtual aperture.

Figure 1. Multiple (two or three) medium-range radar sensors combined into a distributed aperture radar system create a large virtual aperture. (Image: NXP)

The impact of radar advances on sensor fusion performance

DAR’s large virtual aperture delivers excellent spatial resolution, with azimuth resolution of 0.5 degrees or better. In resolution KPI (key performance indicator) testing, DAR has demonstrated clear separation of two corner reflectors at ranges of 320 meters and beyond.
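The corner-reflector figure can be sanity-checked with the small-angle relation: cross-range separation at range R for angular resolution θ is roughly R·θ. At the quoted 0.5 degrees and 320 meters:

```python
import math

def cross_range_m(range_m: float, az_res_deg: float) -> float:
    """Minimum azimuth (cross-range) separation resolvable at a given
    range, using the small-angle approximation dx ~ R * theta."""
    return range_m * math.radians(az_res_deg)

# Two targets at 320 m need ~2.8 m of lateral separation to be
# distinguished by a 0.5-degree radar.
print(round(cross_range_m(320, 0.5), 2))
```

This is why sub-degree azimuth resolution matters: at highway sensing distances it is the difference between resolving two adjacent vehicles and merging them into one detection.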

One company’s forward-facing, 4D imaging radar provides a 300-meter detection range and excellent angular resolution. The first families in production offer six to seven times better classification of vulnerable road users.

Part 2 of this blog will discuss LiDAR systems and advances in other technologies to achieve higher (up to L5) ADAS capabilities.

References

The Rising Role of Radar in the Future of ADAS and Autonomous Driving – Edge AI and Vision Alliance
A complete guide to ADAS sensors – Focal Point
Boost Your ADAS Performance with Distributed Aperture Radar
Radar Without Limits
A technical guide to distributed aperture radar
Gen 7 Radar Family 

Related EE World content

How can designers evaluate and benchmark sensor fusion systems for autonomous machines?
Key Considerations for integrating LiDAR and radar data for robust perception: part 1
How to implement multi-sensor fusion algorithms for autonomous vehicles
The power of sensor fusion
Sensor fusion: What is it?
Sensor fusion levels and architectures
How does fusion timing impact sensors?


Filed Under: Featured, Frequently Asked Question (FAQ), Sensor Fusion Tagged With: FAQ, sensor fusion


Copyright © 2026 · WTWH Media LLC and its licensors. All rights reserved.
The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media.
