
Sensor Tips

Sensor Product News, Tips, and learning resources for the Design Engineering Professional.


Key considerations for integrating LiDAR and radar data for robust perception: part 1

November 26, 2025 By Randy Frank

For autonomous vehicle (AV) applications, both radar and LiDAR (light detection and ranging) are often used in addition to high-definition cameras and other sensors. Properly fusing the output of these devices enables detecting obstacles, identifying lane markings, and accurately recognizing other vehicles and pedestrians.

The radar portion of the system uses radio waves (typically in the 76–81 GHz band) to measure object velocity and distance at ranges of up to 250 meters, and is particularly useful in poor-visibility conditions such as fog or heavy rain.
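The velocity measurement comes from the Doppler shift of the radar return. As a rough sketch (the 15.4 kHz shift below is an illustrative number, not from the article), the relation for a monostatic radar is v = f_d · λ / 2, since the round trip doubles the shift:

```python
# Sketch: how an automotive radar infers radial (closing) velocity
# from the Doppler shift of its return signal. Values are illustrative.
C = 299_792_458.0  # speed of light, m/s

def doppler_velocity(f_carrier_hz: float, f_doppler_hz: float) -> float:
    """Radial velocity from a measured Doppler shift.

    For a monostatic radar the round trip doubles the shift:
    f_d = 2 * v / wavelength, so v = f_d * wavelength / 2.
    """
    wavelength = C / f_carrier_hz
    return f_doppler_hz * wavelength / 2.0

# A 77 GHz radar (wavelength ~3.9 mm) seeing a 15.4 kHz Doppler shift
# corresponds to roughly 30 m/s (~108 km/h) of closing speed.
v = doppler_velocity(77e9, 15.4e3)
```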

LiDAR sensing uses up to 1 million laser pulses per second to create accurate 3D maps of the vehicle’s surroundings. This allows measuring distances within a range of up to 200 meters and detecting objects even in challenging weather conditions.
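The pulse rate sets a point budget per scan. A quick back-of-the-envelope sketch, assuming a hypothetical 10 Hz scan rate (the article does not state one):

```python
# Sketch: point budget of a LiDAR firing up to 1 million pulses/second.
C = 299_792_458.0  # speed of light, m/s

pulses_per_second = 1_000_000
scans_per_second = 10  # assumed frame rate, not from the article

points_per_scan = pulses_per_second // scans_per_second  # 100,000 points

# For a single-beam, single-return design, the 1 microsecond inter-pulse
# period limits the round trip and hence the unambiguous range; real units
# fire many beams in parallel, which is how they reach ranges near 200 m.
pulse_period_s = 1.0 / pulses_per_second
unambiguous_range_m = C * pulse_period_s / 2.0  # ~150 m
```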

Artificial intelligence (AI) algorithms process this sensor data, along with high-resolution camera video at up to 120 frames per second, within milliseconds, allowing the system to make real-time decisions, predict movements, and adapt to dynamic driving environments.

Sensor calibration

The use of lasers gives LiDAR sensing cm-level precision (mm-level in some 2D LiDAR systems). In contrast, the longer radio wavelengths used in radar sensing yield significantly lower resolution. In a multiple-sensor tracking system, two types of architecture are typically used to align data from the radar and LiDAR sensors. In central-level tracking, the data from all the sensors is sent directly to one tracking system that maintains tracks based on all the detections. An alternative is a hierarchical structure that combines sensor-level tracking with track-level fusion.
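The difference between the two architectures can be sketched minimally as follows; the Detection and Track types and both functions are illustrative placeholders, not any particular tracking library's API:

```python
# Sketch of the two multi-sensor tracking architectures described above.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    position: tuple  # (x, y) in meters

@dataclass
class Track:
    track_id: int
    position: tuple

def central_level_tracking(detections: list[Detection]) -> list[Track]:
    """Central-level: all raw detections feed one tracker that owns
    every track (real systems run association + filtering here)."""
    return [Track(i, d.position) for i, d in enumerate(detections)]

def track_level_fusion(sensor_tracks: dict[str, list[Track]]) -> list[Track]:
    """Hierarchical: each sensor runs its own tracker; only the
    resulting tracks reach the fuser (which would merge duplicates)."""
    fused: list[Track] = []
    for tracks in sensor_tracks.values():
        fused.extend(tracks)  # real fusion would merge duplicate tracks
    return fused
```

Note the bandwidth implication: the central tracker ingests every detection, while the fuser only sees one track per object per sensor.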

For several reasons (including shared data, sensors that directly output tracks instead of detections, limited communication bandwidth, and more), a track-to-track, or track-level, fusion architecture may be preferable to central-level tracking in some applications. One company has explained how a track-level fusion scheme processes the radar measurements using an extended object tracker with a Gaussian mixture probability hypothesis density (GM-PHD) filter and the LiDAR measurements using a joint probabilistic data association (JPDA) tracker based on an interacting multiple model unscented Kalman filter (IMM-UKF).

Figure 1. Schematic of track-level fusion of radar and LiDAR data workflow. (Image: MathWorks)
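The GM-PHD and JPDA IMM-UKF pipeline is involved, but the core track-to-track idea can be illustrated much more simply. The sketch below fuses two independent 1-D track estimates by information weighting; the function and numbers are illustrative only, not the cited scheme:

```python
# Sketch: information-weighted fusion of two independent 1-D track
# estimates (position x, variance p). The tighter estimate dominates.
def fuse_tracks_1d(x1: float, p1: float, x2: float, p2: float):
    """Fuse two independent Gaussian estimates of the same state."""
    p = 1.0 / (1.0 / p1 + 1.0 / p2)      # fused variance
    x = p * (x1 / p1 + x2 / p2)          # precision-weighted mean
    return x, p

# A radar track puts an object at 50.0 m with 4.0 m^2 variance; the
# LiDAR track says 49.0 m with 0.25 m^2 variance. The fused estimate
# leans toward the more precise LiDAR track.
x, p = fuse_tracks_1d(50.0, 4.0, 49.0, 0.25)
```

Real track-to-track fusion must also handle correlated errors between trackers (e.g., via covariance intersection) and associate which tracks refer to the same object, which is what the GM-PHD and JPDA machinery addresses.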

The track-level fusion example walks through the radar and LiDAR tracking algorithms; setting up the fuser, metrics, and visualization; running the scenario and trackers; evaluating performance; and track maintenance.

The ego vehicle (the one being controlled by the autonomous driving system) has four 2-D radar sensors, where the front and rear radar sensors have a field of view (FOV) of 45 degrees, and the left and right radar sensors have an FOV of 150 degrees. Each radar sensor has a resolution of 6 degrees in azimuth and 2.5 meters in range. The vehicle also has one 3-D LiDAR sensor with a field of view of 360 degrees in azimuth and 40 degrees in elevation with a resolution of 0.2 degrees in azimuth and 1.25 degrees in elevation (using 32 elevation channels).
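The LiDAR figures above are internally consistent: 40 degrees of elevation at 1.25-degree resolution is exactly 32 channels. A quick sketch of the described sensor suite checks the arithmetic (the SensorSpec type is illustrative, not from the example):

```python
# Sketch: the ego vehicle's sensor suite as described in the example.
from dataclasses import dataclass

@dataclass
class SensorSpec:
    name: str
    fov_azimuth_deg: float
    azimuth_res_deg: float

radars = [
    SensorSpec("front radar", 45.0, 6.0),
    SensorSpec("rear radar", 45.0, 6.0),
    SensorSpec("left radar", 150.0, 6.0),
    SensorSpec("right radar", 150.0, 6.0),
]

# 3-D LiDAR: 360 deg azimuth at 0.2 deg, 40 deg elevation at 1.25 deg.
elevation_channels = int(40.0 / 1.25)                 # 32 channels
lidar_points_per_sweep = int(360.0 / 0.2) * elevation_channels  # 57,600
```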

Other companies and researchers are addressing AV simulation as well. The References section has links to some examples of their efforts.

Part 2 will address practical techniques for improving sensor fusion accuracy and provide real-world examples of simulation success stories for LiDAR and radar fusion.

References

Sensor Fusion in Autonomous Transport: Integrating LiDAR, Cameras, and AI for Enhanced Safety
An in-depth comparison of LiDAR, Cameras, and Radars’ technology
Introduction to Track-To-Track Fusion
Track-Level Fusion of Radar and Lidar Data
AdvFuzz: Finding More Violations Caused by the EGO Vehicle in Simulation Testing by Adversarial NPC Vehicles
First steps – CARLA Simulator
EGO-Centric, Multi-Scale Co-Simulation to Tackle Large Urban Traffic Scenarios
V2X Testing with Simulation of Multiple Ego Vehicles
RobustStateNet: Robust ego vehicle state estimation for Autonomous Driving
Ego Vehicle – AWSIM Labs Documentation

Related EE World content

How to implement multi-sensor fusion algorithms for autonomous vehicles
The power of sensor fusion
Sensor fusion: What is it?
Sensor fusion levels and architectures
How does fusion timing impact sensors?
Sensors in the driving seat

