Autonomous vehicle (AV) operation relies on several sensing techniques, including radar, LiDAR, and cameras, as well as infrared (IR) and ultrasonic sensors, among others. No single sensing technique is adequate by itself, and each of these sensors has its strengths and weaknesses. As shown in Table 1, one sensing technique can overcome the weaknesses of another, and their combined capabilities also provide redundancy for safe autonomous vehicle operation. Beyond the sensors themselves, combining or fusing the data from multiple sensors into a more accurate and reliable estimate requires software algorithms, advanced filtering, and performance simulation.

Code examples/Tech stacks for integrating sensor data
Different software approaches have been used to fuse or integrate the data from these key sensors. As in other widely pursued applications, tech stacks that combine programming languages, frameworks, databases, and tools are used to expedite the development of sensor fusion software for AVs. In addition to C/C++, two popular approaches for developing sensor fusion software are Autoware and the Robot Operating System (ROS).
Since the first modular open-source Autoware software stack for autonomous driving was released in 2015, the Autoware Foundation has periodically offered improved versions; the most recent architecture is called Autoware Core/Universe.
ROS is not an operating system but a software development kit (SDK) – a set of software libraries and tools that help engineers build robot applications. This open-source robotics framework provides developers with the middleware required for communication between hardware components and software algorithms.
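As a hedged illustration, the sketch below shows how a ROS 2 node written with the Python client library (rclpy) and the message_filters package might time-align camera and LiDAR messages before fusing them. The topic names and the 50 ms tolerance are assumptions chosen for illustration, not part of any particular AV stack.

# Minimal ROS 2 sketch: time-align camera and LiDAR messages before fusion.
# Topic names (/camera/image_raw, /lidar/points) are placeholders for illustration.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, PointCloud2
from message_filters import Subscriber, ApproximateTimeSynchronizer

class FusionNode(Node):
    def __init__(self):
        super().__init__('fusion_node')
        cam_sub = Subscriber(self, Image, '/camera/image_raw')
        lidar_sub = Subscriber(self, PointCloud2, '/lidar/points')
        # Pair messages whose timestamps differ by no more than 50 ms.
        self.sync = ApproximateTimeSynchronizer([cam_sub, lidar_sub],
                                                queue_size=10, slop=0.05)
        self.sync.registerCallback(self.fuse)

    def fuse(self, image_msg, cloud_msg):
        # A real node would run association and state estimation here.
        self.get_logger().info('Received a time-aligned camera/LiDAR pair')

def main():
    rclpy.init()
    rclpy.spin(FusionNode())

if __name__ == '__main__':
    main()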
In addition to these design approaches, Python – a general-purpose, multi-paradigm programming language supporting object-oriented, procedural, functional, structured, and reflective styles – is frequently used for sensor fusion of inertial measurement unit (IMU) outputs (accelerometer, gyroscope, and magnetometer signals).
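For example, a complementary filter is one of the simplest ways to fuse IMU outputs in Python. The sketch below blends gyroscope integration (responsive but drifting) with accelerometer tilt estimates (noisy but drift-free); the 0.98 blend weight, axis conventions, and sample values are illustrative assumptions.

# Complementary filter: fuse gyroscope and accelerometer readings into
# pitch/roll estimates. Axis conventions and values are illustrative.
import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """gyro = (gx, gy, gz) in rad/s, accel = (ax, ay, az) in m/s^2."""
    gx, gy, _ = gyro
    ax, ay, az = accel
    # Integrate gyro rates for a short-term estimate.
    pitch_gyro = pitch + gx * dt
    roll_gyro = roll + gy * dt
    # The accelerometer gives an absolute (but noisy) gravity-based estimate.
    pitch_acc = math.atan2(ay, math.sqrt(ax**2 + az**2))
    roll_acc = math.atan2(-ax, az)
    # Blend: trust the gyro in the short term, the accelerometer in the long term.
    pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1 - alpha) * roll_acc
    return pitch, roll

# Example update at 100 Hz with a stationary, level IMU.
pitch, roll = complementary_filter(0.0, 0.0, (0.0, 0.0, 0.0),
                                   (0.0, 0.0, 9.81), dt=0.01)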
Filtering techniques for combining sensor data
Effectively using the data from different sensors, especially when the output from one sensor conflicts with the output from another, requires specialized filtering of the data. Since noisy, multi-dimensional readings must be evaluated, Kalman (Gaussian-based) filters provide a common starting point. For non-linear systems, the extended Kalman filter (EKF) linearizes the nonlinear motion and measurement functions, making it applicable for fusing data from LiDAR, radar, and camera sensors. Another nonlinear variant, the unscented Kalman filter (UKF), improves on the EKF by using a deterministic sampling approach (sigma points) that avoids the errors linearization can introduce in highly nonlinear estimation problems. Figure 1 shows how it works for LiDAR and radar data measurements.
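As a minimal sketch of the underlying mechanics, the linear Kalman filter below tracks position and velocity with a constant-velocity model and updates the estimate with a LiDAR-style position measurement; an EKF or UKF replaces the fixed matrices with linearization or sigma-point propagation. All noise values here are illustrative assumptions.

# Minimal linear Kalman filter: constant-velocity model in x/y, updated with a
# LiDAR-style position measurement. Noise values are illustrative assumptions.
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0],     # state transition for [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
H = np.array([[1, 0, 0, 0],      # LiDAR measures position only
              [0, 1, 0, 0]])
Q = np.eye(4) * 0.01             # process noise
R = np.eye(2) * 0.15             # LiDAR measurement noise

x = np.zeros((4, 1))             # initial state
P = np.eye(4)                    # initial covariance

def kf_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z = [x_meas, y_meas]
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = kf_step(x, P, np.array([[1.2], [0.9]]))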

Particle filters, like Kalman filters, combine predictions from a system dynamics model with new observations to update the estimated state of a system, but they do not assume Gaussian noise or linear models. This Bayesian filtering technique represents the state with a set of weighted random samples, called “particles,” where the weight of each particle is updated based on how well it agrees with the readings from all sensors.
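The sketch below is a minimal one-dimensional particle filter illustrating the predict/weight/resample cycle; the motion model, noise levels, and measurement values are assumptions made purely for illustration.

# Minimal particle-filter sketch: estimate a 1-D position from a noisy
# measurement. Motion and measurement noise levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 1000
particles = rng.uniform(0.0, 10.0, size=N)   # initial particle cloud
weights = np.full(N, 1.0 / N)

def pf_step(particles, weights, control, measurement,
            motion_noise=0.1, meas_noise=0.5):
    n = len(particles)
    # Predict: propagate each particle through the motion model with added noise.
    particles = particles + control + rng.normal(0.0, motion_noise, size=n)
    # Update: weight each particle by the Gaussian likelihood of the measurement.
    likelihood = np.exp(-0.5 * ((measurement - particles) / meas_noise) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resample when the effective number of particles collapses.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

particles, weights = pf_step(particles, weights, control=0.5, measurement=5.2)
estimate = np.sum(particles * weights)       # weighted-mean state estimate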
Simulating sensor fusion performance
Using simulation tools such as MATLAB and Simulink, engineers can explore different design approaches and evaluate how well their fusion system performs.

To ease into the evaluation process, several simulation tools suggest starting with just two sensor inputs (see Figure 1). For example, a MathWorks multi-object tracker that fuses information from radar and video camera sensors uses Kalman filters to estimate the state of motion of a detected object. Another tracker generates an object-level track list from the measurements of a radar and a LiDAR sensor and fuses them using a track-level fusion scheme.
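Conceptually, the track-level step can be reduced to combining two already-associated track estimates. The sketch below fuses a radar-derived and a LiDAR-derived track in information (inverse-covariance) form, assuming the track errors are independent; production trackers, such as those in the MathWorks toolboxes, handle association and cross-correlation far more rigorously, so this is only an illustration.

# Track-level fusion sketch: combine two already-associated track estimates
# (e.g., one from a radar tracker, one from a LiDAR tracker) in information form.
# Assumes independent track errors; real trackers also handle correlation.
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)               # fused covariance
    x = P @ (I1 @ x1 + I2 @ x2)              # fused state
    return x, P

x_radar = np.array([[10.1], [4.8]])
P_radar = np.diag([0.6, 0.6])
x_lidar = np.array([[9.9], [5.1]])
P_lidar = np.diag([0.1, 0.1])
x_fused, P_fused = fuse_tracks(x_radar, P_radar, x_lidar, P_lidar)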
Baidu, Inc., a Chinese multinational technology company, offers an open platform called Apollo that provides developers with simulation tools, an autonomous perception system, and high-definition (HD) mapping data to reduce costs and improve the precision of their designs.
For all the design activities required for sensor fusion in AV applications, engineers have choices, including those mentioned above, as well as options from other providers.
References
Sensor fusion for multi-sensor lidar data
The 6-Step Roadmap to Learn Sensor Fusion
ROS – Robot Operating System
How is Autoware Core/Universe different from Autoware.AI and Autoware.Auto?
LiDAR and Radar Sensor Fusion using Unscented Kalman Filter
Sensor Fusion With Kalman Filter
Sensor Fusion and Navigation for Autonomous Systems Using MATLAB & Simulink
What Is Sensor Fusion and Tracking Toolbox?
Track-Level Fusion of Radar and Lidar Data
Apollo
Related EE World content
What is sensor fusion?
Sensor fusion: What is it?
Sensor fusion levels and architectures
How does fusion timing impact sensors?
Sensors in the driving seat
What sensors make the latest Waymo Driver smarter?
What is the role of sensor fusion in robotics?