LAPSE:2023.11397
Published Article
Multi-Sensor Data Fusion for Real-Time Multi-Object Tracking
Numan Senel, Klaus Kefferpütz, Kristina Doycheva, Gordon Elger
February 27, 2023
Abstract
Sensor data fusion is essential for environmental perception in smart traffic applications. By using multiple sensors cooperatively, the accuracy and reliability of the perception are increased, which is crucial in critical traffic scenarios and under bad weather conditions. In this paper, a modular, real-time-capable multi-sensor fusion framework is presented and tested that fuses data at the object-list level from distributed automotive sensors (cameras, radar, and LiDAR). The architecture receives an object list (untracked objects) from each sensor and combines classical data fusion algorithms: a coordinate transformation module, an object association module (Hungarian algorithm), an object tracking module (unscented Kalman filter), and a movement compensation module. Because of the modular design, the framework is adaptable and does not depend on the number or type of sensors; for the same reason, it continues to operate if an individual sensor fails, an essential feature for safety-critical applications. The architecture targets environmental perception in challenging, time-critical applications. The developed fusion framework is tested using simulation and public-domain experimental data. With the developed framework, sensor fusion is performed in well under 10 ms of computing time on an AMD Ryzen 7 5800H mobile processor using the Python programming language. Furthermore, the object-level multi-sensor approach enables the detection of changes in the extrinsic calibration of the sensors and of potential sensor failures; a concept was developed to use the multi-sensor framework to identify sensor malfunctions. This feature will become extremely important for ensuring the functional safety of the sensors in autonomous driving.
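As a minimal sketch of the object association step described in the abstract — assuming 2-D position-only object lists, a Euclidean-distance cost matrix, and a hypothetical gating threshold, with SciPy's `linear_sum_assignment` standing in for the Hungarian algorithm (the framework's tracking step, an unscented Kalman filter, is not shown here):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def associate(tracks, detections, gate=5.0):
    """Match tracked objects to new detections via the Hungarian algorithm.

    tracks:     (N, 2) array of predicted track positions (x, y)
    detections: (M, 2) array of detected object positions (x, y)
    gate:       maximum distance for a match to be accepted (illustrative value)

    Returns a list of (track_index, detection_index) pairs; detections left
    unmatched would typically spawn new tracks, and unmatched tracks age out.
    """
    # Pairwise Euclidean distances form the assignment cost matrix.
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    # Gating: discard optimal pairs whose distance is still implausibly large.
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]


tracks = np.array([[0.0, 0.0], [10.0, 0.0]])
dets = np.array([[9.5, 0.2], [0.3, -0.1], [50.0, 50.0]])
pairs = associate(tracks, dets)
# Track 0 matches detection 1, track 1 matches detection 0; detection 2 lies
# outside the gate and remains unmatched.
```

In the full pipeline, each accepted pair would feed the corresponding detection into that track's unscented Kalman filter update, after coordinate transformation and ego-motion compensation of the raw object lists.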
Keywords
autonomous vehicle, environmental perception, object tracking, roadside units, sensor fusion, unscented Kalman filter
Suggested Citation
Senel N, Kefferpütz K, Doycheva K, Elger G. Multi-Sensor Data Fusion for Real-Time Multi-Object Tracking. Processes. 2023;11(2):501. LAPSE:2023.11397
Author Affiliations
Senel N: Technische Hochschule Ingolstadt, Esplanade 10, 85049 Ingolstadt, Germany
Kefferpütz K: Technische Hochschule Ingolstadt, Esplanade 10, 85049 Ingolstadt, Germany; Fraunhofer-Anwendungszentrum Vernetzte Mobilität und Infrastruktur, Stauffenbergstrasse 2a, 85051 Ingolstadt, Germany
Doycheva K: Fraunhofer-Anwendungszentrum Vernetzte Mobilität und Infrastruktur, Stauffenbergstrasse 2a, 85051 Ingolstadt, Germany
Elger G: Technische Hochschule Ingolstadt, Esplanade 10, 85049 Ingolstadt, Germany; Fraunhofer-Anwendungszentrum Vernetzte Mobilität und Infrastruktur, Stauffenbergstrasse 2a, 85051 Ingolstadt, Germany
Journal Name
Processes
Volume
11
Issue
2
First Page
501
Year
2023
Publication Date
2023-02-07
ISSN
2227-9717
Version Comments
Original Submission
Other Meta
PII: pr11020501, Publication Type: Journal Article
External Link (Publisher Version)
https://doi.org/10.3390/pr11020501
Files
Main Article (Feb 27, 2023)
License
CC BY 4.0
Version History
[v1] (Original Submission), Feb 27, 2023; verified by curator on Feb 27, 2023
Record URL
https://psecommunity.org/LAPSE:2023.11397
 
Record Owner
Auto Uploader for LAPSE