Reliability Assessment and Adaptive Fusion Algorithm for Multi-Sensor Data in Autonomous Driving under Adverse Weather Conditions
Abstract
Atmospheric disturbances impose systematic degradation on multi-sensor perception systems in autonomous vehicles, necessitating a fundamental rethinking of sensor fusion strategies. This study presents a comprehensive reliability assessment framework combined with an adaptive fusion algorithm designed to mitigate sensor-specific performance deterioration under adverse meteorological conditions. Using an empirical dataset encompassing 10,000 hours of vehicle operation, we establish quantitative correlations between atmospheric parameters and measurement uncertainty across heterogeneous sensor modalities. Real-time trustworthiness estimation is achieved through a dynamic scoring mechanism that integrates environmental context with temporal performance evolution. The proposed adaptive fusion algorithm performs reliability-weighted integration through probabilistic decision modeling, optimizing the combination of sensor data while minimizing perception errors. Experimental validation demonstrates that, in heavy rain, the method improves object detection accuracy (AP@0.5) by 22.9 percentage points over majority-vote fusion and by 18.4 percentage points over fixed-weight fusion. Furthermore, the false positive rate is reduced by 66% and 73% relative to the fixed-weight and majority-vote baselines, respectively. These improvements directly enhance collision avoidance performance, substantially advancing the safety and robustness of autonomous vehicles operating in challenging environmental conditions.
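The core idea of reliability-weighted integration described above can be illustrated with a minimal sketch. The paper's actual scoring mechanism and probabilistic decision model are not specified in the abstract, so everything below is an assumption: per-sensor reliability scores (here, hypothetical values in [0, 1] reflecting weather-induced degradation) are converted to fusion weights via a softmax, and sensor measurements are combined as a weighted average.

```python
import math

def reliability_weights(scores, temperature=1.0):
    """Softmax over per-sensor reliability scores -> normalized fusion weights.

    A lower score (e.g., a camera in heavy rain) yields a smaller weight,
    so degraded sensors contribute less to the fused estimate.
    """
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(measurements, scores):
    """Reliability-weighted average of scalar sensor measurements."""
    weights = reliability_weights(scores)
    return sum(w * m for w, m in zip(weights, measurements))

# Hypothetical example: three range estimates (m) from camera, radar, lidar.
# Heavy rain degrades the camera, so it receives a low reliability score
# and the fused estimate leans on radar and lidar.
measurements = [12.0, 10.5, 10.8]   # camera, radar, lidar
scores = [0.2, 0.9, 0.7]            # assumed reliability scores in [0, 1]
print(round(fuse(measurements, scores), 2))  # → 10.93
```

The `temperature` parameter controls how aggressively the fusion discounts unreliable sensors: lowering it pushes the weights toward the single most reliable modality, while raising it approaches a uniform (fixed-weight) average.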