

Title: A Survey of Multi-Sensor Fusion in SLAM Utilizing Camera, LiDAR or IMU
Authors: Yein Choi; Sung Soo Hwang
DOI: https://doi.org/10.5573/IEIESPC.2025.14.5.705
Pages: 705-713
ISSN: 2287-5255
Keywords: SLAM; Sensor fusion; Multi-sensor SLAM; Visual-inertial SLAM; LiDAR-inertial SLAM; LiDAR-visual-inertial SLAM
Abstract: This paper reviews recent research that integrates multiple sensors for simultaneous localization and mapping (SLAM), a topic of high interest in areas such as autonomous driving. The basic sensors commonly used in SLAM, such as LiDAR, cameras, and inertial measurement units (IMUs), each have individual drawbacks when used alone. Therefore, many SLAM studies overcome these shortcomings by fusing different sensors so that they complement one another and enhance performance, aiming for more accurate state estimation. Various methods are available for optimally combining the information from each sensor during this process. In this paper, we explain methods for integrating sensor information, such as maximum a posteriori (MAP) estimation, the Kalman filter, and maximum likelihood estimation (MLE). Moreover, we introduce research that utilizes the information obtained from these sensors. We hope this paper helps readers understand the types of sensor data fusion methods employed when information from multiple sensors is available.
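To make the Kalman-filter fusion idea mentioned in the abstract concrete, the sketch below is a minimal, illustrative example and not the paper's own formulation: a 1-D state (position, velocity) is predicted from an IMU acceleration reading and then corrected with a position fix from a second sensor such as LiDAR or visual odometry. All function names, noise values, and measurements are assumptions chosen for illustration.

```python
# Minimal Kalman-filter sensor-fusion sketch (illustrative assumptions only,
# not taken from the surveyed paper): predict with IMU acceleration, then
# correct with a position measurement from another sensor.
import numpy as np

def predict(x, P, a_imu, dt, q=0.05):
    """Propagate the [position, velocity] state using an IMU acceleration sample."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])          # constant-velocity state transition
    B = np.array([0.5 * dt**2, dt])     # control input from acceleration
    x = F @ x + B * a_imu
    P = F @ P @ F.T + q * np.eye(2)     # grow uncertainty by process noise
    return x, P

def update(x, P, z_pos, r=0.5):
    """Correct the prediction with a position fix (e.g., LiDAR or visual odometry)."""
    H = np.array([[1.0, 0.0]])          # we observe position only
    S = H @ P @ H.T + r                 # innovation covariance
    K = P @ H.T / S                     # Kalman gain
    x = x + (K * (z_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Fuse one IMU prediction step with one position measurement.
x, P = np.zeros(2), np.eye(2)
x, P = predict(x, P, a_imu=0.2, dt=0.1)
x, P = update(x, P, z_pos=0.01)
print(x, P)
```

In the same spirit, MAP and MLE approaches covered by the survey pose fusion as an optimization over sensor residuals rather than a recursive filter; the filter above is only one of the integration strategies the paper discusses.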