Michael Himmelsbach, Sebastian Schneider, and Hans-Joachim Wuensche
Multi-Sensor System, Extrinsic Calibration, Error Metric, Data Fusion, Mobile Robot
This work develops and compares different error metrics for calibration of a 3D geometric mapping between multi-layer LIDAR point clouds and video images. Most approaches to calibration are based on minimizing the distance between pairs of corresponding points found in both LIDAR and vision data by adjusting the extrinsic calibration parameters. Such a metric does not respect the complementary information provided by the two sensors: a point's depth is more precisely perceived by the LIDAR, whereas the direction to the point is better resolved by the camera. Taking this into account, this work investigates the properties of different error metrics, each considering the diverse sensor characteristics to a different degree. Calibration results are demonstrated in simulation experiments and on real world data taken on the autonomous ground vehicle MuCAR-3 while moving in traffic.
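The contrast drawn in the abstract can be illustrated with a small sketch. This is not the paper's actual formulation; the function names, the weighting scheme, and the specific decomposition into radial and tangential residual components are assumptions chosen to make the idea concrete: a plain Euclidean point-pair distance treats all directions equally, while a sensor-aware metric can penalize error along the viewing ray (where the LIDAR is accurate) differently from error across it (where the camera is accurate).

```python
import numpy as np

def euclidean_error(p_lidar, p_cam, R, t):
    # Plain 3D point-pair distance: all error directions weighted equally.
    return np.linalg.norm(R @ p_lidar + t - p_cam)

def anisotropic_error(p_lidar, p_cam, R, t,
                      w_radial=1.0, w_tangential=10.0):
    # Illustrative sensor-aware metric (not the paper's exact one):
    # split the residual into a radial component along the viewing ray
    # (depth, well measured by the LIDAR) and a tangential component
    # across the ray (direction, well measured by the camera), then
    # weight the two parts differently.
    q = R @ p_lidar + t                    # LIDAR point in camera frame
    ray = p_cam / np.linalg.norm(p_cam)    # unit viewing direction
    r = q - p_cam                          # residual vector
    radial = np.dot(r, ray)                # error along the ray
    tangential = r - radial * ray          # error across the ray
    return np.sqrt(w_radial * radial**2
                   + w_tangential * np.dot(tangential, tangential))
```

With `w_tangential > w_radial`, a correspondence that is off in depth costs less than one that is off in direction by the same amount, so the optimizer is steered toward extrinsic parameters that align the projection directions well, which is exactly where the camera constrains the calibration.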