LIGHTWEIGHT AND FAST MATCHING METHOD FOR LIDAR-INERTIAL ODOMETRY AND MAPPING, 338–348.

Chuanjiang Li, Ziwei Hu, Yanfei Zhu, Xingzhao Ji, Chongming Zhang, and Ziming Qi

References

[1] M. Bloesch, S. Omari, M. Hutter, and R. Siegwart, Robust visual inertial odometry using a direct EKF-based approach, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems, Hamburg, 2015, 298–304.
[2] T. Qin, P. Li, and S. Shen, VINS-Mono: A robust and versatile monocular visual-inertial state estimator, IEEE Transactions on Robotics, 34(4), 2018, 1004–1020.
[3] C. Campos, R. Elvira, J.J. Gómez Rodríguez, J.M.M. Montiel, and J.D. Tardós, ORB-SLAM3: An accurate open-source library for visual, visual–inertial, and multimap SLAM, IEEE Transactions on Robotics, 37(6), 2021, 1874–1890.
[4] J. Ni, X. Wang, and T. Gong, An improved adaptive ORB-SLAM method for monocular vision robot under dynamic environments, International Journal of Machine Learning and Cybernetics, 2022, 1–16.
[5] Y. Chen, J. Ni, and E. Mutabazi, A variable radius side window direct SLAM method based on semantic information, Computational Intelligence and Neuroscience, 2022, 4075910.
[6] S. Badalkhani, R. Havangi, and M. Farshad, An improved simultaneous localization and mapping for dynamic environments, International Journal of Robotics and Automation, 36(6), 2021, 374–382.
[7] B. Han and L. Xu, MLC-SLAM: Mask loop closing for monocular SLAM, International Journal of Robotics and Automation, 37(1), 2022, 107–114.
[8] J. Levinson, J. Askeland, J. Becker, J. Dolson, D. Held, S. Kammel, J. Zico Kolter, D. Langer, O. Pink, V. Pratt, M. Sokolsky, G. Stanek, D. Stavens, A. Teichman, M. Werling, and S. Thrun, Towards fully autonomous driving: Systems and algorithms, Proc. IEEE Intelligent Vehicles Symposium, Baden-Baden, 2011, 163–168.
[9] J. Saarinen, J. Andreasson, and T. Stoyanov, Normal distributions transform occupancy maps: Application to large-scale online 3D mapping, Proc. 2013 IEEE International Conf. on Robotics and Automation, Karlsruhe, 2013, 2233–2238.
[10] J. Zhang and S. Singh, LOAM: Lidar odometry and mapping in real-time, Robotics: Science and Systems Conference, 2(9), 2014, 1–9.
[11] S. Rusinkiewicz and M. Levoy, Efficient variants of the ICP algorithm, Proc. of the Third International Conf. on 3-D Digital Imaging and Modeling, Quebec City, QC, 2001, 145–152.
[12] A. Segal, D. Haehnel, and S. Thrun, Generalized-ICP, Proceedings of Robotics: Science and Systems, 2(4), 2009, 435.
[13] M. Magnusson, The three-dimensional normal-distributions transform: An efficient representation for registration, surface analysis, and loop detection, Doctoral Dissertation, Örebro universitet, 2009.
[14] W.S. Grant, R.C. Voorhies, and L. Itti, Finding planes in LiDAR point clouds for real-time registration, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems, Tokyo, 2013, 4347–4354.
[15] A. Geiger, P. Lenz, and R. Urtasun, Are we ready for autonomous driving? The KITTI vision benchmark suite, Proc. of the IEEE International Conf. on Computer Vision and Pattern Recognition, Providence, RI, 2012, 3354–3361.
[16] T. Shan and B. Englot, LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain, Proc. 2018 IEEE/RSJ International Conf. on Intelligent Robots and Systems, Madrid, 2018, 4758–4765.
[17] M. Himmelsbach, F.V. Hundelshausen, and H.J. Wuensche, Fast segmentation of 3D point clouds for ground vehicles, Proc. of the IEEE Intelligent Vehicles Symposium, La Jolla, CA, 2010, 560–565.
[18] X. Ji, L. Zuo, C. Zhang, and Y. Liu, LLOAM: LiDAR odometry and mapping with loop-closure detection based correction, Proc. 2019 IEEE International Conf. on Mechatronics and Automation, Tianjin, China, 2019, 2475–2480.
[19] J. Lin and F. Zhang, Loam Livox: A fast, robust, high-precision lidar odometry and mapping package for LiDARs of small FoV, Proc. 2020 IEEE International Conf. on Robotics and Automation, Paris, 2020, 3126–3131.
[20] W. Xu, Y. Cai, D. He, J. Lin, and F. Zhang, FAST-LIO2: Fast direct LiDAR-inertial odometry, IEEE Transactions on Robotics, 2022, 1–21.
[21] S. Hening, C.A. Ippolito, K.S. Krishnakumar, V. Stepanyan, and M. Teodorescu, 3D LiDAR SLAM integration with GPS/INS for UAVs in urban GPS-degraded environments, Proc. AIAA Information Systems–AIAA Infotech@Aerospace, Grapevine, TX, 2017, 448–457.
[22] T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, LIO-SAM: Tightly-coupled LiDAR inertial odometry via smoothing and mapping, Proc. 2020 IEEE/RSJ International Conf. on Intelligent Robots and Systems, Las Vegas, NV, 2020, 5135–5142.
[23] C. Forster, L. Carlone, F. Dellaert, and D. Scaramuzza, On-manifold preintegration for real-time visual–inertial odometry, IEEE Transactions on Robotics, 33(1), 2017, 1–21.
[24] T. Shan, B. Englot, C. Ratti, and D. Rus, LVI-SAM: Tightly-coupled lidar-visual-inertial odometry via smoothing and mapping, Proc. 2021 IEEE International Conf. on Robotics and Automation, Xi'an, 2021, 5692–5698.
[25] S. Leutenegger, S. Lynen, M. Bosse, R. Siegwart, and P. Furgale, Keyframe-based visual-inertial odometry using nonlinear optimization, The International Journal of Robotics Research, 34(3), 2015, 314–334.
[26] I. Bogoslavskyi and C. Stachniss, Fast range image-based segmentation of sparse 3D laser scans for online operation, Proc. of the IEEE/RSJ International Conf. on Intelligent Robots and Systems, Daejeon, 2016, 163–169.
[27] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, E. Berger, R. Wheeler, and A. Ng, ROS: An open-source robot operating system, IEEE ICRA Workshop on Open Source Software, 3(3.2), 2009, 5.
