A NOVEL METHOD FOR FUSION OF GNSS AND VISUAL-INERTIAL-WHEEL ODOMETRY, 195–203.

Yahui Zhang, Linxuan Wang, Aimin Li, Yongsheng Zheng, and Mingzhuang Wu

References

[1] M. Dong, G. Yao, J. Li, and L. Zhang, Calibration of low-cost IMU's inertial sensors for improved attitude estimation, Journal of Intelligent and Robotic Systems, 100(2), 2020, 1015–1029.
[2] C. Campos, R. Elvira, J.J.G. Rodriguez, J.M.M. Montiel, and J.D. Tardós, ORB-SLAM3: An accurate open-source library for visual, visual–inertial, and multimap SLAM, IEEE Transactions on Robotics, 37(6), 2021, 1874–1890.
[3] G. Chen and B. Jin, Position-posture trajectory tracking of a six-legged walking robot, International Journal of Robotics and Automation, 34(1), 2019, 24–37.
[4] S. Han, F. Deng, T. Li, and H. Pei, Tightly coupled optimization-based GPS-visual-inertial odometry with online calibration and initialization, 2022, arXiv:2203.02677.
[5] M. Boujelben, C. Rekik, and N. Derbel, A multi-agent architecture with hierarchical fuzzy controller for a mobile robot, International Journal of Robotics and Automation, 30(3), 2015, 289–298.
[6] H.W. Lee, C.L. Shih, and C.-L. Hwang, Design by applying compensation technology to achieve biped robots with stable gait, International Journal of Robotics and Automation, 29(1), 2014.
[7] T. Qin and S. Shen, Online temporal calibration for monocular visual-inertial systems, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Madrid, Spain, 2018, 3662–3669.
[8] T. Qin, S. Cao, J. Pan, and S. Shen, A general optimization-based framework for global pose estimation with multiple sensors, 2019, arXiv:1901.03642.
[9] J. Liu, W. Gao, and Z. Hu, Optimization-based visual-inertial SLAM tightly coupled with raw GNSS measurements, 2020, arXiv:2010.11675.
[10] S. Cao, X. Lu, and S. Shen, GVINS: Tightly coupled GNSS–visual–inertial fusion for smooth and consistent state estimation, IEEE Transactions on Robotics, 38(4), 2022, 2004–2021.
[11] S. Lynen, M.W. Achtelik, S. Weiss, M. Chli, and R. Siegwart, A robust and modular multi-sensor fusion approach applied to MAV navigation, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Tokyo, Japan, 2013, 3923–3929.
[12] S. Leutenegger, OKVIS2: Realtime scalable visual-inertial SLAM with loop closure, 2022, arXiv:2202.09199.
[13] A.I. Mourikis and S.I. Roumeliotis, A multi-state constraint Kalman filter for vision-aided inertial navigation, Proc. IEEE International Conf. on Robotics and Automation (ICRA), Rome, Italy, 2007, 3565–3572.
[14] M. Li and A.I. Mourikis, High-precision, consistent EKF-based visual-inertial odometry, The International Journal of Robotics Research, 32(6), 2013, 690–711.
[15] S. Leutenegger, S. Lynen, M. Bosse, R. Siegwart, and P. Furgale, Keyframe-based visual-inertial odometry using nonlinear optimization, The International Journal of Robotics Research, 34(3), 2014, 314–334.
[16] M. Bloesch, S. Omari, M. Hutter, and R. Siegwart, Robust visual inertial odometry using a direct EKF-based approach, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Hamburg, Germany, 2015, 298–304.
[17] R. Mur-Artal and J.D. Tardós, Visual-inertial monocular SLAM with map reuse, IEEE Robotics and Automation Letters, 2(2), 2017, 796–803.
[18] H. Strasdat, J. Montiel, and A.J. Davison, Real-time monocular SLAM: Why filter? Proc. IEEE International Conf. on Robotics and Automation (ICRA), Anchorage, AK, USA, 2010, 2657–2664.
[19] R. Mascaro, L. Teixeira, T. Hinzmann, R. Siegwart, and M. Chli, GOMSF: Graph-optimization based multi-sensor fusion for robust UAV pose estimation, Proc. IEEE International Conf. on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 2018, 1421–1428.
[20] H. Liu, M. Chen, G. Zhang, H. Bao, and Y. Bao, ICE-BA: Incremental, consistent and efficient bundle adjustment for visual-inertial SLAM, Proc. IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 2018, 1974–1982.
[21] K. Eckenhoff, Y. Yang, P. Geneva, and G. Huang, Tightly-coupled visual-inertial localization and 3D rigid-body target tracking, IEEE Robotics and Automation Letters, 4(2), 2019, 1541–1548.
[22] Z. Gong, P. Liu, F. Wen, R. Ying, and W. Xue, Graph-based adaptive fusion of GNSS and VIO under intermittent GNSS-degraded environment, IEEE Transactions on Instrumentation and Measurement, 70, 2021, 1–16.
[23] K.J. Wu, C.X. Guo, G. Georgiou, and S.I. Roumeliotis, VINS on wheels, Proc. IEEE International Conf. on Robotics and Automation (ICRA), Singapore, 2017, 5155–5162.
[24] R. Kang, L. Xiong, M. Xu, J. Zhao, and P. Zhang, VINS-Vehicle: A tightly-coupled vehicle dynamics extension to visual-inertial state estimator, Proc. IEEE Intelligent Transportation Systems Conf. (ITSC), Auckland, New Zealand, 2019, 3593–3600.
[25] T. Qin, P. Li, and S. Shen, VINS-Mono: A robust and versatile monocular visual-inertial state estimator, IEEE Transactions on Robotics, 34(4), 2018, 1004–1020.
[26] G. Cioffi and D. Scaramuzza, Tightly-coupled fusion of global positional measurements in optimization-based visual-inertial odometry, 2020, arXiv:2003.04159.
[27] C.V. Angelino, V.R. Baraniello, and L. Cicala, UAV position and attitude estimation using IMU, GNSS and camera, Proc. 15th International Conf. on Information Fusion, Singapore, 2012, 735–742.
