EAGLE-VISION-INSPIRED VISUAL MEASUREMENT ALGORITHM FOR UAV’S AUTONOMOUS LANDING

Haibin Duan, Long Xin, Yan Xu, Guozhi Zhao, and Shanjun Chen

References

[1] C. Cadena, L. Carlone, and H. Carrillo, Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age, IEEE Transactions on Robotics, 32(6), 2016, 1309–1332.
[2] H.J. Asl, M. Yazdani, and J. Yoon, Vision-based tracking control of quadrotor using velocity of image features, International Journal of Robotics and Automation, 31(4), 2016, 301–309.
[3] H.J. Asl, G. Oriolo, and H. Bolandi, An adaptive scheme for image-based visual servoing of an underactuated UAV, International Journal of Robotics and Automation, 29(1), 2014, 92–104.
[4] K. Rawat and E. Lawrence, A mini-UAV VTOL platform for surveying applications, International Journal of Robotics and Automation, 3(4), 2014, 259.
[5] S. Saripalli, J.F. Montgomery, and G.S. Sukhatme, Visually guided landing of an unmanned aerial vehicle, IEEE Transactions on Robotics and Automation, 19(3), 2003, 371–380.
[6] C.S. Sharp, O. Shakernia, and S.S. Sastry, A vision system for landing an unmanned aerial vehicle, Proc. of IEEE International Conf. on Robotics and Automation, Seoul, Korea, 2, 2001, 1720–1727.
[7] S. Lin, M.A. Garratt, and A.J. Lambert, Real-time 6-DoF deck pose estimation and target tracking for landing an UAV in a cluttered shipboard environment using on-board vision, Proc. of IEEE International Conf. on Mechatronics and Automation, Beijing, China, 2015, 474–481.
[8] S. Zhao, Z. Hu, M. Yin, et al., A robust real-time vision system for autonomous cargo transfer by an unmanned helicopter, IEEE Transactions on Industrial Electronics, 62(2), 2015, 1210–1219.
[9] K.V. Fite and S. Rosenfield-Wessels, A comparative study of deep avian foveas, Brain, Behavior and Evolution, 12(1–2), 1975, 97–115.
[10] V.A. Tucker, The deep fovea, sideways vision and spiral flight paths in raptors, Journal of Experimental Biology, 203(24), 2000, 3745–3754.
[11] C.T. O’Rourke, M.I. Hall, T. Pitlik, et al., Hawk eyes I: Diurnal raptors differ in visual fields and degree of eye movement, PLoS One, 5(9), 2010, e12802.
[12] C.T. O’Rourke, T. Pitlik, M.I. Hall, et al., Hawk eyes II: Diurnal raptors differ in head movement strategies when scanning from perches, PLoS One, 5(9), 2010, e12169.
[13] H.B. Duan, Y.M. Deng, X.H. Wang, et al., Biological eagle-eye-based visual imaging guidance simulation platform for unmanned flying vehicles, IEEE Aerospace and Electronic Systems Magazine, 28(12), 2013, 36–45.
[14] F.J. Varela and E. Thompson, Color vision: A case study in the foundations of cognitive science, Revue De Synthese, 111(1–2), 1990, 129–138.
[15] M. Mammarella, G. Campa, M.R. Napolitano, et al., Comparison of point matching algorithms for the UAV aerial refueling problem, Machine Vision and Applications, 21(3), 2010, 241–251.
[16] S. Chen, H. Duan, Y. Deng, et al., Drogue pose estimation for unmanned aerial vehicle autonomous aerial refueling system based on infrared vision sensor, Optical Engineering, 56(12), 2017, 124105.
[17] C.P. Lu, G.D. Hager, and E. Mjolsness, Fast and globally convergent pose estimation from video images, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(6), 2000, 610–622.
[18] F. Moreno-Noguer, V. Lepetit, and P. Fua, Accurate non-iterative O(n) solution to the PnP problem, Proc. of IEEE International Conf. on Computer Vision, Rio de Janeiro, Brazil, 2007, 1–8.
[19] S.Q. Li, C. Xu, and M. Xie, A robust O(n) solution to the perspective-n-point problem, IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(7), 2012, 1444–1450.
