Yanlin He, Xu Zhang, Lianqing Zhu, Guangkai Sun, and Junfei Qiao


  [1] J. Fabian, T. Young, J.C. Peyton Jones, and G.M. Clayton, Integrating the Microsoft Kinect with Simulink: Real-time target tracking example, IEEE/ASME Transactions on Mechatronics, 19(1), 2014, 249–257.
  [2] O. Araar, N. Aouf, and J.L. Vallejo Dietz, Power pylon detection and monocular depth estimation from inspection UAVs, Industrial Robot: An International Journal, 42(3), 2015, 200–213.
  [3] Z. Zhang, A. Beck, and N. Magnenat-Thalmann, Human-like behavior generation based on head-arms model for robot tracking external targets and body parts, IEEE Transactions on Cybernetics, 45(8), 2015, 1390–1400.
  [4] E. Wirbel, B. Steux, S. Bonnabel, and A. de La Fortelle, Humanoid robot navigation: From a visual SLAM to a visual compass, Proc. IEEE Int. Conf. Networking, Sensing and Control, Evry, France, 10–12 April 2013, 678–683.
  [5] A. Delibasi, E. Zergeroglu, I.B. Küçükdemiral, and G. Cansever, Adaptive self-tuning control of robot manipulators with periodic disturbance estimation, International Journal of Robotics & Automation, 25(1), 2010, 48–56.
  [6] F. Zhang, S. Zheng, H. Yun, and X. Shao, The research on attitude correction method of robot monocular vision positioning system, Proc. IEEE Int. Conf. Robotics and Biomimetics, Macau SAR, China, 5–8 December 2017, 1972–1976.
  [7] X. Huang, Y. Jia, and S. Xu, Path planning of a free-floating space robot based on the degree of controllability, Science China Technological Sciences, 60(2), 2017, 1–13.
  [8] L. Li, X. Wang, D. Xu, and T. Min, An accurate path planning algorithm based on triangular meshes in robotic fibre placement, International Journal of Robotics & Automation, 32(1), 2017. DOI: 10.2316/Journal.206.2017.1.206-4673.
  [9] C.J. Lin, A GPU-based evolution algorithm for motion planning of a redundant robot, International Journal of Robotics & Automation, 2(2), 2017, 00015.
  [10] F. Castelli, S. Michieletto, S. Ghidoni, and E. Pagello, A machine learning-based visual servoing approach for fast robot control in industrial setting, International Journal of Advanced Robotic Systems, 16(6), 2017, 172988141773888.
  [11] S. Lemaignan, M. Warnier, E.A. Sisbot, A. Clodic, and R. Alami, Artificial cognition for social human–robot interaction, Artificial Intelligence, 247, 2017, 45–69.
  [12] C.A. Cifuentes, A. Frizera, R. Carelli, and T. Bastos, Human–robot interaction based on wearable IMU sensor and laser range finder, Robotics and Autonomous Systems, 62(10), 2014, 1425–1439.
  [13] K. Kesorn and S. Poslad, An enhanced bag-of-visual word vector space model to represent visual content in athletics images, IEEE Transactions on Multimedia, 14(1), 2012, 211–222.
  [14] Y. Liu, Q. Li, H. Fang, and H. Xu, Research on embedded system with implementation of a moving target tracking algorithm based on improved meanshift on DM6437, Advanced Materials Research, 1003, 2014, 207–210.
  [15] J. Liu, K.R. Subramanian, and T.S. Yoo, An optical flow approach to tracking colonoscopy video, Computerized Medical Imaging and Graphics, 37(3), 2013, 207–223.
  [16] K. Li, B. Hu, J. Gao, and G. Feng, Nonlinear robust detection Kalman filter algorithm based on M-estimation, Journal of Computer Applications, 34(11), 2014, 3214–3217.
  [17] S. Shamshirband, D. Petkovic, H. Javidnia, and A. Gani, Sensor data fusion by support vector regression methodology – A comparative study, IEEE Sensors Journal, 15(2), 2015, 850–854.
  [18] Y. Qi, K. Suzuki, H. Wu, and Q. Chen, EK-means tracker: A pixel-wise tracking algorithm using Kinect, Proc. Third Chinese Conf. Intelligent Visual Surveillance (IVS), Beijing, China, 1–2 December 2011, 77–80.
  [19] C. Bibby and I. Reid, Real-time tracking of multiple occluding targets using level sets, Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), San Francisco, CA, 13–18 June 2010, 1307–1314.
  [20] S. Hare, A. Saffari, and P.H.S. Torr, Struck: Structured output tracking with kernels, Proc. IEEE Int. Conf. Computer Vision (ICCV), Barcelona, Spain, 6–13 November 2011, 263–270.
  [21] Z. Kalal, K. Mikolajczyk, and J. Matas, Tracking-learning-detection, IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(7), 2012, 1409–1422.
  [22] K. Zhang, L. Zhang, and M.-H. Yang, Real-time compressive tracking, Proc. European Conf. Computer Vision (ECCV 2012), Florence, Italy, 7–13 October 2012, 864–877.
  [23] M. Yahya and M. Arshad, Tracking of multiple light sources using computer vision for underwater docking, Procedia Computer Science, 76, 2015, 192–197.
  [24] L. Zhang, B. He, Y. Song, and T. Yan, Consistent target tracking via multiple underwater cameras, Proc. OCEANS 2016 – Shanghai, Shanghai, China, 2016, 10–13.
  [25] M. Chuang, J. Hwang, J. Ye, S. Huang, and K. Williams, Underwater fish tracking for moving cameras based on deformable multiple kernels, IEEE Transactions on Systems, Man, and Cybernetics: Systems, 47(9), 2017, 2467–2477.
  [26] S. Pan, L. Shi, and S. Guo, A Kinect-based real-time compressive tracking prototype system for amphibious spherical robots, Sensors, 15(4), 2015, 8232–8252.
  [27] K. Wang, Y. Liu, and L. Li, Visual servoing based trajectory tracking of underactuated water surface robots without direct position measurement, Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Chicago, IL, 14–18 September 2014, 767–772.
  [28] X. Cheng, N. Li, T. Zhou, L. Zhou, and Z. Wu, Object tracking via collaborative multi-task learning and appearance model updating, Applied Soft Computing, 31, 2015, 81–89.
  [29] Y. Li, S. Guo, and C. Yue, Preliminary concept of a novel spherical underwater robot, International Journal of Mechatronics and Automation, 5(1), 2015, 11–21.
  [30] L. Shi, S. Guo, S. Mao, C. Yue, M. Li, and K. Asaka, Development of an amphibious turtle-inspired spherical mother robot, Journal of Bionic Engineering, 10(4), 2013, 446–455.
  [31] L. Shi, Y. He, and S. Guo, Skating motion analysis of the amphibious quadruped mother robot, Proc. IEEE Int. Conf. Mechatronics and Automation, Takamatsu, Japan, 4–7 August 2013, 1749–1754.
  [32] C. Yue, S. Guo, M. Li, Y. Li, H. Hirata, and H. Ishihara, Mechatronic system and experiments of a spherical underwater robot: SUR-II, Journal of Intelligent and Robotic Systems, 2015. DOI: 10.1007/s10846-015-0177-3.
  [33] Q. Fu, S. Guo, Y. Yamauchi, H. Hirata, and H. Ishihara, A novel hybrid microrobot using rotational magnetic field for medical applications, Biomedical Microdevices, 17(2), 2015. DOI: 10.1007/s10544-015-9942-0.
  [34] C. Yue, S. Guo, and L. Shi, Design and performance evaluation of a biomimetic microrobot for the father–son underwater intervention robotic system, Microsystem Technologies, 22(4), 2016, 831–841.
  [35] Y. He, S. Guo, L. Shi, S. Pan, and Z. Wang, 3D printing technology-based an amphibious spherical underwater robot, Proc. 2014 IEEE Int. Conf. Mechatronics and Automation, Tianjin, China, 2014, 1382–1387.
  [36] Y. He, L. Shi, S. Guo, S. Pan, and Z. Wang, Preliminary mechanical analysis of an improved amphibious spherical father robot, Microsystem Technologies, 2015, 1–16. DOI: 10.1007/s00542-015-2504-9.
  [37] S. Pan, S. Guo, L. Shi, Y. He, Z. Wang, and Q. Huang, A spherical robot based on all programmable SoC and 3-D printing, Proc. IEEE Int. Conf. Mechatronics and Automation, Tianjin, China, 2015, 150–155.
  [38] S. Guo, Y. He, L. Shi, S. Pan, K. Tang, R. Xiao, and P. Guo, Modal and fatigue analysis of critical components of an amphibious spherical robot, Microsystem Technologies, 2016, 1–15. DOI: 10.1007/s00542-016-3083-0.
  [39] S. Guo, Y. He, L. Shi, S. Pan, R. Xiao, K. Tang, and P. Guo, Modeling and experimental evaluation of an improved amphibious robot with compact structure, Robotics and Computer-Integrated Manufacturing, 51, 2018, 37–52.
  [40] X. Zhou, Q. Qian, Y. Ye, and C. Wang, Improved TLD visual target tracking algorithm, Journal of Image and Graphics, 18(9), 2013, 1115–1123.
  [41] T. Xu, C. Huang, Q. He, Q.G. Guang, and Y. Zhang, An improved TLD target tracking algorithm, Proc. IEEE Int. Conf. Information and Automation, Macau, China, 2017, 2051–2055.
  [42] M. Cheng, Z. Zhang, W. Lin, and P. Torr, Binarized normed gradients for objectness estimation at 300 fps, Proc. IEEE Conf. Computer Vision and Pattern Recognition, Columbus, OH, 23–28 June 2014, 3286–3293.
  [43] B. Alexe, T. Deselaers, and V. Ferrari, Measuring the objectness of image windows, IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(11), 2012, 2189–2202.
  [44] S. Cheng, Y. Cao, J. Sun, G. Liu, and G. Han, Efficient target tracking by TLD based on binary normed gradients, Optics and Precision Engineering, 23(8), 2015, 2339–2348.
  [45] K. Zhang, L. Zhang, and M. Yang, Fast compressive tracking, IEEE Transactions on Pattern Analysis and Machine Intelligence, 36(10), 2014, 2002–2015.
