Yun Shi and Yanyan Zhu
[1] W. Fei, L. Chen, H. Xiaoguang, R. Changlei, and L. Jinghong, Visual guidance of welding robots based on welding part recognition and posture estimation, Control and Decision-Making, 35(8), 2020, 1873–1878.
[2] K. Abhishek, Reinforcement learning: Application and advances towards stable control strategies, Mechatronic Systems and Control, 51(1), 2023, 53–57.
[3] Y. Li, C.L. Song, and X.Y. Chao, Modelling and application of a rectifier transformer with primary winding in series in a metallurgical rolling mill system, Mechatronic Systems and Control, 51(4), 2023, 128–192.
[4] W. Xiangming, L. Mingchun, W. Haoren, and Z. Licheng, Automatic feeding robot visual recognition system, Journal of Shenyang University of Technology, 40(5), 2018, 564–570.
[5] J. Baohua, Y. Changkui, Z. Weizheng, and Z. Weiwei, Review of research on fruit recognition in apple orchards based on machine vision, Journal of Light Industry, 34(2), 2019, 71–81.
[6] T. Zhen, C. Guohua, G. Peng, and C. Qi, Identification of intravenous drug dispensing robot medicine bottles based on machine vision and deep learning, Machine Tools and Hydraulics, 50(5), 2022, 33–37.
[7] Z. Huimin, J. Liu, H. Chen, J. Chen, Y. Li, J. Xu, and W. Deng, Intelligent diagnosis using continuous wavelet transform and Gauss convolutional deep belief network, IEEE Transactions on Reliability, 72(2), 2023, 692–702.
[8] T. Souhir, Optimisation of network injected power of an innovated structure of wind turbine, Mechatronic Systems and Control, 51(2), 2023, 67–78.
[9] L. Jing, C. Jinhai, P. Zhixuan, L. Jie, W. Wanneng, and Z. Guangbing, Robot grasping experimental system based on machine vision, Experimental Technology and Management, 39(4), 2022, 45–50.
[10] Y. Hiroaki, T. Hasegawa, K. Nagahama, and M. Inaba, A research of construction method for autonomous tomato harvesting robot focusing on harvesting device and visual recognition, Journal of the Robotics Society of Japan, 36(10), 2018, 693–702.
[11] Z. Dehong and D. Yan, Construction and software development of robot visual handling system, Packaging Engineering, 40(1), 2019, 149–155.
[12] W. Guoyang, G. Wang, K. Xing, Y. Fan, and T. Yi, Robot visual measurement and grasping strategy for rough castings, International Journal of Advanced Robotic Systems, 18(2), 2021, 715–720.
[13] Z. Tianpeng and S. Longfei, Fault analysis of transmission line based on big data algorithm, Mechatronic Systems and Control, 50(4), 2022, 216–223.
[14] Z. Ya, G. Jiahui, and L. Panchi, A median filtering scheme for quantum images, Journal of Electronics and Information, 43(1), 2021, 204–211.
[15] X. Yuchao and L. Zhen, Research on the distribution of magnetic field in reinforced concrete beams after damage based on the force-magnetic coupling model, Mechatronic Systems and Control, 50(3), 2022, 130–137.
[16] Z. Liu, P. Wan, L. Ling, L. Chen, and W. Zhou, Recognition and grabbing system for workpieces exceeding the visual field based on machine vision, Jiqiren/Robot, 40(3), 2018, 294–300+308.
[17] B.A. Gunes, B.A. Pearlmutter, A.A. Radul, and J.M. Siskind, Automatic differentiation in machine learning: A survey, Journal of Machine Learning Research, 18(153), 2018, 1–43.
[18] W. Li, Automatic tracking algorithms based on wearable technology, Mechatronic Systems and Control, 50(1), 2022, 16–21.
[19] A. Kumar, Reinforcement learning: Application and advances towards stable control strategies, Mechatronic Systems and Control, 51(1), 2023, 53–57.
[20] A. Kumar, J.J. Anand, and B.N. Hemanth Kumar, Intrusive video oculographic device: An eye-gaze-based device for communication, Innovation and Emerging Technologies, 9, 2022, 2250002.