Kang Li, Xiaoguang Zhao, Shiying Sun, and Min Tan
Robust tracking and following, convolution operation, Bayesian framework, PID control, Microsoft Kinect, mobile robot
In this paper, we present a novel and robust visual tracking algorithm that provides accurate target positions for mobile robot following control. To identify and localize the tracked target in consecutive frames, the spatio-temporal relationship between the object of interest and its local context is formulated within a Bayesian framework, and the best target location is determined by computing a confidence map that maximizes an object location likelihood. Specifically, a convolution operation on a middle-level feature space is used to measure the similarity between the target and its surrounding regions, and the convolution theorem is applied to speed up detecting and locating the tracked target. Based on the proposed tracking algorithm, a robust target tracker is designed for a mobile robot to estimate the image position of the target. Combined with the depth information captured by a Microsoft Kinect and a conventional proportional-integral-derivative (PID) controller for the mobile robot, a robust target tracking and following system is developed that integrates accurate tracking with agile following. Extensive experiments on a tracking benchmark demonstrate the strong performance of our tracker, and several real-life tests on the mobile robot platform show that the system performs well despite illumination changes and partial occlusion.
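To make the two key ingredients of the abstract concrete, the sketches below illustrate them under assumed names and data layouts; they are not the authors' implementation. The first shows how the convolution theorem turns the spatial correlation between a context template and a feature map into an element-wise product in the frequency domain, with the confidence-map peak taken as the target location. The single-channel `feature_map` and `context_template` arrays are hypothetical stand-ins for the paper's mid-level features and learned spatio-temporal context.

```python
# Minimal sketch (assumed representation): locate the target by correlating a
# context template with the search-region features via the convolution theorem,
# i.e. FFT -> element-wise product -> inverse FFT, then take the peak of the
# resulting confidence map as the most likely target position.
import numpy as np

def locate_target(feature_map, context_template):
    """Return the (row, col) of the confidence-map peak.

    feature_map      : 2-D array of mid-level features over the search region.
    context_template : 2-D array modelling the spatio-temporal context prior.
    """
    F_feat = np.fft.fft2(feature_map)
    # Zero-pad the template to the feature-map size before transforming.
    F_ctx = np.fft.fft2(context_template, s=feature_map.shape)
    # Convolution theorem: spatial convolution = frequency-domain product.
    confidence_map = np.real(np.fft.ifft2(F_feat * F_ctx))
    # The best target location maximizes the object-location likelihood.
    return np.unravel_index(np.argmax(confidence_map), confidence_map.shape)
```

The second sketch outlines a following controller of the kind the abstract describes: the horizontal image offset of the tracked target drives the angular velocity and the Kinect depth error drives the linear velocity, each through a textbook PID law. The gains, the 1.5 m distance set-point, and the function names are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assumed interface): map tracker output (image column of the
# target and Kinect depth) to linear/angular velocity commands with two PID loops.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def follow_command(target_u, image_width, target_depth_m,
                   heading_pid, distance_pid, dt=0.033, desired_depth_m=1.5):
    """Compute (linear, angular) velocity commands for the mobile robot."""
    # Steer to keep the target centered in the image.
    angular = heading_pid.step(image_width / 2.0 - target_u, dt)
    # Drive to keep the target at the desired following distance.
    linear = distance_pid.step(target_depth_m - desired_depth_m, dt)
    return linear, angular
```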