On Tracking of Eye for Human-Robot Interface

M.A. Bhuiyan, V. Ampornaramveth, S. Muto, and H. Ueno


Keywords: Eye tracking, gaze direction, human-robot interface, colour histogram, smart board, Aibo


This article presents a real-time eye-tracking system for human-robot interaction. From the position and movement of the user's eyes, the system determines where on the display the user is looking. The user makes a selection by moving his or her eyes in a given direction, and these selections are translated into instructions that control the robot through gaze direction. The system works on visual and geometrical information about the user's face extracted from the video stream. It first estimates the face region from the hue component of the image in the HSV colour space, then localizes the eyes within the face skeleton using knowledge of facial geometry. Eye tracking is achieved by computing the optical flow between consecutive frames of the video sequence. Experimental results demonstrate that the system is fast and efficient enough for real-time applications, and the eye tracker has been employed to implement a human-robot interface.
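The face-region step described in the abstract, thresholding the hue component of an HSV representation to find skin-coloured pixels, can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation; the hue band `[0, 40]` degrees and the helper names are assumptions chosen for the example.

```python
import numpy as np

def rgb_to_hue(img):
    """img: H x W x 3 float array in [0, 1]. Returns hue in degrees [0, 360),
    with achromatic pixels (no dominant colour) marked as -1."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    mn = img.min(axis=-1)
    d = mx - mn
    hue = np.full(mx.shape, -1.0)          # -1 marks pixels with undefined hue
    chroma = d > 0
    rm = chroma & (mx == r)                # red channel is the maximum
    gm = chroma & (mx == g) & ~rm          # green channel is the maximum
    bm = chroma & ~rm & ~gm                # blue channel is the maximum
    hue[rm] = (60.0 * (g[rm] - b[rm]) / d[rm]) % 360.0
    hue[gm] = 60.0 * (b[gm] - r[gm]) / d[gm] + 120.0
    hue[bm] = 60.0 * (r[bm] - g[bm]) / d[bm] + 240.0
    return hue

def face_region(img, hue_lo=0.0, hue_hi=40.0):
    """Estimate the face area as the bounding box (y0, y1, x0, x1) of pixels
    whose hue falls in a skin-like band; returns None if no pixel qualifies."""
    hue = rgb_to_hue(img)
    skin = (hue >= hue_lo) & (hue <= hue_hi)
    ys, xs = np.nonzero(skin)
    if ys.size == 0:
        return None
    return ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
```

In the full pipeline the abstract describes, the eyes would then be localized inside this box from facial geometry, and tracked frame-to-frame via optical flow.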
