Hand as Natural Man-Machine Interface in Smart Environments

W. Xie, E.K. Teoh, R. Venkateswarlu, and X. Chen (Singapore)


Keywords: Stereo Vision, Human-Computer Interaction, Hand Postures, Fuzzy Neural Network


In the near future, computers, vision/sound/speech sensors, intelligent adaptive wireless networks, etc. will disappear into the environment, creating smart spaces with embedded information around us. Humans, however, remain the prime users of that embedded information, so interaction with embedded machines calls for human-like, natural interfaces. In this paper, we propose a prototype system for a meeting room that uses active stereo vision to create an interactive digital space (an embedded smart space) in which the hand serves as (i) a laser pointer, (ii) a virtual pen that enables writing on the screen remotely, (iii) a virtual mouse to drag and drop or highlight a particular part of the screen, and (iv) in an auxiliary mode, an eraser for what has been written on the screen. Besides robust tracking of the hand in 3D space, a reliable algorithm is needed to robustly recognize the different hand modes/gestures that trigger these functions. The focus of this paper is a robust algorithm based on a fuzzy neural network (FNN) that distinguishes three modes of hand operation. Our experiments have successfully demonstrated these three modes in real time using an active stereo-vision system.
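The abstract does not detail the FNN architecture or its input features. As an illustration only, the sketch below classifies a hand posture into three operating modes using hand-picked Gaussian fuzzy memberships over two invented shape features (blob extent and aspect ratio); the feature choices, prototype centers, and widths are all assumptions, not the authors' method.

```python
import math

# Hypothetical geometric features of a segmented hand blob:
#   extent = blob area / bounding-box area, aspect = height / width.
# Prototype centers below are invented for illustration only.
PROTOTYPES = {
    "pointer": (0.45, 2.0),   # one extended finger: elongated, sparse blob
    "pen":     (0.60, 1.4),   # pinch grip: moderately compact
    "mouse":   (0.85, 1.0),   # open palm: compact, near-square
}
SIGMA = (0.15, 0.4)  # assumed membership widths for (extent, aspect)

def membership(x, center, sigma):
    """Gaussian fuzzy membership of feature value x in a class."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def classify(extent, aspect):
    """Fire one fuzzy rule per posture (product t-norm), pick the max."""
    scores = {
        name: membership(extent, c[0], SIGMA[0])
              * membership(aspect, c[1], SIGMA[1])
        for name, c in PROTOTYPES.items()
    }
    return max(scores, key=scores.get), scores

posture, _ = classify(0.82, 1.05)
print(posture)  # a compact, near-square blob maps to "mouse"
```

In a real FNN the memberships and rule weights would be learned from labeled posture images rather than fixed by hand; this fragment only shows the fuzzification-rule-decision pipeline that such a network realizes.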
