INTEGRATION OF MULTIPLE SENSOR SPACES WITH LIMITED SENSING RANGE AND REDUNDANCY

Yuichi Kobayashi, Eisuke Kurita, and Manabu Gouko

Keywords

Robot motion learning, limited sensing range, multiple sensors, redundant observation

Abstract

Robot sensors sometimes do not have a complete view of a scene because of occlusion or a limited sensing range. Accurately navigating such robots is challenging without integrating the information from multiple sensors in a common world coordinate frame. This paper presents a motion-generation method for a robot equipped with multiple sensors with limited sensing ranges. The proposed method extends the action-observation mapping beyond the sensing range of a sensor on the basis of diffusion-based learning of Jacobian matrices between control inputs and observation variables. Multiple observation spaces are integrated by finding correspondences among the virtually extended observation spaces. When a target observation is given, the robot can use an extended observation space to generate motion from one observation space toward a target specified in another. In addition, a dimension-reduction method based on a nonlinear surface-approximation framework is presented to handle the case in which sensors provide redundant information on the robot's configuration. The proposed framework is verified through two robot tasks: a reaching motion toward the floor using a manipulator, and navigation of a mobile robot around a wall. In both cases, the observation space of a camera with a limited view was extended and appropriate motion trajectories were obtained.
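The core mechanism summarized above, generating motion toward a target observation through a learned Jacobian between control input and observation variables, can be illustrated with a minimal sketch. This is not the authors' implementation: the function `jacobian_step`, the constant Jacobian, and the simulated plant are illustrative assumptions standing in for the diffusion-based learned mapping.

```python
import numpy as np

def jacobian_step(J, y, y_target, gain=0.5):
    """One control step toward a target observation.

    J is an (assumed, here fixed) Jacobian dy/du relating control
    input u to observation y; the step u = gain * pinv(J) (y_target - y)
    moves the observation toward the target.
    """
    return gain * np.linalg.pinv(J) @ (y_target - y)

# Toy example: 2-D observation, 2-D control, hypothetical linear plant.
J = np.array([[1.0, 0.0],
              [0.0, 2.0]])
y = np.array([0.0, 0.0])
y_target = np.array([1.0, 2.0])
for _ in range(20):
    u = jacobian_step(J, y, y_target)
    y = y + J @ u  # simulated plant: observation changes by J @ u
print(np.allclose(y, y_target, atol=1e-3))  # True: error halves each step
```

In the paper's setting, the Jacobian is not constant but learned locally and extrapolated outside the sensing range, so each step would look up a state-dependent estimate of J rather than reuse a fixed matrix.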
