F.F. Khalil, P. Payeur, and A.-M. Cretu (Canada)
Sensor fusion, deformable objects, elastic properties, dexterous manipulation.
Designing a dexterous robotic hand able to interact intelligently with deformable objects constitutes a challenging area of research where many issues are yet to be solved. The complexity of such interactions requires the assistance of intelligent multisensory robotic systems that combine measurements collected from different sensors in order to accurately plan the forces to be applied to the deformable object. This paper presents the development of a real-time multisensory robotic hand platform that incorporates live measurements of its internal position, velocity, and force parameters along with data from external tactile sensors and a stereoscopic vision device. The resulting prototype of the integrated multisensory system is validated experimentally by computing deformable-object models into which the measurements are merged. A formal dynamic model is discussed, and a neural network representation model is presented. The results demonstrate the performance and suitability of the multisensory platform for the development of enhanced robotic hand capabilities when manipulating deformable objects.
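Since the abstract refers to a neural network representation model of the deformable object, the following minimal sketch (Python with NumPy, not the authors' implementation) illustrates one way such a representation could be learned: a small feedforward network fitted to fused contact-position/force samples so that the local deformation depth can be queried at new probe points. The variable names, the synthetic training data, and the elastic-like target function are all assumptions introduced purely for illustration.

# Hypothetical sketch: learn a deformable-object surface model from fused
# sensor samples. Inputs are assumed to be (contact position x, y; normal
# force) and the target is the measured local deformation depth.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for fused tactile/vision measurements:
# columns: x, y contact position (m) and applied normal force (N).
X = rng.uniform([-0.05, -0.05, 0.0], [0.05, 0.05, 5.0], size=(500, 3))
# Assumed elastic-like response, used only to generate illustrative targets.
y = (X[:, 2:3] / 5.0) * np.exp(-(X[:, 0:1]**2 + X[:, 1:2]**2) / 0.001)

# One hidden tanh layer, trained by plain gradient descent on squared error.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)      # hidden activations
    pred = h @ W2 + b2            # predicted deformation depth
    err = pred - y
    # Backpropagate the squared-error gradient through both layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Query the learned representation at a new probe point.
probe = np.array([[0.01, 0.0, 2.5]])
print((np.tanh(probe @ W1 + b1) @ W2 + b2).item())

In the actual platform described by the paper, the training pairs would presumably come from the merged tactile, proprioceptive, and stereoscopic measurements rather than from synthetic data as above.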