Publication Date

1998

Abstract

This work describes the architecture of an integrated multi-modal sensory (vision and touch) computational system. To integrate haptic and visual information processing, we propose an approach based on robotics control theory and motivated by biology and developmental psychology. We present results obtained in simulation and discuss the implementation of this system on a platform consisting of an articulated stereo head and an arm, which is currently under development.

Comments

This paper was harvested from CiteSeer
