This work describes the architecture of an integrated multi-modal (vision and touch) sensory computational system. We propose an approach based on robotics control theory, motivated by biology and developmental psychology, to integrate haptic and visual information processing. We present results obtained in simulation and discuss the implementation of this system on a platform consisting of an articulated stereo head and an arm, which is currently under development.
Gonçalves, Luiz M. G.; Grupen, Roderic A.; and Oliveira, Antonio A. F., "A Control Architecture for Multi-modal Sensory Integration" (1998). Computer Science Department Faculty Publication Series. 184.
Retrieved from https://scholarworks.umass.edu/cs_faculty_pubs/184