Mapping haptic exploratory procedures to multiple shape representations

Research in human haptics has identified a number of exploratory procedures (EPs) that humans use to determine attributes of an object, particularly its shape. This research has served as a paradigm for building an intelligent robotic system that performs shape recognition from touch sensing. In particular, a number of mappings between EPs and shape modeling primitives have been established. The choice of shape primitive for each EP is discussed, and results from experiments with a Utah-MIT dextrous hand system are presented. A vision algorithm that complements active touch sensing for the task of autonomous shape recovery is also presented.