Learning Manipulation Graphs from Demonstrations Using Multimodal Sensory Signals

Complex contact manipulation tasks can be decomposed into sequences of motor primitives. Individual primitives often terminate in a distinct contact state, such as a screwdriver tip seated in a screw head or a screw loosened by twisting. To achieve robust execution, the robot should be able to verify that a primitive's goal has been reached and to disambiguate it from erroneous contact states. In this paper, we introduce and evaluate a framework that autonomously constructs manipulation graphs from task demonstrations. Our manipulation graphs capture both the sequences of motor primitives for performing a manipulation task and the corresponding contact state information. The sensory models learned for these contact states allow the robot to verify the goal of each motor primitive and to detect erroneous contact changes. The proposed framework was experimentally evaluated on grasping, unscrewing, and insertion tasks using a Barrett arm and hand equipped with two BioTac sensors. The results of our experiments indicate that the learned manipulation graphs enable more robust task execution by confirming sensory goals and by discovering and detecting novel failure modes.
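
The abstract describes manipulation graphs only at a high level. As a minimal illustration (not the paper's implementation), the following Python sketch models graph nodes as contact states with diagonal-Gaussian sensory models and edges as motor primitives, and classifies a post-execution observation as the expected goal state, a known erroneous state, or a novel failure. All names here (ContactState, ManipulationGraph, verify, the novelty threshold) are our own illustrative assumptions; the paper itself learns its sensory models from multimodal tactile and proprioceptive signals.

    import numpy as np

    class ContactState:
        """Graph node: a contact state with a diagonal-Gaussian sensory model.

        The per-dimension mean/std over multimodal feature vectors (e.g.,
        tactile pressure and fingertip forces) stands in for the learned
        sensory models described in the abstract.
        """
        def __init__(self, name, feature_samples):
            samples = np.asarray(feature_samples, dtype=float)
            self.name = name
            self.mean = samples.mean(axis=0)
            self.std = samples.std(axis=0) + 1e-6  # avoid division by zero

        def log_likelihood(self, features):
            # Log density of a diagonal Gaussian, summed over feature dimensions.
            z = (np.asarray(features, dtype=float) - self.mean) / self.std
            return float(-0.5 * np.sum(z ** 2 + np.log(2.0 * np.pi * self.std ** 2)))

    class ManipulationGraph:
        """Edges are motor primitives; executing one should reach its goal state."""
        def __init__(self):
            self.states = {}   # name -> ContactState
            self.edges = {}    # (from_state, primitive) -> goal state name

        def add_state(self, state):
            self.states[state.name] = state

        def add_primitive(self, from_state, primitive, goal_state):
            self.edges[(from_state, primitive)] = goal_state

        def verify(self, from_state, primitive, observed, novelty_threshold=-50.0):
            """Classify the contact state observed after running a primitive.

            Returns (label, reached_goal). If the observation is unlikely under
            every known sensory model, it is flagged as a candidate novel
            failure mode that could be added to the graph.
            """
            goal = self.edges[(from_state, primitive)]
            scores = {name: s.log_likelihood(observed)
                      for name, s in self.states.items()}
            best = max(scores, key=scores.get)
            if scores[best] < novelty_threshold:
                return "novel-failure", False
            return best, best == goal

A hypothetical usage, with synthetic feature data standing in for real sensor readings:

    # Verify an insertion primitive's goal from a 4-D feature observation.
    rng = np.random.default_rng(0)
    graph = ManipulationGraph()
    graph.add_state(ContactState("inserted", rng.normal(1.0, 0.1, size=(20, 4))))
    graph.add_state(ContactState("missed-hole", rng.normal(0.0, 0.1, size=(20, 4))))
    graph.add_primitive("aligned", "insert", goal_state="inserted")
    label, reached_goal = graph.verify("aligned", "insert", rng.normal(1.0, 0.1, size=4))

The fixed likelihood threshold is a deliberate simplification: an observation that no known state model explains well is the sketch's stand-in for the paper's discovery of novel failure modes.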
