BiGS: BioTac Grasp Stability Dataset

Autonomous grasping of unknown objects is a fundamental requirement for robots performing manipulation tasks in real-world environments. Despite considerable progress in grasping research, it remains an open challenge, and even state-of-the-art grasping methods can fail [4]. Reliable prediction of grasp stability helps to avoid such failures and makes it possible to re-grasp the object safely. Since the majority of grasping failures occur at the contact points, which are occluded from vision systems, tactile feedback plays a major role in predicting grasp stability [1]. The human-inspired biomimetic tactile sensor (BioTac) [2] is equipped with a 19-electrode array and a hydroacoustic sensor, surrounded by a silicone skin inflated with an incompressible, conductive liquid. This design provides rich tactile feedback similar to that of the slowly-adapting and fast-adapting afferents in human skin [5]. Recent developments in classification algorithms [3] allow us to explore the potential of large amounts of data from these sensors. Our goal is to provide a publicly accessible grasp-stability dataset collected with BioTac sensors and thereby enable further development of algorithms capable of reliable grasp stability prediction.
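
To make the sensor's output concrete, the sketch below shows how one might assemble simple features from BioTac-style signals (the 19 electrode impedances plus the static and dynamic pressure channels of the hydroacoustic sensor) and train a baseline grasp-stability classifier. This is a minimal illustration only: the array shapes, channel names, and synthetic data are assumptions for demonstration, not the actual BiGS schema or the authors' method.

```python
# Minimal sketch (not the official BiGS loader): a baseline grasp-stability
# classifier on BioTac-style features. All shapes and the synthetic data
# below are illustrative assumptions, not the actual dataset schema.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_grasps, n_electrodes, n_timesteps = 500, 19, 100  # 19-electrode array; window length is assumed

# Hypothetical per-grasp signals: electrode impedances plus the static (PDC)
# and dynamic (PAC) pressure channels of the hydroacoustic sensor.
electrodes = rng.normal(size=(n_grasps, n_electrodes, n_timesteps))
pdc = rng.normal(size=(n_grasps, n_timesteps))   # low-frequency pressure
pac = rng.normal(size=(n_grasps, n_timesteps))   # high-frequency vibration
labels = rng.integers(0, 2, size=n_grasps)       # 1 = stable grasp, 0 = failure

def summarize(x):
    """Per-channel mean and standard deviation over the time axis."""
    return np.concatenate([x.mean(axis=-1), x.std(axis=-1)], axis=-1)

# Flatten all channel summaries into one feature vector per grasp.
features = np.concatenate(
    [summarize(electrodes).reshape(n_grasps, -1),
     summarize(pdc[:, None, :]).reshape(n_grasps, -1),
     summarize(pac[:, None, :]).reshape(n_grasps, -1)],
    axis=1,
)

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real tactile data, such hand-crafted summary statistics are only a starting point; spatio-temporal feature learning such as ST-HMP [3] typically performs better.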

[1] Z. Su et al., "Force estimation and slip detection/classification for grip control using a biomimetic tactile sensor," 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), 2015.

[2] N. Wettels et al., "Biomimetic Tactile Sensor Array," Advanced Robotics, 2008.

[3] M. Madry et al., "ST-HMP: Unsupervised Spatio-Temporal feature learning for tactile data," 2014 IEEE International Conference on Robotics and Automation (ICRA), 2014.

[4] J. Bohg et al., "Data-Driven Grasp Synthesis — A Survey," IEEE Transactions on Robotics, 2013.

[5] R. S. Johansson and J. R. Flanagan, "Coding and use of tactile signals from the fingertips in object manipulation tasks," Nature Reviews Neuroscience, 2009.