Surface EMG for force control of mechanical hands

The dexterity of active hand prostheses is restricted not only by the scarcity of dexterous prosthetic hands, but mainly by limitations of the human-machine interface. How is an amputee supposed to tell the prosthesis what to do (i.e., how to grasp an object) and with what force (e.g., holding a hammer versus grasping an egg)? So far, the most interesting results in the literature have been achieved by applying machine learning to forearm surface electromyography (EMG) to classify finger movements; this approach, however, generally does not allow the force applied during grasping to be determined quantitatively. In this paper we address the issue by applying machine learning to the problem of regressing from the EMG signal to the force a human subject applies to a force sensor. A detailed comparative analysis among three different machine learning approaches (Neural Networks, Support Vector Machines, and Locally Weighted Projection Regression) reveals that the type of grasp can be reconstructed with an average accuracy of 90%, and that the applied force can be predicted with an average error of 10%, corresponding to about 5 N over a range of 50 N. None of the tested approaches clearly outperforms the others, suggesting that machine learning as a whole is a viable approach.
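
To make the regression setup concrete, the following is a minimal sketch of the kind of pipeline described above: windowed surface-EMG channels are reduced to amplitude features and a regressor maps them to grasp force. It is an illustration only, not the paper's implementation; the channel count, window length, RMS features, the synthetic data, and the choice of an RBF-kernel Support Vector Regressor (any of the three compared methods could stand in its place) are all assumptions.

```python
# Illustrative sketch: regression from surface-EMG features to grasp force.
# All constants and the synthetic data are assumptions for demonstration,
# not the experimental setup of the paper.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

N_CHANNELS = 10   # hypothetical number of forearm electrodes
WIN = 200         # hypothetical samples per analysis window

def rms_features(window):
    """Root-mean-square amplitude of each EMG channel in one window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

# Synthetic stand-in for recorded data: EMG amplitude loosely scales
# with the force applied to the sensor (0-50 N), plus noise.
n_windows = 500
forces = rng.uniform(0.0, 50.0, size=n_windows)
emg = rng.standard_normal((n_windows, WIN, N_CHANNELS)) * (1.0 + forces[:, None, None] / 50.0)

X = np.array([rms_features(w) for w in emg])   # shape: (n_windows, N_CHANNELS)
y = forces

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF-kernel SVR with feature standardization; a neural network or LWPR
# model could be substituted for the same regression problem.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print(f"mean absolute error: {mean_absolute_error(y_te, pred):.2f} N")
```

In this framing, comparing the three methods amounts to swapping the regressor in the pipeline while keeping the feature extraction and evaluation fixed, which is the sense in which the comparative analysis treats them as interchangeable learners on the same problem.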