User-prosthesis interface for upper limb prosthesis based on object classification

J Fajardo, V Ferman, A Muñoz… - 2018 Latin American …, 2018 - ieeexplore.ieee.org
The complexity of User-Prosthesis Interfaces (UPIs) for controlling and selecting the different grip modes and gestures of active upper-limb prostheses, together with the issues associated with electromyography (EMG) and the long periods of training and adaptation required, leads many amputees to stop using the device. Moreover, high development costs and challenging research make the final product too expensive for the vast majority of transradial amputees and often leave the amputee with an interface that does not satisfy their needs. EMG-controlled multi-grasp prostheses typically map the detection of a specific contraction of a muscle group, which is itself difficult, to one type of grasp, limiting the number of possible grasps to the number of distinguishable muscular contractions. To reduce costs and facilitate user-system interaction in a customized way, we propose a hybrid UPI based on object classification from images and EMG, integrated with a 3D-printed upper-limb prosthesis and controlled by an Android smartphone application. This approach allows easy system updates and lowers the cognitive effort required from the user, satisfying a trade-off between functionality and low cost. The user can thus access a virtually unlimited set of predefined grips, gestures, and action sequences by taking a picture of the object to interact with, using only four muscle contractions to validate and actuate a suggested type of interaction. Experimental results showed good mechanical performance of the prosthesis when interacting with everyday objects, as well as high accuracy and responsiveness of the controller and classifier.
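To illustrate the interaction flow the abstract describes (classify the photographed object, suggest a grip, then let the user validate it with a muscle contraction), here is a minimal sketch in Kotlin, matching the Android context mentioned in the paper. The grip labels, the object-to-grip lookup, and the threshold-based contraction detector are illustrative assumptions, not the authors' implementation.

```kotlin
// Hypothetical sketch of the object-classification-to-grip pipeline.
// Class names, grip labels, and thresholds are assumptions for illustration.

enum class GripMode { POWER, PINCH, TRIPOD, LATERAL, POINT }

// Assumed lookup from classifier output labels to suggested grip modes.
val gripForObject = mapOf(
    "bottle" to GripMode.POWER,
    "coin" to GripMode.PINCH,
    "pen" to GripMode.TRIPOD,
    "key" to GripMode.LATERAL
)

// Placeholder contraction detector: a contraction is registered when the
// rectified, smoothed EMG amplitude crosses an assumed threshold.
fun detectContraction(emgAmplitude: Double, threshold: Double = 0.6): Boolean =
    emgAmplitude > threshold

// The system suggests a grip from the classified label; the user confirms it
// with a muscle contraction before the prosthesis actuates the grip.
fun suggestAndConfirm(label: String, emgEnvelope: List<Double>): GripMode? {
    val suggestion = gripForObject[label] ?: return null
    val confirmed = emgEnvelope.any { detectContraction(it) }
    return if (confirmed) suggestion else null
}

fun main() {
    val envelope = listOf(0.1, 0.2, 0.8, 0.3)            // simulated EMG envelope
    println(suggestAndConfirm("bottle", envelope))        // POWER
    println(suggestAndConfirm("coin", listOf(0.1, 0.2)))  // null: not confirmed
}
```

In the actual system, the validation step reportedly uses four distinct muscle contractions to confirm and actuate the suggested interaction; the single-threshold check above stands in for that stage only to keep the sketch short.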