EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm
Abstract
© 2018 IOS Press and the authors. This study focuses on a myoelectric interface that controls a robotic manipulator via the neuromuscular electrical signals generated when humans make hand gestures. The proposed system recognizes, in real time, dynamic hand motions that change the shape, pose, and configuration of the hand over time. Varying muscle force controls the activation/inactivation modes, and gradients of limb orientation determine the movement directions of the robot arm. Classified dynamic motions are used to change the control states of the HCI system. The performance of the myoelectric interface was measured in terms of real-time classification accuracy, path efficiency, and time-related measures, and its usability was compared to that of a button-based jog interface. A total of sixteen human subjects participated. The average real-time classification accuracy of the myoelectric interface was over 95.6%. For the majority of subjects, the path efficiency of the myoelectric interface was similar to that of the jog interface, while the jog interface outperformed the myoelectric interface on the time-related measures. However, considering the overall advantages of the myoelectric interface, the decrease in time-related performance may be offset.
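The control scheme described in the abstract can be sketched as a simple loop: a classified gesture switches the control state, the EMG amplitude gates activation, and the gradient of the IMU-measured limb orientation sets the motion direction. The following is a minimal illustrative sketch only; the threshold value, envelope choice, state names, and function shapes are assumptions, not the paper's actual implementation.

```python
import numpy as np

# Assumed normalized EMG envelope level above which the arm is "active".
ACTIVATION_THRESHOLD = 0.3

def emg_envelope(emg_window):
    """Mean absolute value of a raw EMG window (a common amplitude envelope)."""
    return float(np.mean(np.abs(emg_window)))

def direction_from_imu(prev_orientation, curr_orientation):
    """Gradient of limb orientation (e.g. roll, pitch, yaw) -> unit direction."""
    grad = np.asarray(curr_orientation, dtype=float) - np.asarray(prev_orientation, dtype=float)
    norm = np.linalg.norm(grad)
    return grad / norm if norm > 0 else np.zeros(3)

def control_step(state, emg_window, prev_ori, curr_ori, gesture_class):
    """One control cycle: gesture switches state; EMG gates motion; IMU steers.

    Returns the (possibly updated) control state and a direction command.
    """
    if gesture_class is not None:      # a classified dynamic hand motion
        state = gesture_class          # e.g. switch 'translate' <-> 'rotate'
    if emg_envelope(emg_window) < ACTIVATION_THRESHOLD:
        return state, np.zeros(3)      # muscles relaxed: robot arm holds still
    return state, direction_from_imu(prev_ori, curr_ori)
```

This separates the three roles the abstract assigns to the signals: gesture classification changes *what* the arm does, muscle force decides *whether* it moves, and limb orientation decides *where* it moves.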