Publication | Closed Access
A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors
Citations: 600
References: 34
Year: 2011
Chinese Sign Language · Engineering · EMG Sensors · Biometrics · Sign Language Recognition · Wearable Technology · Accelerometer · Speech Recognition · EMG Signals · Kinesiology · Pattern Recognition · Multimodal Interaction · Human Motion · Gesture Processing · Multimodal Human Computer Interface · American Sign Language · Gesture Studies · Health Sciences · Computer Engineering · Computer Science · Gesture Recognition · Human Movement · Activity Recognition · Hand Gesture Recognition
The paper proposes a hand gesture recognition framework that fuses data from a three‑axis accelerometer (ACC) and multichannel EMG sensors. The framework segments gestures automatically from EMG intensity, fuses the ACC and EMG streams with a decision tree and multistream hidden Markov models, and is applied to Chinese Sign Language word and sentence classification as well as to a real‑time Rubik’s cube control system evaluated with ten subjects. Experiments on 72 Chinese Sign Language words and 40 sentences show that ACC–EMG fusion improves recognition accuracy, and the system enables natural, intelligent control in a real‑time gesture‑based interface.
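The EMG‑intensity segmentation described above can be sketched as simple thresholding of a rectified, channel‑averaged envelope. This is an illustrative reconstruction, not the paper's exact method: the function name, the `min_len` debouncing parameter, and the fixed threshold are assumptions.

```python
import numpy as np

def segment_by_emg_intensity(emg, threshold, min_len=5):
    """Detect start/end points of active gesture segments from multichannel EMG.

    emg: array of shape (T, C) -- T samples, C channels (assumed layout).
    threshold: intensity level above which a sample counts as "active".
    min_len: minimum segment length, to discard spurious short bursts.
    Returns a list of (start, end) sample-index pairs.
    """
    # Mean rectified amplitude across channels serves as an intensity envelope.
    intensity = np.mean(np.abs(emg), axis=1)
    active = intensity > threshold

    segments, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i                      # gesture onset
        elif not is_active and start is not None:
            if i - start >= min_len:
                segments.append((start, i))  # gesture offset
            start = None
    if start is not None and len(active) - start >= min_len:
        segments.append((start, len(active)))  # segment runs to end of signal
    return segments
```

In practice the threshold would be calibrated per user from a resting baseline, since EMG amplitude varies across subjects and electrode placements.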
This paper presents a framework for hand gesture recognition based on the information fusion of a three-axis accelerometer (ACC) and multichannel electromyography (EMG) sensors. In our framework, the start and end points of meaningful gesture segments are detected automatically by the intensity of the EMG signals. A decision tree and multistream hidden Markov models are utilized as decision-level fusion to get the final results. For sign language recognition (SLR), experimental results on the classification of 72 Chinese Sign Language (CSL) words demonstrate the complementary functionality of the ACC and EMG sensors and the effectiveness of our framework. Additionally, the recognition of 40 CSL sentences is implemented to evaluate our framework for continuous SLR. For gesture-based control, a real-time interactive system is built as a virtual Rubik's cube game using 18 kinds of hand gestures as control commands. While ten subjects play the game, the performance is also examined in user-specific and user-independent classification. Our proposed framework facilitates intelligent and natural control in gesture-based interaction.
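The decision‑level fusion the abstract mentions can be illustrated as a weighted combination of per‑class scores from the two sensor streams. This is a minimal sketch under stated assumptions: the log‑likelihood inputs stand in for the paper's multistream HMM outputs, and the function name and `w_acc` weighting scheme are hypothetical.

```python
import numpy as np

def fuse_multistream_scores(acc_loglik, emg_loglik, w_acc=0.5):
    """Decision-level fusion of per-class scores from two sensor streams.

    acc_loglik, emg_loglik: arrays of shape (n_classes,) holding each
    gesture class's log-likelihood under the ACC-stream and EMG-stream
    models (illustrative stand-ins for the paper's multistream HMMs).
    w_acc: ACC stream weight in [0, 1]; the EMG stream gets 1 - w_acc.
    Returns the index of the winning gesture class.
    """
    combined = (w_acc * np.asarray(acc_loglik)
                + (1 - w_acc) * np.asarray(emg_loglik))
    return int(np.argmax(combined))
```

Weighting the streams lets the system lean on whichever modality is more discriminative for a given gesture, which is the intuition behind the complementary ACC/EMG behavior the experiments report.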