Intention recognition method for sit-to-stand and stand-to-sit from electromyogram signals for overground lower-limb rehabilitation robots
- Chung, Sang Hun; Lee, Jong Min; Kim, Seung Jong; Hwang, Yo Ha; An, Jin Ung
- DGIST Authors
- An, Jin Ung
- Issue Date
- IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2015, 2015-August, 418-421
- Article Type
- Conference Paper
- This paper presents a framework for classifying sit-to-stand and stand-to-sit transitions from just two channels of EMG signals recorded from the left leg. Our proposed framework uses linear discriminant analysis (LDA) as the classifier and a multi-window feature extraction approach termed Consecutive Time-Windowed Feature Extraction (CTFE). We present preliminary results from two healthy subjects as a proof of concept. For both tested subjects, we obtained predictive accuracies above 90%. The results show promise for a framework capable of recognizing the user's intention of sit-to-stand and stand-to-sit. Potential applications include rehabilitation robots for hemiparesis patients and exoskeleton control. © 2015 IEEE.
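The general pipeline the abstract describes (consecutive time-windowed features from multichannel EMG, fed to an LDA classifier) can be sketched as follows. This is an illustrative sketch only, on synthetic data: the window length, sampling rate, and feature choices (mean absolute value and waveform length, a common EMG feature pair) are assumptions, not the authors' CTFE implementation.

```python
# Sketch: consecutive time-windowed EMG features + LDA classification.
# All parameters and the data generator are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
FS = 1000   # assumed sampling rate (Hz)
WIN = 200   # assumed window length (samples)

def window_features(emg, win=WIN):
    """Mean absolute value and waveform length per consecutive window,
    concatenated across channels. emg has shape (channels, samples)."""
    feats = []
    for ch in emg:
        for start in range(0, ch.shape[0] - win + 1, win):
            seg = ch[start:start + win]
            feats.append(np.mean(np.abs(seg)))       # MAV
            feats.append(np.sum(np.abs(np.diff(seg))))  # waveform length
    return np.array(feats)

def make_trial(label):
    """Synthetic two-channel, 1 s EMG trial; class 1 has larger bursts."""
    amp = 1.0 if label == 1 else 0.3
    return amp * rng.standard_normal((2, FS))

# Build a small synthetic training set (two classes, 20 trials each).
X, y = [], []
for label in (0, 1):
    for _ in range(20):
        X.append(window_features(make_trial(label)))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.score(X, y))  # training accuracy on the synthetic data
```

With clearly separated amplitudes, LDA separates the two synthetic classes easily; real sit-to-stand / stand-to-sit EMG would of course be far noisier and require per-subject calibration.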
- Institute of Electrical and Electronics Engineers Inc.
- Related Researcher
Brain Robot Interaction Lab
- ETC2. Conference Papers