Detail View

Multivariate CNN Model for Human Locomotion Activity Recognition with a Wearable Exoskeleton Robot

DC Field Value Language
dc.contributor.author Son, Chang-Sik -
dc.contributor.author Kang, Won-Seok -
dc.date.accessioned 2024-01-02T20:10:12Z -
dc.date.available 2024-01-02T20:10:12Z -
dc.date.created 2023-09-15 -
dc.date.issued 2023-09 -
dc.identifier.issn 2306-5354 -
dc.identifier.uri http://hdl.handle.net/20.500.11750/47524 -
dc.description.abstract This study introduces a novel convolutional neural network (CNN) architecture, encompassing both single- and multi-head designs, developed to identify a user’s locomotion activity while using a wearable lower limb robot. Our research involved 500 healthy adult participants in an activities of daily living (ADL) space, conducted from 1 September to 30 November 2022. We collected prospective data to identify five locomotion activities (level ground walking, stair ascent/descent, and ramp ascent/descent) across three terrains: flat ground, staircase, and ramp. To evaluate the predictive capabilities of the proposed CNN architectures, we compared their performance with that of three other models: one CNN and two hybrid models (CNN-LSTM and LSTM-CNN). Experiments were conducted using multivariate signals of various types obtained from electromyograms (EMGs) and the wearable robot. Our results reveal that the deeper CNN architecture significantly surpasses the performance of the three competing models. The proposed model, leveraging encoder data such as hip angles and velocities, along with postural signals such as roll, pitch, and yaw from the wearable lower limb robot, achieved superior performance with an inference speed of 1.14 s. Specifically, the F-measure of the proposed model reached 96.17%, compared to 90.68% for DDLMI, 94.41% for DeepConvLSTM, and 95.57% for LSTM-CNN. © 2023 by the authors. -
dc.language English -
dc.publisher MDPI AG -
dc.title Multivariate CNN Model for Human Locomotion Activity Recognition with a Wearable Exoskeleton Robot -
dc.type Article -
dc.identifier.doi 10.3390/bioengineering10091082 -
dc.identifier.wosid 001077096000001 -
dc.identifier.scopusid 2-s2.0-85172268284 -
dc.identifier.bibliographicCitation Son, Chang-Sik. (2023-09). Multivariate CNN Model for Human Locomotion Activity Recognition with a Wearable Exoskeleton Robot. Bioengineering, 10(9), 1082. doi: 10.3390/bioengineering10091082 -
dc.description.isOpenAccess TRUE -
dc.subject.keywordAuthor human activity recognition -
dc.subject.keywordAuthor hyperparameter optimization -
dc.subject.keywordAuthor multi-head CNN -
dc.subject.keywordAuthor single-head CNN -
dc.subject.keywordAuthor time series classification -
dc.subject.keywordAuthor wearable robot -
dc.citation.number 9 -
dc.citation.startPage 1082 -
dc.citation.title Bioengineering -
dc.citation.volume 10 -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.relation.journalResearchArea Biotechnology & Applied Microbiology; Engineering -
dc.relation.journalWebOfScienceCategory Biotechnology & Applied Microbiology; Engineering, Biomedical -
dc.type.docType Article -
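The abstract above describes a multi-head CNN applied to multivariate time-series signals (e.g. hip angles, velocities, and roll/pitch/yaw) for five-class locomotion recognition. The paper's actual architecture is not reproduced in this record; the following is a minimal numpy sketch of the multi-head idea only — parallel 1D convolutions with different kernel sizes over a multichannel signal, globally pooled and concatenated before a linear classifier. The channel count, window length, kernel sizes, and head widths are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid 1D convolution. x: (channels, time), w: (out, channels, k) -> (out, time-k+1)."""
    C, T = x.shape
    O, _, k = w.shape
    out = np.empty((O, T - k + 1))
    for o in range(O):
        for t in range(T - k + 1):
            out[o, t] = np.sum(w[o] * x[:, t:t + k])
    return out

def head(x, w):
    """One CNN head: conv -> ReLU -> global average pooling over time."""
    h = np.maximum(conv1d(x, w), 0.0)
    return h.mean(axis=1)  # (out_channels,)

# Hypothetical input: 7 signal channels, 128-sample window (assumed shapes).
x = rng.standard_normal((7, 128))

# One head per kernel size; each head learns features at a different temporal scale.
heads = [rng.standard_normal((8, 7, k)) * 0.1 for k in (3, 5, 7)]
features = np.concatenate([head(x, w) for w in heads])  # (3 heads * 8 filters = 24,)

# Linear classifier over the concatenated head features -> 5 locomotion classes.
W_out = rng.standard_normal((5, features.size)) * 0.1
logits = W_out @ features
probs = np.exp(logits - logits.max())
probs /= probs.sum()  # softmax over the 5 activity classes
```

In a trained model the convolution and classifier weights would be learned (the paper also reports hyperparameter optimization); here random weights merely demonstrate the data flow from multichannel window to class probabilities.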

File Downloads

  • There are no files associated with this item.


Related Researcher

Kang, Won-Seok (강원석)

Division of Intelligent Robotics
