Few-Shot Relation Learning with Attention for EEG-based Motor Imagery Classification
Issued Date
2020-10-28
Citation
An, Sion. (2020-10-28). Few-Shot Relation Learning with Attention for EEG-based Motor Imagery Classification. IEEE/RSJ International Conference on Intelligent Robots and Systems, 10933–10938. doi: 10.1109/IROS45743.2020.9340933
Type
Conference Paper
ISBN
9781728162126
ISSN
2153-0858
Abstract
Brain-Computer Interfaces (BCI) based on Electroencephalography (EEG) signals, in particular motor imagery (MI) data, have received considerable attention and show potential for the design of key technologies in both healthcare and other industries. MI data is generated when a subject imagines movement of limbs and can be used to aid rehabilitation as well as in autonomous driving scenarios. Thus, classification of MI signals is vital for EEG-based BCI systems. Recently, MI EEG classification techniques using deep learning have shown improved performance over conventional techniques. However, due to inter-subject variability, the scarcity of unseen subject data, and low signal-to-noise ratio, extracting robust features and improving accuracy remain challenging. In this context, we propose a novel two-way few-shot network that is able to efficiently learn how to learn representative features of unseen subject categories and how to classify them with limited MI EEG data. The pipeline includes an embedding module that learns feature representations from a set of samples, an attention mechanism for key signal feature discovery, and a relation module for final classification based on relation scores between a support set and a query signal. In addition to the unified learning of feature similarity and a few-shot classifier, our method emphasizes informative features in the support data that are relevant to the query data, which generalizes better to unseen subjects. For evaluation, we used the BCI competition IV 2b dataset and achieved a 9.3% accuracy improvement in the 20-shot classification task with state-of-the-art performance. Experimental results demonstrate the effectiveness of employing attention and the overall generality of our method. © 2020 IEEE
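The abstract's pipeline (embedding module → attention over the support set → relation scores between support and query) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the linear-ReLU embedding, the scaled dot-product attention, and the cosine-similarity relation score are all stand-in assumptions for the learned modules described above, and the toy 2-way 5-shot episode uses random vectors in place of real EEG features.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(x, W):
    # Embedding module (stand-in): linear projection followed by ReLU.
    return np.maximum(x @ W, 0.0)

def attention(support, query):
    # Attention mechanism (stand-in): softmax over scaled dot products,
    # weighting each support embedding by its similarity to the query.
    scores = support @ query / np.sqrt(query.size)
    w = np.exp(scores - scores.max())
    return w / w.sum()

def relation_scores(support, support_labels, query, n_classes):
    # Relation module (stand-in): cosine similarity between the query and
    # an attention-weighted class prototype built from the support set.
    w = attention(support, query)
    scores = np.full(n_classes, -np.inf)
    for c in range(n_classes):
        mask = support_labels == c
        proto = (w[mask, None] * support[mask]).sum(axis=0)
        proto = proto / (np.linalg.norm(proto) + 1e-8)
        scores[c] = proto @ query / (np.linalg.norm(query) + 1e-8)
    return scores

# Toy 2-way 5-shot episode: 8-dim feature vectors, shared embedding weights.
W = rng.normal(size=(8, 16))
support_x = rng.normal(size=(10, 8))
support_y = np.array([0] * 5 + [1] * 5)
query_x = support_x[2] + 0.1 * rng.normal(size=8)  # perturbed class-0 sample

support_emb = embed(support_x, W)
query_emb = embed(query_x, W)
scores = relation_scores(support_emb, support_y, query_emb, n_classes=2)
pred = int(scores.argmax())  # class with the highest relation score
```

In the paper the relation score is produced by a learned relation network rather than a fixed similarity, but the episode structure is the same: the query is assigned to the support class whose (attention-weighted) representation relates to it most strongly.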
URI
https://scholar.dgist.ac.kr/handle/20.500.11750/58644
DOI
10.1109/IROS45743.2020.9340933
Publisher
IEEE Robotics and Automation Society
Related Researcher

Kim, Soopil (김수필)
Division of Intelligent Robotics