Publication | Closed Access

An Efficient Temporal Network with Dual Self-Distillation for Electroencephalography Signal Classification

Citations: 36
References: 12
Year: 2022

Abstract

Over the years, several deep learning algorithms have been proposed for electroencephalography (EEG) signal classification. The performance of any learning method usually relies on the quality of the learned representation, which provides semantic information for downstream tasks such as classification. Thus, it is crucial to improve the model’s representation learning capability. This paper proposes an Efficient Temporal Network with dual self-distillation for EEG signal classification, termed ETNEEG. It enhances the model’s representation learning by promoting mutual learning between higher-level and lower-level semantic information. The proposed ETNEEG consists of two main components: a parallel dual-network feature extractor called MLN-GRN and a dual self-distillation module. MLN-GRN comprises a multi-scale local network (MLN) and a global relation network (GRN): the MLN attends to local features of the EEG data, while the GRN learns its global patterns. Meanwhile, the dual self-distillation module extracts semantic information through mutual learning between the output layer and the low-level features. To evaluate the proposed method’s performance, seven widely used public EEG datasets, i.e., FaceDetection, FingerMovements, HandMovementDirection, MotorImagery, PenDigits, SelfRegulationSCP1, and SelfRegulationSCP2, are used in the experiments. Experimental results demonstrate that the proposed ETNEEG achieves excellent performance on these datasets compared with fourteen existing algorithms.
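The abstract describes mutual learning between the output layer and low-level features, but gives no formula. A common way to realize such "dual" self-distillation is a symmetric KL-divergence term between the softened predictions of a shallow auxiliary classifier and the final classifier, so that the deep head guides the shallow one and the shallow head regularizes the deep one. The sketch below illustrates that idea only; the function name, the symmetric-KL form, and the 0.5 weighting are assumptions, not the paper's actual loss.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last (class) axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q):
    """KL(p || q) per sample, summed over the class axis."""
    eps = 1e-12  # avoid log(0)
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def dual_self_distillation_loss(shallow_logits, deep_logits):
    """Hypothetical dual self-distillation term: symmetric KL between
    the shallow (low-level) and deep (output-layer) class distributions,
    averaged over the batch. In a real framework the teacher side of
    each direction would be wrapped in a stop-gradient."""
    p_shallow = softmax(shallow_logits)
    p_deep = softmax(deep_logits)
    return 0.5 * (kl_div(p_deep, p_shallow) + kl_div(p_shallow, p_deep)).mean()
```

In training, this term would be added to the ordinary cross-entropy losses of both heads; it vanishes when the two heads already agree, and otherwise pulls their distributions toward each other.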
