
Publication | Open Access

GraphSleepNet: Adaptive Spatial-Temporal Graph Convolutional Networks for Sleep Stage Classification

Citations: 210 · References: 20 · Year: 2020

TLDR

Sleep stage classification is essential for sleep assessment and disease diagnosis, yet effectively leveraging brain spatial features and transition information remains challenging due to limited knowledge of brain connectivity. The study proposes GraphSleepNet, a deep graph neural network for automatic sleep stage classification. GraphSleepNet adaptively learns an adjacency matrix of EEG channel connections and applies a spatial‑temporal graph convolution network that extracts spatial features and captures stage transition rules. Experiments on the MASS dataset show that GraphSleepNet outperforms state‑of‑the‑art baselines.

Abstract

Sleep stage classification is essential for sleep assessment and disease diagnosis. However, effectively utilizing brain spatial features and transition information among sleep stages remains challenging. In particular, owing to limited knowledge of the human brain, predefining a suitable spatial brain connection structure for sleep stage classification remains an open question. In this paper, we propose a novel deep graph neural network, named GraphSleepNet, for automatic sleep stage classification. The main advantage of GraphSleepNet is that it adaptively learns the intrinsic connections among different electroencephalogram (EEG) channels, represented by an adjacency matrix, thereby best serving the spatial-temporal graph convolutional network (ST-GCN) for sleep stage classification. Meanwhile, the ST-GCN consists of graph convolutions for extracting spatial features and temporal convolutions for capturing the transition rules among sleep stages. Experiments on the Montreal Archive of Sleep Studies (MASS) dataset demonstrate that GraphSleepNet outperforms state-of-the-art baselines.
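To make the architecture described above concrete, here is a minimal NumPy sketch of the two core ideas: learning an adjacency matrix from channel features rather than predefining it, and combining a graph convolution (spatial) with a convolution over neighboring epochs (temporal). All dimensions, weights, and the pairwise-difference scoring function are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: C EEG channels, F features per channel,
# T neighboring sleep epochs forming the temporal context.
C, F, T = 8, 16, 5

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_adjacency(x, w):
    """Learn channel connections from the data: score every channel pair
    from their feature differences, then row-normalize with a softmax so
    each row of A sums to 1 (an assumed scoring scheme)."""
    diff = np.abs(x[:, None, :] - x[None, :, :])   # (C, C, F) pairwise features
    scores = diff @ w                              # (C, C) scalar scores
    return softmax(scores, axis=-1)                # adjacency matrix A

def graph_conv(x, a, w):
    """One graph convolution step: aggregate neighbor features via A,
    then project with weights W."""
    return a @ x @ w                               # (C, F_out)

def temporal_conv(h_seq, k):
    """Convolution over the epoch axis (valid padding) to capture
    transition patterns among neighboring sleep stages; a simple
    averaging kernel stands in for learned temporal filters."""
    T_, C_, F_ = h_seq.shape
    out = np.zeros((T_ - k + 1, C_, F_))
    for t in range(T_ - k + 1):
        out[t] = h_seq[t:t + k].mean(axis=0)
    return out

x_seq = rng.normal(size=(T, C, F))                 # per-epoch channel features
w_adj = rng.normal(size=F)                         # adjacency-scoring weights
w_gc = rng.normal(size=(F, F))                     # graph-conv weights

h_seq = np.stack([graph_conv(x, adaptive_adjacency(x, w_adj), w_gc)
                  for x in x_seq])                 # spatial step per epoch
h_out = temporal_conv(h_seq, k=3)                  # temporal step across epochs
print(h_out.shape)                                 # (3, 8, 16)
```

In a trained model, `w_adj` and `w_gc` would be optimized jointly with the classifier, so the adjacency matrix adapts to whatever channel connectivity best supports sleep staging.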

