Publication | Closed Access

A²Tformer: Addressing Temporal Bias and Nonstationarity in Transformer-Based IoT Time Series Classification

Citations: 16 | References: 36 | Year: 2025

Abstract

Sensor devices continuously generate large volumes of time series data in Internet of Things (IoT) environments. These voluminous streams require models that scale to massive data while discerning the intricate, multi-scale patterns embedded in diverse temporal sequences. Transformer models have been widely used for IoT time series analysis due to their strong feature representation and global modeling capability. However, existing architectures struggle to explicitly capture temporal structures and to adapt to non-stationary data, limiting classification performance. To address these issues, we propose a novel attention mechanism based on the autocorrelation function, named A²T, which leverages lag characteristics to unify temporal modeling and feature extraction. We further introduce a Parameterized Wavelet Transform Module that learns scale and bandwidth parameters end-to-end and uses an attention gate to fuse multi-resolution coefficients. Building on this, we design a Dual-Channel Time-Frequency Feature Extraction module to improve adaptability to distribution shifts. Integrating these components, we develop A²Tformer for IoT time series classification. Experimental results on the UCR archive demonstrate that A²Tformer achieves an average accuracy of 84.49% and ranks first on 26 of the datasets, outperforming state-of-the-art Transformer-based models.
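The abstract describes an attention mechanism that scores candidate time lags via the autocorrelation function rather than pointwise query-key products. A minimal sketch of that idea follows, assuming the standard FFT-based computation of autocorrelation (Wiener-Khinchin theorem) and a top-k lag aggregation; the function names (`autocorrelation_scores`, `a2t_attention`) and the `top_k` parameter are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def autocorrelation_scores(q, k):
    """Lag-wise correlation scores via FFT (Wiener-Khinchin).

    q, k: arrays of shape (seq_len, d_model).
    Returns one score per candidate lag, shape (seq_len,).
    """
    seq_len = q.shape[0]
    fq = np.fft.rfft(q, axis=0)
    fk = np.fft.rfft(k, axis=0)
    # Inverse FFT of the cross-spectrum gives circular cross-correlation.
    corr = np.fft.irfft(fq * np.conj(fk), n=seq_len, axis=0)  # (seq_len, d)
    return corr.mean(axis=-1)  # average over channels

def a2t_attention(q, k, v, top_k=3):
    """Hypothetical lag-based attention: softmax-weight the top-k
    autocorrelation lags and aggregate time-shifted copies of v."""
    scores = autocorrelation_scores(q, k)
    lags = np.argsort(scores)[-top_k:]           # k most correlated lags
    w = np.exp(scores[lags] - scores[lags].max())
    w = w / w.sum()                              # softmax over chosen lags
    out = np.zeros_like(v)
    for weight, lag in zip(w, lags):
        out += weight * np.roll(v, -int(lag), axis=0)
    return out
```

For a periodic input, the score vector peaks at multiples of the period, so the aggregation mixes values from positions one period apart, which is the "lag characteristics" intuition the abstract appeals to.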
