

Poster

TSLANet: Rethinking Transformers for Time Series Representation Learning

Emadeldeen Eldele · Mohamed Ragab · Zhenghua Chen · Min Wu · Xiaoli Li


Abstract:

Time series data, characterized by its intrinsic long- and short-range dependencies, poses a unique challenge across analytical applications. While Transformer-based models excel at capturing long-range dependencies, they face limitations in noise sensitivity and computational efficiency, especially on smaller datasets. In response, we introduce a novel Time Series Lightweight Adaptive Network (TSLANet) as a convolutional foundation model for diverse time series tasks. Specifically, we propose an Adaptive Spectral Block (ASB) that harnesses Fourier analysis to enhance feature representation and to capture both long-term and short-term interactions while mitigating noise via adaptive thresholding. Additionally, we introduce an Interactive Convolution Block (ICB) and leverage self-supervised learning to refine TSLANet's capacity for decoding complex temporal patterns and to improve its robustness across datasets. Our comprehensive experiments demonstrate that TSLANet outperforms state-of-the-art models in tasks spanning classification, forecasting, and anomaly detection, showcasing its resilience and adaptability across varying noise levels and dataset sizes.
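To make the ASB's core idea concrete, the following is a minimal NumPy sketch of Fourier-domain denoising via thresholding, in the spirit described above. Note the assumptions: in TSLANet the spectral weights and threshold are learned end-to-end, whereas here the function name `adaptive_spectral_filter` and the fixed `keep_ratio` parameter are hypothetical stand-ins for illustration only, not the authors' implementation.

```python
import numpy as np

def adaptive_spectral_filter(x, keep_ratio=0.05):
    """Sketch of frequency-domain denoising: transform the series to the
    Fourier domain, suppress low-energy (likely noisy) components with a
    threshold, and transform back. The learned adaptive threshold of the
    ASB is replaced here by a fixed energy quantile (an assumption)."""
    freq = np.fft.rfft(x)                    # frequency-domain view
    energy = np.abs(freq)
    # keep only the strongest components; zero out the rest
    thresh = np.quantile(energy, 1.0 - keep_ratio)
    filtered = np.where(energy >= thresh, freq, 0.0)
    return np.fft.irfft(filtered, n=len(x))  # back to the time domain

# usage: a clean sinusoid corrupted by noise is partially recovered
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
rng = np.random.default_rng(0)
noisy = clean + 0.3 * rng.standard_normal(256)
denoised = adaptive_spectral_filter(noisy)
```

Because the dominant sinusoid concentrates its energy in a few Fourier bins while white noise spreads thinly across all of them, thresholding in the frequency domain removes noise that is hard to separate in the time domain, which is the intuition behind mitigating noise spectrally.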
