Semantic Scholar · Open Access · 2022 · 76 citations

Learning to Rotate: Quaternion Transformer for Complicated Periodical Time Series Forecasting

Weiqiu Chen, Wen-wu Wang, Bingqing Peng, Qingsong Wen, Tian Zhou, +1 more

Abstract

Time series forecasting is a critical and challenging problem in many real applications. Recently, Transformer-based models prevail in time series forecasting due to their advancement in long-range dependencies learning. Besides, some models introduce series decomposition to further unveil reliable yet plain temporal dependencies. Unfortunately, few models could handle complicated periodical patterns, such as multiple periods, variable periods, and phase shifts in real-world datasets. Meanwhile, the notorious quadratic complexity of dot-product attentions hampers long sequence modeling. To address these challenges, we design an innovative framework Quaternion Transformer (Quatformer), along with three major components: (1) learning-to-rotate attention (LRA) based on quaternions which introduces learnable period and phase information to depict intricate periodical patterns; (2) trend normalization to normalize the series representations in hidden layers of the model considering the slowly varying characteristic of trend; (3) decoupling LRA using global memory to achieve linear complexity without losing prediction accuracy. We evaluate our framework on multiple real-world time series datasets and observe an average 8.1% and up to 18.5% MSE improvement over the best state-of-the-art baseline.
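The core idea of learning-to-rotate attention is to inject learnable period (frequency) and phase parameters into the series representations via rotations. A minimal real-valued sketch of that idea is shown below: consecutive pairs of hidden dimensions are rotated by an angle that grows with the time index. This is an illustrative simplification, not the paper's actual quaternion formulation; the function name `rotate_representation` and the scalar `omega`/`phi` parameters are hypothetical stand-ins for what the model would learn.

```python
import numpy as np

def rotate_representation(x, t, omega, phi):
    """Rotate consecutive dimension pairs of a hidden vector x (shape [d],
    d even) by the angle omega * t + phi.

    omega plays the role of a learnable frequency (inverse period) and phi a
    learnable phase shift; t is the position (time index) in the sequence.
    A 2-D rotation is a simplified, real-valued stand-in for the quaternion
    rotations used by learning-to-rotate attention.
    """
    theta = omega * t + phi           # rotation angle grows with time index
    cos, sin = np.cos(theta), np.sin(theta)
    out = np.empty_like(x, dtype=float)
    out[0::2] = cos * x[0::2] - sin * x[1::2]   # rotate each (even, odd) pair
    out[1::2] = sin * x[0::2] + cos * x[1::2]
    return out

# Rotations are norm-preserving, so period/phase information is encoded
# without changing the magnitude of the representation.
x = np.array([1.0, 0.0, 0.5, -0.5])
y = rotate_representation(x, t=3, omega=2 * np.pi / 24, phi=0.1)
print(np.allclose(np.linalg.norm(x), np.linalg.norm(y)))  # True
```

In the sketch a single `omega` is shared across all dimension pairs; an attention mechanism of this kind would typically learn a distinct frequency and phase per pair (or per head) so that multiple coexisting periods can be represented.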

Authors (6)

Weiqiu Chen
Wen-wu Wang
Bingqing Peng
Qingsong Wen
Tian Zhou
Liang Sun

Citation Format

Chen, W., Wang, W., Peng, B., Wen, Q., Zhou, T., & Sun, L. (2022). Learning to Rotate: Quaternion Transformer for Complicated Periodical Time Series Forecasting. https://doi.org/10.1145/3534678.3539234

Publication Information
Year: 2022
Language: en
Total Citations: 76
Source Database: Semantic Scholar
DOI: 10.1145/3534678.3539234
Access: Open Access ✓